Posthtml: PostHTML is a tool to transform HTML/XML with JS plugins
Stars: ✭ 2,737 (+9023.33%)
php-serializer: Serialize PHP variables, including objects, in any format. Supports unserializing them too.
Stars: ✭ 47 (+56.67%)
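PHP's native serialization format (the wire format a library like this reads and writes) is compact and easy to illustrate. Below is a minimal, hypothetical Python decoder for just the integer and string cases; it is not part of the library above (which is PHP), only a sketch of the format itself:

```python
def parse_php(data):
    # Decodes a single PHP-serialized scalar, e.g. 'i:42;' or
    # 's:5:"hello";'. Real implementations also handle booleans,
    # floats, nulls, arrays ('a:...') and objects ('O:...').
    tag = data[0]
    if tag == "i":
        # integers: i:<value>;
        return int(data[2:data.index(";")])
    if tag == "s":
        # strings: s:<byte length>:"<bytes>";
        length = int(data[2:data.index(":", 2)])
        start = data.index('"') + 1
        return data[start:start + length]
    raise ValueError("unsupported tag: " + tag)
```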
FNet-pytorch: Unofficial implementation of Google's FNet: Mixing Tokens with Fourier Transforms
Stars: ✭ 204 (+580%)
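FNet's central idea, replacing self-attention with a parameter-free Fourier mixing sublayer, fits in a few lines. This is a NumPy sketch of the mixing operation described in the paper (DFT along the hidden dimension, then along the sequence dimension, keeping the real part), not code from the repository above:

```python
import numpy as np

def fourier_mix(x):
    # x: (seq_len, hidden) token embeddings.
    # FNet mixing sublayer: 1-D DFT over the hidden dimension,
    # then over the sequence dimension; keep only the real part.
    # No learned parameters are involved.
    return np.fft.fft(np.fft.fft(x, axis=-1), axis=-2).real

x = np.random.randn(8, 16)
y = fourier_mix(x)  # same shape as x, real-valued
```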
CSV2RDF: Streaming, transforming, SPARQL-based CSV to RDF converter. Apache license.
Stars: ✭ 48 (+60%)
Walk-Transformer: From Random Walks to Transformer for Learning Node Embeddings (ECML-PKDD 2020), in PyTorch and TensorFlow
Stars: ✭ 26 (-13.33%)
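Walk-based node-embedding methods like the one above start by sampling random-walk "sentences" from the graph, which then feed a sequence model. A generic sketch of that first stage (not this repository's actual pipeline; the adjacency-list format and function name are illustrative):

```python
import random

def random_walks(adj, length, walks_per_node, seed=0):
    # adj: dict mapping each node to a list of neighbors.
    # Returns walks_per_node walks of up to `length` nodes per
    # starting node, sampled uniformly over neighbors.
    rng = random.Random(seed)
    walks = []
    for start in adj:
        for _ in range(walks_per_node):
            walk = [start]
            for _ in range(length - 1):
                neighbors = adj[walk[-1]]
                if not neighbors:
                    break  # dead end: stop this walk early
                walk.append(rng.choice(neighbors))
            walks.append(walk)
    return walks
```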
Transformer-ocr: Handwritten text recognition using transformers.
Stars: ✭ 92 (+206.67%)
HRFormer: Official implementation of the NeurIPS 2021 paper "HRFormer: High-Resolution Transformer for Dense Prediction".
Stars: ✭ 357 (+1090%)
transform-graphql: ⚙️ Transformer function to transform GraphQL directives, e.g. creating a model CRUD directive.
Stars: ✭ 23 (-23.33%)
BossNAS: (ICCV 2021) Exploring Hybrid CNN-transformers with Block-wisely Self-supervised Neural Architecture Search
Stars: ✭ 125 (+316.67%)
text2keywords: Trained T5 and T5-large models for creating keywords from text
Stars: ✭ 53 (+76.67%)
speech-transformer: Transformer implementation specialized for speech recognition tasks, using PyTorch.
Stars: ✭ 40 (+33.33%)
Learning-Lab-C-Library: This library provides a set of basic functions for different types of deep learning (and other) algorithms in C. It is constantly updated.
Stars: ✭ 20 (-33.33%)
towhee: Towhee is a framework dedicated to making neural data processing pipelines simple and fast.
Stars: ✭ 821 (+2636.67%)
TransPose: PyTorch implementation of "TransPose: Keypoint Localization via Transformer" (ICCV 2021).
Stars: ✭ 250 (+733.33%)
golgotha: Contextualised embeddings and language modelling with BERT and friends, in R
Stars: ✭ 39 (+30%)
OpenPrompt: An Open-Source Framework for Prompt-Learning.
Stars: ✭ 1,769 (+5796.67%)
Vision-Language-Transformer: Vision-Language Transformer and Query Generation for Referring Segmentation (ICCV 2021)
Stars: ✭ 127 (+323.33%)
GTSRB Keras STN: German Traffic Sign Recognition Benchmark, Keras implementation with Spatial Transformer Networks
Stars: ✭ 48 (+60%)
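The core of a Spatial Transformer Network is differentiable affine grid sampling: a 2×3 matrix maps normalized output coordinates to source coordinates, and the input is bilinearly sampled there. A minimal NumPy sketch of that sampling step (illustrative only; the repository above implements it as a trainable Keras layer):

```python
import numpy as np

def affine_grid_sample(img, theta, out_h, out_w):
    # img: (H, W) single-channel image.
    # theta: 2x3 affine matrix in normalized [-1, 1] coordinates.
    H, W = img.shape
    ys, xs = np.meshgrid(np.linspace(-1, 1, out_h),
                         np.linspace(-1, 1, out_w), indexing="ij")
    grid = np.stack([xs, ys, np.ones_like(xs)], axis=-1)   # (h, w, 3)
    src = grid @ theta.T                                   # (h, w, 2)
    # map normalized source coords back to pixel indices
    sx = (src[..., 0] + 1) * (W - 1) / 2
    sy = (src[..., 1] + 1) * (H - 1) / 2
    x0 = np.clip(np.floor(sx).astype(int), 0, W - 2)
    y0 = np.clip(np.floor(sy).astype(int), 0, H - 2)
    wx, wy = sx - x0, sy - y0
    # bilinear interpolation of the four neighboring pixels
    return (img[y0, x0] * (1 - wx) * (1 - wy)
            + img[y0, x0 + 1] * wx * (1 - wy)
            + img[y0 + 1, x0] * (1 - wx) * wy
            + img[y0 + 1, x0 + 1] * wx * wy)
```

With the identity matrix `[[1, 0, 0], [0, 1, 0]]` and matching output size, the input image is reproduced unchanged.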
pytorch-gpt-x: Implementation of an autoregressive language model using an improved Transformer and DeepSpeed pipeline parallelism.
Stars: ✭ 21 (-30%)
bxcan: bxCAN peripheral driver for STM32 chips
Stars: ✭ 22 (-26.67%)
YOLOS: You Only Look at One Sequence (NeurIPS 2021)
Stars: ✭ 612 (+1940%)
RSTNet: Captioning with Adaptive Attention on Visual and Non-Visual Words (CVPR 2021)
Stars: ✭ 71 (+136.67%)
DeepPhonemizer: Grapheme-to-phoneme conversion with deep learning.
Stars: ✭ 152 (+406.67%)
Restormer: [CVPR 2022, Oral] Efficient Transformer for High-Resolution Image Restoration. SOTA for motion deblurring, image deraining, denoising (Gaussian/real data), and defocus deblurring.
Stars: ✭ 586 (+1853.33%)
graphtrans: Representing Long-Range Context for Graph Neural Networks with Global Attention
Stars: ✭ 45 (+50%)
deformer: [ACL 2020] DeFormer: Decomposing Pre-trained Transformers for Faster Question Answering
Stars: ✭ 111 (+270%)
Xpersona: XPersona: Evaluating Multilingual Personalized Chatbot
Stars: ✭ 54 (+80%)
TadTR: End-to-end Temporal Action Detection with Transformer. [Under review for journal publication]
Stars: ✭ 55 (+83.33%)
PDN: The official PyTorch implementation of "Pathfinder Discovery Networks for Neural Message Passing" (WebConf '21)
Stars: ✭ 44 (+46.67%)
zend-expressive-hal: Hypertext Application Language implementation for PHP and PSR-7
Stars: ✭ 37 (+23.33%)
VectorDrawable2Svg: Converts Android VectorDrawable .xml files to .svg files
Stars: ✭ 50 (+66.67%)
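The conversion above is tractable because VectorDrawable path data uses the same grammar as the SVG `d` attribute. A minimal, illustrative Python sketch of the idea (not this project's code; it copies only viewport size, path data and fill, ignoring groups, transforms, strokes, and the alpha byte in ARGB colors):

```python
import xml.etree.ElementTree as ET

# Android resource attribute namespace used by VectorDrawable XML.
ANDROID = "{http://schemas.android.com/apk/res/android}"

def vector_drawable_to_svg(xml_text):
    root = ET.fromstring(xml_text)
    w = root.get(ANDROID + "viewportWidth")
    h = root.get(ANDROID + "viewportHeight")
    svg = ET.Element("svg", xmlns="http://www.w3.org/2000/svg",
                     viewBox=f"0 0 {w} {h}")
    for path in root.iter("path"):
        # pathData and SVG's d attribute share the same syntax.
        ET.SubElement(svg, "path",
                      d=path.get(ANDROID + "pathData"),
                      fill=path.get(ANDROID + "fillColor", "#000000"))
    return ET.tostring(svg, encoding="unicode")

sample = (
    '<vector xmlns:android="http://schemas.android.com/apk/res/android" '
    'android:viewportWidth="24" android:viewportHeight="24">'
    '<path android:pathData="M0 0h24v24H0z" android:fillColor="#000000"/>'
    '</vector>'
)
svg_out = vector_drawable_to_svg(sample)
```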
image-classification: A collection of SOTA image classification models in PyTorch
Stars: ✭ 70 (+133.33%)
MinTL: Minimalist Transfer Learning for Task-Oriented Dialogue Systems
Stars: ✭ 61 (+103.33%)
gsm: GSM module library for STM32 LL
Stars: ✭ 28 (-6.67%)
YOLOv5-Lite: 🍅🍅🍅 Lighter, faster and easier to deploy. Evolved from YOLOv5; the model is only 930+ KB (int8) or 1.7 MB (fp16), and it reaches 10+ FPS on a Raspberry Pi 4B with a 320×320 input.
Stars: ✭ 1,230 (+4000%)
console: HAL management console
Stars: ✭ 41 (+36.67%)
transformer: A PyTorch Implementation of "Attention Is All You Need"
Stars: ✭ 28 (-6.67%)
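The paper implemented above builds on one equation, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. A minimal NumPy sketch of that equation for reference (not code from the repository, which is a full PyTorch model):

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Q: (n_q, d_k), K: (n_k, d_k), V: (n_k, d_v).
    # Scores are scaled by sqrt(d_k) to keep softmax gradients usable.
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-1, -2) / np.sqrt(d_k)
    weights = softmax(scores)          # each row sums to 1
    return weights @ V, weights
```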
FragmentVC: Any-to-any voice conversion by end-to-end extracting and fusing fine-grained voice fragments with attention
Stars: ✭ 134 (+346.67%)
LaTeX-OCR: pix2tex: Using a ViT to convert images of equations into LaTeX code.
Stars: ✭ 1,566 (+5120%)
graph-transformer-pytorch: Implementation of Graph Transformer in PyTorch, for potential use in replicating AlphaFold2
Stars: ✭ 81 (+170%)
Embedding: A summary of embedding-model code and study notes
Stars: ✭ 25 (-16.67%)
cometa: Corpus of Online Medical EnTities: the COMETA corpus
Stars: ✭ 31 (+3.33%)
transformer-slt: Sign Language Translation with Transformers (COLING 2020, ECCV 2020 SLRTP Workshop)
Stars: ✭ 92 (+206.67%)
NiuTrans.NMT: A fast neural machine translation system, developed in C++ and relying on NiuTensor for fast tensor APIs.
Stars: ✭ 112 (+273.33%)
enformer-pytorch: Implementation of Enformer, DeepMind's attention network for predicting gene expression, in PyTorch
Stars: ✭ 146 (+386.67%)
verseagility: Ramp up your custom natural language processing (NLP) task: bring your own data, use your preferred frameworks, and take models into production.
Stars: ✭ 23 (-23.33%)
OverlapPredator: [CVPR 2021, Oral] PREDATOR: Registration of 3D Point Clouds with Low Overlap.
Stars: ✭ 293 (+876.67%)
M3DETR: Code base for "M3DeTR: Multi-representation, Multi-scale, Mutual-relation 3D Object Detection with Transformers"
Stars: ✭ 47 (+56.67%)
odin: Data-structure definition/validation/traversal, mapping and serialisation toolkit for Python
Stars: ✭ 24 (-20%)