Neural Sp: End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+547.62%)
Transformer Tensorflow: TensorFlow implementation of 'Attention Is All You Need' (2017.6)
Stars: ✭ 319 (+406.35%)
visualization: A collection of visualization functions
Stars: ✭ 189 (+200%)
Sightseq: Computer vision tools for fairseq, containing PyTorch implementations of text recognition and object detection
Stars: ✭ 116 (+84.13%)
Pytorch Original Transformer: My implementation of the original transformer model (Vaswani et al.). Additionally includes a playground.py file for visualizing otherwise hard-to-grasp concepts. Currently includes IWSLT pretrained models.
Stars: ✭ 411 (+552.38%)
Transformers.jl: Julia implementation of Transformer models
Stars: ✭ 173 (+174.6%)
Chinesenre: Chinese entity relation extraction in PyTorch, using BiLSTM + attention
Stars: ✭ 463 (+634.92%)
Nlp Tutorial: Natural Language Processing tutorial for deep learning researchers
Stars: ✭ 9,895 (+15606.35%)
TRAR-VQA: [ICCV 2021] TRAR: Routing the Attention Spans in Transformers for Visual Question Answering -- official implementation
Stars: ✭ 49 (-22.22%)
seq2seq-pytorch: Sequence-to-sequence models in PyTorch
Stars: ✭ 41 (-34.92%)
Speech Transformer: A PyTorch implementation of Speech Transformer, an end-to-end ASR system using a Transformer network, for Mandarin Chinese.
Stars: ✭ 565 (+796.83%)
Self Attention Cv: Implementations of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (+231.75%)
Graphtransformer: Graph Transformer Architecture. Source code for "A Generalization of Transformer Networks to Graphs", DLG-AAAI'21.
Stars: ✭ 187 (+196.83%)
Njunmt Tf: An open-source neural machine translation system developed by the Natural Language Processing Group at Nanjing University.
Stars: ✭ 97 (+53.97%)
Cell Detr: Official and maintained implementation of the paper "Attention-Based Transformers for Instance Segmentation of Cells in Microstructures" [BIBM 2020].
Stars: ✭ 26 (-58.73%)
transformer: A PyTorch implementation of "Attention Is All You Need"
Stars: ✭ 28 (-55.56%)
Distre: [ACL 19] Fine-tuning Pre-Trained Transformer Language Models to Distantly Supervised Relation Extraction
Stars: ✭ 75 (+19.05%)
Pytorch Seq2seq: Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+5325.4%)
Nlp Tutorials: Simple implementations of NLP models. Tutorials are written in Chinese on my website https://mofanpy.com
Stars: ✭ 394 (+525.4%)
Tre: [AKBC 19] Improving Relation Extraction by Pre-trained Language Representations
Stars: ✭ 95 (+50.79%)
Deeplearning Nlp Models: A small, interpretable codebase containing re-implementations of a few "deep" NLP models in PyTorch, with Colab notebooks to run on GPUs. Models: word2vec, CNNs, transformer, GPT.
Stars: ✭ 64 (+1.59%)
Medical Transformer: PyTorch code for "Medical Transformer: Gated Axial-Attention for Medical Image Segmentation"
Stars: ✭ 153 (+142.86%)
h-transformer-1d: Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (+92.06%)
CrabNet: Predict materials properties using only composition information!
Stars: ✭ 57 (-9.52%)
datastories-semeval2017-task6: Deep-learning model presented in "DataStories at SemEval-2017 Task 6: Siamese LSTM with Attention for Humorous Text Comparison".
Stars: ✭ 20 (-68.25%)
VERSE: Vancouver Event and Relation System for Extraction
Stars: ✭ 13 (-79.37%)
GTSRB Keras STN: German Traffic Sign Recognition Benchmark, a Keras implementation with Spatial Transformer Networks
Stars: ✭ 48 (-23.81%)
cometa: Corpus of Online Medical EnTities (the cometA corpus)
Stars: ✭ 31 (-50.79%)
Xpersona: XPersona, Evaluating Multilingual Personalized Chatbot
Stars: ✭ 54 (-14.29%)
InformationExtractionSystem: An information extraction system that can perform NLP tasks such as named entity recognition, sentence simplification, and relation extraction.
Stars: ✭ 27 (-57.14%)
stagin: STAGIN, a Spatio-Temporal Attention Graph Isomorphism Network
Stars: ✭ 34 (-46.03%)
pytorch-gpt-x: Implementation of an autoregressive language model using an improved Transformer and DeepSpeed pipeline parallelism.
Stars: ✭ 21 (-66.67%)
NiuTrans.NMT: A fast neural machine translation system, developed in C++ on top of NiuTensor for fast tensor APIs.
Stars: ✭ 112 (+77.78%)
PathNRE: Source code and dataset for the EMNLP 2017 paper "Incorporating Relation Paths in Neural Relation Extraction".
Stars: ✭ 42 (-33.33%)
LFattNet: Attention-based View Selection Networks for Light-field Disparity Estimation
Stars: ✭ 41 (-34.92%)
LNSwipeCell: A friendly, easy-to-integrate left-swipe editing feature for table view cells!
Stars: ✭ 16 (-74.6%)
YOLOS: You Only Look at One Sequence (NeurIPS 2021)
Stars: ✭ 612 (+871.43%)
verseagility: Ramp up your custom natural language processing (NLP) task, bringing your own data, using your preferred frameworks, and taking models to production.
Stars: ✭ 23 (-63.49%)
towhee: A framework dedicated to making neural data processing pipelines simple and fast.
Stars: ✭ 821 (+1203.17%)
zero: A neural machine translation system
Stars: ✭ 121 (+92.06%)
transform-graphql: ⚙️ Transformer function for GraphQL directives, e.g. creating a model CRUD directive.
Stars: ✭ 23 (-63.49%)
Shukongdashi: An expert system for fault diagnosis in the CNC machining domain, built in Python using knowledge graphs, natural language processing, and convolutional neural networks.
Stars: ✭ 109 (+73.02%)
knodle: A PyTorch-based open-source framework that provides methods for improving weakly annotated data and lets researchers efficiently develop and compare their own methods.
Stars: ✭ 76 (+20.63%)
jeelizGlanceTracker: JavaScript/WebGL library that detects from the webcam video feed whether the user is looking at the screen. Lightweight and robust to all lighting conditions. Useful for playing/pausing videos depending on whether the user is looking, or for person detection. Link to live demo.
Stars: ✭ 68 (+7.94%)