dodrio: Exploring attention weights in transformer-based models with linguistic knowledge.
Stars: ✭ 233 (+40.36%)
Kevinpro-NLP-demo: All the NLP you need. Personal implementations of fun NLP demos; currently includes PyTorch implementations of 13 NLP applications.
Stars: ✭ 117 (-29.52%)
BossNAS: (ICCV 2021) BossNAS: Exploring Hybrid CNN-transformers with Block-wisely Self-supervised Neural Architecture Search
Stars: ✭ 125 (-24.7%)
transform-graphql: ⚙️ Transformer function to transform GraphQL directives, for example to create a model CRUD directive.
Stars: ✭ 23 (-86.14%)
Context-Transformer: Tackling Object Confusion for Few-Shot Detection, AAAI 2020
Stars: ✭ 89 (-46.39%)
speech-transformer: Transformer implementation specialized in speech recognition tasks, using PyTorch.
Stars: ✭ 40 (-75.9%)
CrabNet: Predict materials properties using only the composition information!
Stars: ✭ 57 (-65.66%)
GTSRB Keras STN: German Traffic Sign Recognition Benchmark, Keras implementation with Spatial Transformer Networks
Stars: ✭ 48 (-71.08%)
image-classification: A collection of SOTA image classification models in PyTorch
Stars: ✭ 70 (-57.83%)
FNet-pytorch: Unofficial implementation of Google's FNet: Mixing Tokens with Fourier Transforms
Stars: ✭ 204 (+22.89%)
Conformer: Official code for Conformer: Local Features Coupling Global Representations for Visual Recognition
Stars: ✭ 345 (+107.83%)
TransPose: PyTorch implementation for "TransPose: Keypoint Localization via Transformer", ICCV 2021.
Stars: ✭ 250 (+50.6%)
set-transformer: A neural network architecture for prediction on sets
Stars: ✭ 18 (-89.16%)
visualization: A collection of visualization functions
Stars: ✭ 189 (+13.86%)
NLP-paper: 🎨🎨 NLP (natural language processing) tutorial 🎨🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-86.14%)
OpenPrompt: An open-source framework for prompt-learning.
Stars: ✭ 1,769 (+965.66%)
TDRG: Transformer-based Dual Relation Graph for Multi-label Image Recognition, ICCV 2021
Stars: ✭ 32 (-80.72%)
KitanaQA: Adversarial training and data augmentation for neural question-answering models
Stars: ✭ 58 (-65.06%)
pytorch-gpt-x: Implementation of an autoregressive language model using an improved Transformer and DeepSpeed pipeline parallelism.
Stars: ✭ 21 (-87.35%)
verseagility: Ramp up your custom natural language processing (NLP) task, allowing you to bring your own data, use your preferred frameworks, and bring models into production.
Stars: ✭ 23 (-86.14%)
german-sentiment: A data set and model for German sentiment classification.
Stars: ✭ 37 (-77.71%)
Transformer-ocr: Handwritten text recognition using transformers.
Stars: ✭ 92 (-44.58%)
zero: Zero, a neural machine translation system
Stars: ✭ 121 (-27.11%)
DeepPhonemizer: Grapheme-to-phoneme conversion with deep learning.
Stars: ✭ 152 (-8.43%)
wenet: Production First and Production Ready End-to-End Speech Recognition Toolkit
Stars: ✭ 2,384 (+1336.14%)
towhee: Towhee is a framework dedicated to making neural data processing pipelines simple and fast.
Stars: ✭ 821 (+394.58%)
MusicTransformer-Pytorch: MusicTransformer written for MaestroV2 using the PyTorch framework for music generation
Stars: ✭ 106 (-36.14%)
OverlapPredator: [CVPR 2021, Oral] PREDATOR: Registration of 3D Point Clouds with Low Overlap.
Stars: ✭ 293 (+76.51%)
YOLOv5-Lite: 🍅🍅🍅 YOLOv5-Lite: lighter, faster, and easier to deploy. Evolved from YOLOv5; the model is only 930+ KB (int8) or 1.7 MB (fp16), and it can reach 10+ FPS on a Raspberry Pi 4B with a 320×320 input size.
Stars: ✭ 1,230 (+640.96%)
AdaSpeech: Adaptive Text to Speech for Custom Voice
Stars: ✭ 108 (-34.94%)
graphtrans: Representing Long-Range Context for Graph Neural Networks with Global Attention
Stars: ✭ 45 (-72.89%)
Transformer-Transducer: PyTorch implementation of "Transformer Transducer: A Streamable Speech Recognition Model with Transformer Encoders and RNN-T Loss" (ICASSP 2020)
Stars: ✭ 61 (-63.25%)
transformer: A PyTorch implementation of "Attention Is All You Need"
Stars: ✭ 28 (-83.13%)
Vision-Language-Transformer: Vision-Language Transformer and Query Generation for Referring Segmentation (ICCV 2021)
Stars: ✭ 127 (-23.49%)
ICON: (TPAMI 2022) Salient Object Detection via Integrity Learning.
Stars: ✭ 125 (-24.7%)
SegFormer: Official PyTorch implementation of SegFormer
Stars: ✭ 1,264 (+661.45%)
Xpersona: XPersona: Evaluating Multilingual Personalized Chatbot
Stars: ✭ 54 (-67.47%)
graph-transformer-pytorch: Implementation of Graph Transformer in PyTorch, for potential use in replicating AlphaFold2
Stars: ✭ 81 (-51.2%)
sticker2: Further developed as SyntaxDot: https://github.com/tensordot/syntaxdot
Stars: ✭ 14 (-91.57%)
Graphormer: Graphormer is a deep learning package that allows researchers and developers to train custom models for molecule modeling tasks. It aims to accelerate research and applications of AI for molecular science, such as materials design and drug discovery.
Stars: ✭ 1,194 (+619.28%)
cometa: Corpus of Online Medical EnTities: the cometA corpus
Stars: ✭ 31 (-81.33%)
h-transformer-1d: Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (-27.11%)
PDN: The official PyTorch implementation of "Pathfinder Discovery Networks for Neural Message Passing" (WebConf '21)
Stars: ✭ 44 (-73.49%)
NiuTrans.NMT: A fast neural machine translation system, developed in C++ and built on NiuTensor for fast tensor APIs.
Stars: ✭ 112 (-32.53%)
LaTeX-OCR: pix2tex: Using a ViT to convert images of equations into LaTeX code.
Stars: ✭ 1,566 (+843.37%)
transformer-slt: Sign Language Translation with Transformers (COLING'2020, ECCV'20 SLRTP Workshop)
Stars: ✭ 92 (-44.58%)
YOLOS: You Only Look at One Sequence (NeurIPS 2021)
Stars: ✭ 612 (+268.67%)