trapper: State-of-the-art NLP through transformer models, with a modular design and consistent APIs.
Stars: ✭ 28 (-77.78%)
Transformers: 🤗 State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Stars: ✭ 55,742 (+44139.68%)
transformer: Build English-Vietnamese machine translation with ProtonX Transformer. :D
Stars: ✭ 41 (-67.46%)
Bertviz: Tool for visualizing attention in Transformer models (BERT, GPT-2, ALBERT, XLNet, RoBERTa, CTRL, etc.).
Stars: ✭ 3,443 (+2632.54%)
sb-nmt: Code for Synchronous Bidirectional Neural Machine Translation (SB-NMT).
Stars: ✭ 66 (-47.62%)
ru-dalle: Generate images from text, in Russian.
Stars: ✭ 1,606 (+1174.6%)
pytorch-lr-scheduler: PyTorch implementation of several learning rate schedulers for deep learning researchers.
Stars: ✭ 65 (-48.41%)
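As a rough illustration of what such schedulers compute, here is a sketch of the well-known "Noam" warmup schedule from "Attention Is All You Need" in plain Python; this is not code from the repository above, and the `d_model` and `warmup_steps` defaults are just the paper's illustrative values:

```python
def noam_lr(step: int, d_model: int = 512, warmup_steps: int = 4000) -> float:
    """Learning rate that rises linearly for `warmup_steps`,
    then decays proportionally to step^-0.5."""
    step = max(step, 1)  # avoid division by zero at step 0
    return d_model ** -0.5 * min(step ** -0.5, step * warmup_steps ** -1.5)

# The schedule peaks exactly at step == warmup_steps.
assert noam_lr(100) < noam_lr(4000)
assert noam_lr(100_000) < noam_lr(4000)
```

Libraries like the one above typically wrap this kind of formula in an object that updates an optimizer's learning rate once per training step.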
sister: SImple SenTence EmbeddeR.
Stars: ✭ 66 (-47.62%)
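For intuition only: one of the simplest possible sentence embedders averages per-word vectors. The sketch below uses a made-up two-dimensional toy vocabulary, not sister's actual API (which wraps real word/subword embeddings such as fastText):

```python
from typing import Dict, List

# Toy word vectors; a real embedder would load pretrained vectors instead.
TOY_VECTORS: Dict[str, List[float]] = {
    "i":    [1.0, 0.0],
    "like": [0.0, 1.0],
    "cats": [0.5, 0.5],
}

def embed(sentence: str, dim: int = 2) -> List[float]:
    """Average the vectors of known words; zero vector if none are known."""
    vecs = [TOY_VECTORS[w] for w in sentence.lower().split() if w in TOY_VECTORS]
    if not vecs:
        return [0.0] * dim
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(dim)]

print(embed("I like cats"))  # prints [0.5, 0.5]
```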
Torchnlp: Easy-to-use NLP library built on PyTorch and TorchText.
Stars: ✭ 233 (+84.92%)
t5-japanese: Code to pre-train Japanese T5 models.
Stars: ✭ 39 (-69.05%)
Self Attention Cv: Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (+65.87%)
SegSwap: (CVPRW 2022) Learning Co-segmentation by Segment Swapping for Retrieval and Discovery.
Stars: ✭ 46 (-63.49%)
seq2seq-pytorch: Sequence-to-sequence models in PyTorch.
Stars: ✭ 41 (-67.46%)
TokenLabeling: PyTorch implementation of "All Tokens Matter: Token Labeling for Training Better Vision Transformers".
Stars: ✭ 385 (+205.56%)
SSE-PT: Code and datasets for the RecSys'20 paper "SSE-PT: Sequential Recommendation Via Personalized Transformer" and the NeurIPS'19 paper "Stochastic Shared Embeddings: Data-driven Regularization of Embedding Layers".
Stars: ✭ 103 (-18.25%)
ViTs-vs-CNNs: [NeurIPS 2021] Are Transformers More Robust Than CNNs? (PyTorch implementation & checkpoints.)
Stars: ✭ 145 (+15.08%)
Pytorch Seq2seq: Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+2612.7%)
transformer-ls: Official PyTorch implementation of Long-Short Transformer (NeurIPS 2021).
Stars: ✭ 201 (+59.52%)
Gpt2 Newstitle: Chinese news-title generation with GPT-2; a Chinese GPT-2 news-headline generation project with very detailed comments.
Stars: ✭ 235 (+86.51%)
kaggle-champs: Code for the CHAMPS Predicting Molecular Properties Kaggle competition.
Stars: ✭ 49 (-61.11%)
Yin: The efficient and elegant JSON:API 1.1 server library for PHP.
Stars: ✭ 214 (+69.84%)
Hardware Aware Transformers: [ACL 2020] HAT: Hardware-Aware Transformers for Efficient Natural Language Processing.
Stars: ✭ 206 (+63.49%)
densecap: Dense video captioning in PyTorch.
Stars: ✭ 37 (-70.63%)
Cross-lingual-Summarization: Zero-Shot Cross-Lingual Abstractive Sentence Summarization through Teaching Generation and Attention.
Stars: ✭ 28 (-77.78%)
BMT: Source code for "Bi-modal Transformer for Dense Video Captioning" (BMVC 2020).
Stars: ✭ 192 (+52.38%)
Transformers-RL: An easy PyTorch implementation of "Stabilizing Transformers for Reinforcement Learning".
Stars: ✭ 107 (-15.08%)
php-serializer: Serialize PHP variables, including objects, in any format; supports unserializing them too.
Stars: ✭ 47 (-62.7%)
awesome-transformer-search: A curated list of awesome resources combining Transformers with Neural Architecture Search.
Stars: ✭ 194 (+53.97%)
VideoTransformer-pytorch: PyTorch implementation of a collection of scalable Video Transformer benchmarks.
Stars: ✭ 159 (+26.19%)
nested-transformer: Nested Hierarchical Transformer (https://arxiv.org/pdf/2105.12723.pdf).
Stars: ✭ 174 (+38.1%)
fastT5: ⚡ Boost inference speed of T5 models by 5x and reduce model size by 3x.
Stars: ✭ 421 (+234.13%)
VT-UNet: [MICCAI 2022] Official PyTorch implementation of "A Robust Volumetric Transformer for Accurate 3D Tumor Segmentation".
Stars: ✭ 151 (+19.84%)
sparql-transformer: A handier way to use SPARQL data in your web app.
Stars: ✭ 38 (-69.84%)
Ner Bert Pytorch: PyTorch solution to the named-entity-recognition task using Google AI's pre-trained BERT model.
Stars: ✭ 249 (+97.62%)
R-MeN: Transformer-based Memory Networks for Knowledge Graph Embeddings (ACL 2020) (PyTorch and TensorFlow).
Stars: ✭ 74 (-41.27%)
Insight: Repository for Project Insight: NLP as a Service.
Stars: ✭ 246 (+95.24%)
Relational Rnn Pytorch: An implementation of DeepMind's Relational Recurrent Neural Networks in PyTorch.
Stars: ✭ 236 (+87.3%)
Posthtml: PostHTML is a tool for transforming HTML/XML with JS plugins.
Stars: ✭ 2,737 (+2072.22%)
TransBTS: Official code for 1) TransBTS: Multimodal Brain Tumor Segmentation Using Transformer (https://arxiv.org/abs/2103.04430), accepted at MICCAI 2021, and 2) TransBTSV2: Towards Better and More Efficient Volumetric Segmentation of Medical Images (https://arxiv.org/abs/2201.12785).
Stars: ✭ 254 (+101.59%)
bytekit: A Java library for byte manipulation (not a bytecode library).
Stars: ✭ 40 (-68.25%)
Multigraph transformer: Official code for the paper "Multi-Graph Transformer for Free-Hand Sketch Recognition" (graph classification, sketch recognition, free-hand sketch).
Stars: ✭ 231 (+83.33%)
Transformer Temporal Tagger: Code and data from the paper "BERT Got a Date: Introducing Transformers to Temporal Tagging".
Stars: ✭ 55 (-56.35%)
Paddlenlp: NLP core library and model zoo based on PaddlePaddle 2.0.
Stars: ✭ 212 (+68.25%)
MASTER-pytorch: Code for the paper "MASTER: Multi-Aspect Non-local Network for Scene Text Recognition" (Pattern Recognition 2021).
Stars: ✭ 263 (+108.73%)
Sttn: [ECCV 2020] STTN: Learning Joint Spatial-Temporal Transformations for Video Inpainting.
Stars: ✭ 211 (+67.46%)
cape: Continuous Augmented Positional Embeddings (CAPE) implementation for PyTorch.
Stars: ✭ 29 (-76.98%)
Bert Chainer: Chainer implementation of "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding".
Stars: ✭ 205 (+62.7%)
query-selector: Long-Term Series Forecasting with Query Selector, an efficient model of sparse attention.
Stars: ✭ 63 (-50%)
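For intuition only, here is a toy single-query sparse-attention step in plain Python that keeps just the top-k scoring keys. It illustrates the general sparse-attention idea, not this repository's Query Selector algorithm:

```python
import math

def sparse_attention(query, keys, values, k=2):
    """Attend from one query over `keys`/`values`, but keep only the
    top-k scored keys; all other keys get zero weight (hard sparsity)."""
    d = len(query)
    # Scaled dot-product scores, one per key.
    scores = [sum(q * x for q, x in zip(query, key)) / math.sqrt(d) for key in keys]
    # Indices of the k highest-scoring keys.
    top = sorted(range(len(scores)), key=scores.__getitem__, reverse=True)[:k]
    # Softmax restricted to the kept keys.
    exps = {i: math.exp(scores[i]) for i in top}
    z = sum(exps.values())
    # Weighted average of the kept value vectors.
    return [sum(exps[i] / z * values[i][j] for i in top)
            for j in range(len(values[0]))]
```

With `k=1` this degenerates to simply copying the value of the best-matching key, which makes the sparsity easy to check by hand.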
text simplification: Text simplification model based on an encoder-decoder architecture (includes Transformer and Seq2Seq models).
Stars: ✭ 66 (-47.62%)
Neural-Scam-Artist: Web scraping, document deduplication & GPT-2 fine-tuning with a newly created scam dataset.
Stars: ✭ 18 (-85.71%)