Deepspeed: DeepSpeed is a deep learning optimization library that makes distributed training easy, efficient, and effective.
Stars: ✭ 6,024 (+2021.13%)
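For context, a minimal sketch of driving a model through DeepSpeed; the config values below are illustrative placeholders, not tuned settings.

```python
import torch
import deepspeed

# Stand-in model; any torch.nn.Module works here.
model = torch.nn.Linear(784, 10)

# Illustrative config only; real runs tune these values.
ds_config = {
    "train_batch_size": 32,
    "fp16": {"enabled": True},
    "zero_optimization": {"stage": 1},
    "optimizer": {"type": "Adam", "params": {"lr": 1e-4}},
}

# The returned engine wraps the model and handles the optimizer,
# gradient accumulation, and distributed communication.
model_engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=ds_config,
)

# Inside the training loop, the engine replaces the usual
# loss.backward() / optimizer.step() pair:
#   loss = criterion(model_engine(inputs), labels)
#   model_engine.backward(loss)
#   model_engine.step()
```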
VT-UNet: [MICCAI 2022] An official PyTorch implementation of "A Robust Volumetric Transformer for Accurate 3D Tumor Segmentation".
Stars: ✭ 151 (-46.83%)
pytorch-gpt-x: Implementation of an autoregressive language model using an improved Transformer and DeepSpeed pipeline parallelism.
Stars: ✭ 21 (-92.61%)
towhee: Towhee is a framework dedicated to making neural data processing pipelines simple and fast.
Stars: ✭ 821 (+189.08%)
BossNAS: (ICCV 2021) Exploring Hybrid CNN-transformers with Block-wisely Self-supervised Neural Architecture Search.
Stars: ✭ 125 (-55.99%)
SwinIR: Image Restoration Using Swin Transformer (official repository).
Stars: ✭ 1,260 (+343.66%)
LaTeX-OCR (pix2tex): Using a ViT to convert images of equations into LaTeX code.
Stars: ✭ 1,566 (+451.41%)
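A minimal usage sketch, assuming the pix2tex Python API matches the project's README (the `LatexOCR` entry point and the image filename are taken on that assumption):

```python
from PIL import Image
from pix2tex.cli import LatexOCR  # assumed entry point per the project's README

model = LatexOCR()                # loads the pretrained ViT weights
img = Image.open("equation.png")  # hypothetical screenshot of a formula
print(model(img))                 # prints the predicted LaTeX string
```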
transformer-ls: Official PyTorch implementation of Long-Short Transformer (NeurIPS 2021).
Stars: ✭ 201 (-29.23%)
image-classification: A collection of SOTA image classification models in PyTorch.
Stars: ✭ 70 (-75.35%)
PASSL: PASSL includes self-supervised image algorithms such as SimCLR, MoCo v1/v2, BYOL, CLIP, PixPro, BEiT, and MAE, as well as fundamental vision models such as Vision Transformer, DEiT, Swin Transformer, CvT, T2T-ViT, MLP-Mixer, XCiT, ConvNeXt, and PVTv2.
Stars: ✭ 134 (-52.82%)
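As a rough illustration of the SimCLR/MoCo family of objectives listed above, here is a simplified InfoNCE loss; the actual PASSL implementations differ in details such as memory queues, momentum encoders, and symmetrized losses.

```python
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.1):
    """Simplified InfoNCE over a batch of positive pairs.

    z1, z2: (N, D) embeddings of two augmented views of the same images;
    matching rows are positives, every other row acts as a negative.
    """
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature   # (N, N) cosine similarities
    labels = torch.arange(z1.size(0))    # positives sit on the diagonal
    return F.cross_entropy(logits, labels)

loss = info_nce(torch.randn(8, 128), torch.randn(8, 128))
```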
PLSC: Paddle Large Scale Classification Tools; supports ArcFace, CosFace, PartialFC, and data parallel + model parallel training. Models include ResNet, ViT, DeiT, and FaceViT.
Stars: ✭ 113 (-60.21%)
visualization: A collection of visualization functions.
Stars: ✭ 189 (-33.45%)
YOLOS: You Only Look at One Sequence (NeurIPS 2021).
Stars: ✭ 612 (+115.49%)
SIGIR2021 Conure: One Person, One Model, One World: Learning Continual User Representation without Forgetting.
Stars: ✭ 23 (-91.9%)
Ghostnet: CV backbones including GhostNet, TinyNet and TNT, developed by Huawei Noah's Ark Lab.
Stars: ✭ 1,744 (+514.08%)
seq2seq-pytorch: Sequence to Sequence Models in PyTorch.
Stars: ✭ 41 (-85.56%)
t5-japanese: Code to pre-train Japanese T5 models.
Stars: ✭ 39 (-86.27%)
transformer: Build an English-Vietnamese machine translation model with the ProtonX Transformer. :D
Stars: ✭ 41 (-85.56%)
SimMIM: An official implementation of "SimMIM: A Simple Framework for Masked Image Modeling".
Stars: ✭ 717 (+152.46%)
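A toy sketch of the random patch masking at the core of SimMIM-style masked image modeling; the official repo's masking strategy and reconstruction head are more involved.

```python
import torch

patches = torch.randn(1, 196, 768)   # ViT-style patch embeddings (14x14 grid)
mask = torch.rand(1, 196) < 0.6      # mask roughly 60% of patches
mask_token = torch.zeros(768)        # a learnable parameter in the real model
masked = torch.where(mask.unsqueeze(-1), mask_token, patches)
# An encoder processes `masked`, and a lightweight head regresses the raw
# pixels of the masked patches (SimMIM uses an L1 reconstruction loss).
```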
fastT5: ⚡ Boost the inference speed of T5 models by 5x and reduce the model size by 3x.
Stars: ✭ 421 (+48.24%)
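A usage sketch assuming fastT5's API matches its README (`export_and_get_onnx_model` exports the model to quantized ONNX and returns a generate-compatible wrapper):

```python
from transformers import AutoTokenizer
from fastT5 import export_and_get_onnx_model  # assumed per the project's README

model_name = "t5-small"
model = export_and_get_onnx_model(model_name)  # exports + quantizes to ONNX
tokenizer = AutoTokenizer.from_pretrained(model_name)

inputs = tokenizer("translate English to German: The house is small.",
                   return_tensors="pt")
tokens = model.generate(input_ids=inputs["input_ids"],
                        attention_mask=inputs["attention_mask"],
                        num_beams=2)
print(tokenizer.decode(tokens.squeeze(), skip_special_tokens=True))
```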
pytorch-cifar-model-zoo: Implementation of Conv-based and ViT-based networks designed for CIFAR.
Stars: ✭ 62 (-78.17%)
form2fit: [ICRA 2020] Train generalizable policies for kit assembly with self-supervised dense correspondence learning.
Stars: ✭ 78 (-72.54%)
SSE-PT: Code and datasets for the RecSys'20 paper "SSE-PT: Sequential Recommendation Via Personalized Transformer" and the NeurIPS'19 paper "Stochastic Shared Embeddings: Data-driven Regularization of Embedding Layers".
Stars: ✭ 103 (-63.73%)
Fengshenbang-LM: Fengshenbang-LM (封神榜大模型) is an open-source large-model ecosystem led by the Cognitive Computing and Natural Language Research Center of IDEA Research, serving as infrastructure for Chinese AIGC and cognitive intelligence.
Stars: ✭ 1,813 (+538.38%)
bytekit: A Java library for byte manipulation (not a bytecode manipulation library).
Stars: ✭ 40 (-85.92%)
awesome-transformer-search: A curated list of awesome resources combining Transformers with Neural Architecture Search.
Stars: ✭ 194 (-31.69%)
ru-dalle: Generate images from texts, in Russian.
Stars: ✭ 1,606 (+465.49%)
SSTDA: [CVPR 2020] Action Segmentation with Joint Self-Supervised Temporal Domain Adaptation (PyTorch).
Stars: ✭ 150 (-47.18%)
object-aware-contrastive: Object-aware Contrastive Learning for Debiased Scene Representation (NeurIPS 2021).
Stars: ✭ 44 (-84.51%)
nested-transformer: Nested Hierarchical Transformer (https://arxiv.org/pdf/2105.12723.pdf).
Stars: ✭ 174 (-38.73%)
Revisiting-Contrastive-SSL: Revisiting Contrastive Methods for Unsupervised Learning of Visual Representations (NeurIPS 2021).
Stars: ✭ 81 (-71.48%)
CLSA: Official implementation of "Contrastive Learning with Stronger Augmentations".
Stars: ✭ 48 (-83.1%)
temporal-ssl: Video Representation Learning by Recognizing Temporal Transformations (ECCV 2020).
Stars: ✭ 46 (-83.8%)
SCL: 📄 Spatial Contrastive Learning for Few-Shot Classification (ECML/PKDD 2021).
Stars: ✭ 42 (-85.21%)
pytorch-lr-scheduler: PyTorch implementation of some learning rate schedulers for deep learning researchers.
Stars: ✭ 65 (-77.11%)
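The repo's own scheduler classes aside, the pattern it implements is the standard PyTorch one; a minimal sketch with a built-in scheduler:

```python
import torch

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
# Cosine annealing is one of the schedules such collections typically cover.
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=100)

for epoch in range(100):
    # ... forward pass, loss.backward(), then:
    optimizer.step()   # update weights (a no-op here: no gradients in the sketch)
    scheduler.step()   # advance the learning-rate schedule once per epoch
```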
ViTs-vs-CNNs: [NeurIPS 2021] Are Transformers More Robust Than CNNs? (PyTorch implementation and checkpoints).
Stars: ✭ 145 (-48.94%)
Ner Bert Pytorch: PyTorch solution for the named entity recognition task using Google AI's pre-trained BERT model.
Stars: ✭ 249 (-12.32%)
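Not this repo's training code, but the task it solves can be previewed with the Hugging Face `pipeline` API and a public pretrained NER model:

```python
from transformers import pipeline

# Downloads a default public NER model on first use; purely illustrative.
ner = pipeline("ner", aggregation_strategy="simple")
print(ner("Google AI released BERT while based in Mountain View."))
```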
MASTER-pytorch: Code for the paper "MASTER: Multi-Aspect Non-local Network for Scene Text Recognition" (Pattern Recognition 2021).
Stars: ✭ 263 (-7.39%)
Pytorch Seq2seq: Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+1103.52%)
Insight: Repository for Project Insight: NLP as a Service.
Stars: ✭ 246 (-13.38%)
horovod-ansible: Create a Horovod cluster easily using Ansible.
Stars: ✭ 22 (-92.25%)
R-MeN: Transformer-based Memory Networks for Knowledge Graph Embeddings (ACL 2020) (PyTorch and TensorFlow).
Stars: ✭ 74 (-73.94%)
Bertviz: Tool for visualizing attention in Transformer models (BERT, GPT-2, ALBERT, XLNet, RoBERTa, CTRL, etc.).
Stars: ✭ 3,443 (+1112.32%)
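A minimal head-view sketch, assuming BertViz's documented `head_view` entry point; it renders an interactive widget inside a Jupyter notebook.

```python
from transformers import AutoModel, AutoTokenizer
from bertviz import head_view  # assumed entry point per the project's docs

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased", output_attentions=True)

inputs = tokenizer.encode("The cat sat on the mat", return_tensors="pt")
attention = model(inputs).attentions        # per-layer attention weights
tokens = tokenizer.convert_ids_to_tokens(inputs[0])
head_view(attention, tokens)                # interactive notebook view
```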
Relational Rnn Pytorch: An implementation of DeepMind's Relational Recurrent Neural Networks in PyTorch.
Stars: ✭ 236 (-16.9%)
query-selector: Long-Term Series Forecasting with Query Selector, an efficient model of sparse attention.
Stars: ✭ 63 (-77.82%)
Gpt2 Newstitle: Chinese news title generation with GPT2; a Chinese GPT2 project with extremely detailed code comments.
Stars: ✭ 235 (-17.25%)
Posthtml: PostHTML is a tool to transform HTML/XML with JS plugins.
Stars: ✭ 2,737 (+863.73%)
naru: Neural Relation Understanding, neural cardinality estimators for tabular data.
Stars: ✭ 76 (-73.24%)
vietnamese-roberta: A Robustly Optimized BERT Pretraining Approach for Vietnamese.
Stars: ✭ 22 (-92.25%)
Torchnlp: Easy-to-use NLP library built on PyTorch and TorchText.
Stars: ✭ 233 (-17.96%)
densecap: Dense video captioning in PyTorch.
Stars: ✭ 37 (-86.97%)
Multigraph transformer: Official code of the paper "Multi-Graph Transformer for Free-Hand Sketch Recognition"; covers multi-graph transformers, graph classification, and free-hand sketch recognition.
Stars: ✭ 231 (-18.66%)