nested-transformer: Nested Hierarchical Transformer (https://arxiv.org/pdf/2105.12723.pdf)
Stars: ✭ 174 (-54.81%)
FNet-pytorch: Unofficial implementation of Google's FNet ("Mixing Tokens with Fourier Transforms")
Stars: ✭ 204 (-47.01%)
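The FNet entry above replaces self-attention with a parameter-free Fourier token-mixing step. As a minimal sketch (an assumption based on the FNet paper, not code from the repo itself): apply a 2D discrete Fourier transform over the sequence and hidden dimensions and keep only the real part.

```python
import numpy as np

def fourier_mix(x: np.ndarray) -> np.ndarray:
    """FNet-style token mixing for x of shape (seq_len, hidden_dim).

    fft2 applies an FFT along both axes (hidden dim, then sequence);
    the real part is kept, as in the FNet paper. No learned weights.
    """
    return np.real(np.fft.fft2(x))

# Example: mixing preserves the input shape.
tokens = np.random.randn(8, 16)
mixed = fourier_mix(tokens)
assert mixed.shape == tokens.shape
```

In the full model this mixing layer is followed by the usual feed-forward sublayer and residual connections; the hypothetical `fourier_mix` name is ours, not the repository's API.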
HRFormer: Official implementation of the NeurIPS 2021 paper "HRFormer: High-Resolution Transformer for Dense Prediction".
Stars: ✭ 357 (-7.27%)
image-classification: A collection of SOTA image classification models in PyTorch
Stars: ✭ 70 (-81.82%)
visualization: A collection of visualization functions
Stars: ✭ 189 (-50.91%)
Ghostnet: CV backbones including GhostNet, TinyNet and TNT, developed by Huawei Noah's Ark Lab.
Stars: ✭ 1,744 (+352.99%)
transformer: English-Vietnamese machine translation built with the ProtonX Transformer.
Stars: ✭ 41 (-89.35%)
Grocery-Product-Detection: This repository builds a product detection model to recognize products from grocery shelf images.
Stars: ✭ 73 (-81.04%)
php-serializer: Serialize PHP variables, including objects, in any format, with support for unserializing them as well.
Stars: ✭ 47 (-87.79%)
stereo.vision: Planar fitting computation using stereo vision techniques
Stars: ✭ 19 (-95.06%)
Learnable-Image-Resizing: TF 2 implementation of "Learning to Resize Images for Computer Vision Tasks" (https://arxiv.org/abs/2103.09950v1).
Stars: ✭ 48 (-87.53%)
BMT: Source code for "Bi-modal Transformer for Dense Video Captioning" (BMVC 2020)
Stars: ✭ 192 (-50.13%)
seq2seq-pytorch: Sequence-to-sequence models in PyTorch
Stars: ✭ 41 (-89.35%)
BAKE: Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification
Stars: ✭ 79 (-79.48%)
alexnet: A custom implementation of AlexNet in TensorFlow
Stars: ✭ 21 (-94.55%)
bytekit: A Java library for byte manipulation (not a bytecode manipulation library)
Stars: ✭ 40 (-89.61%)
VT-UNet: [MICCAI 2022] Official PyTorch implementation of "A Robust Volumetric Transformer for Accurate 3D Tumor Segmentation"
Stars: ✭ 151 (-60.78%)
kaggle-champs: Code for the CHAMPS "Predicting Molecular Properties" Kaggle competition
Stars: ✭ 49 (-87.27%)
Ner Bert Pytorch: PyTorch solution for the named entity recognition task using Google AI's pre-trained BERT model.
Stars: ✭ 249 (-35.32%)
sam-textvqa: Official code for the paper "Spatially Aware Multimodal Transformers for TextVQA" (ECCV 2020).
Stars: ✭ 51 (-86.75%)
Insight: Repository for Project Insight: NLP as a Service
Stars: ✭ 246 (-36.1%)
Relational Rnn Pytorch: An implementation of DeepMind's Relational Recurrent Neural Networks in PyTorch.
Stars: ✭ 236 (-38.7%)
fastT5: ⚡ Boost inference speed of T5 models by 5x and reduce model size by 3x.
Stars: ✭ 421 (+9.35%)
query-selector: Long-term series forecasting with Query Selector, an efficient model of sparse attention
Stars: ✭ 63 (-83.64%)
Posthtml: PostHTML is a tool to transform HTML/XML with JS plugins
Stars: ✭ 2,737 (+610.91%)
SegSwap: (CVPRW 2022) Learning Co-segmentation by Segment Swapping for Retrieval and Discovery
Stars: ✭ 46 (-88.05%)
cozmo-tensorflow: 🤖 Cozmo the Robot recognizes objects with TensorFlow
Stars: ✭ 61 (-84.16%)
sb-nmt: Code for Synchronous Bidirectional Neural Machine Translation (SB-NMT)
Stars: ✭ 66 (-82.86%)
Cross-lingual-Summarization: Zero-Shot Cross-Lingual Abstractive Sentence Summarization through Teaching Generation and Attention
Stars: ✭ 28 (-92.73%)
Transformers-RL: An easy PyTorch implementation of "Stabilizing Transformers for Reinforcement Learning"
Stars: ✭ 107 (-72.21%)
awesome-transformer-search: A curated list of awesome resources combining Transformers with Neural Architecture Search
Stars: ✭ 194 (-49.61%)
TransBTS: Official code for 1) "TransBTS: Multimodal Brain Tumor Segmentation Using Transformer" (https://arxiv.org/abs/2103.04430), accepted at MICCAI 2021, and 2) "TransBTSV2: Towards Better and More Efficient Volumetric Segmentation of Medical Images" (https://arxiv.org/abs/2201.12785).
Stars: ✭ 254 (-34.03%)
frc-score-detection: A program to detect FRC match scores from their livestream.
Stars: ✭ 15 (-96.1%)
SSE-PT: Code and datasets for the RecSys '20 paper "SSE-PT: Sequential Recommendation Via Personalized Transformer" and the NeurIPS '19 paper "Stochastic Shared Embeddings: Data-driven Regularization of Embedding Layers"
Stars: ✭ 103 (-73.25%)
t5-japanese: Code to pre-train Japanese T5 models
Stars: ✭ 39 (-89.87%)
pytorch-lr-scheduler: PyTorch implementation of some learning rate schedulers for deep learning researchers.
Stars: ✭ 65 (-83.12%)
sparql-transformer: A handier way to use SPARQL data in your web app
Stars: ✭ 38 (-90.13%)
EffcientNetV2: EfficientNetV2 implementation using PyTorch
Stars: ✭ 94 (-75.58%)
MASTER-pytorch: Code for the paper "MASTER: Multi-Aspect Non-local Network for Scene Text Recognition" (Pattern Recognition 2021)
Stars: ✭ 263 (-31.69%)
Pytorch Seq2seq: Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+787.79%)
ru-dalle: Generate images from text, in Russian.
Stars: ✭ 1,606 (+317.14%)
Bertviz: Tool for visualizing attention in Transformer models (BERT, GPT-2, ALBERT, XLNet, RoBERTa, CTRL, etc.)
Stars: ✭ 3,443 (+794.29%)
Gpt2 Newstitle: Chinese news-title generation with GPT-2; a Chinese GPT-2 news headline generation project with extremely detailed code comments.
Stars: ✭ 235 (-38.96%)
cape: Continuous Augmented Positional Embeddings (CAPE) implementation for PyTorch
Stars: ✭ 29 (-92.47%)
Torchnlp: Easy-to-use NLP library built on PyTorch and TorchText
Stars: ✭ 233 (-39.48%)
vietnamese-roberta: A Robustly Optimized BERT Pretraining Approach for Vietnamese
Stars: ✭ 22 (-94.29%)
ViTs-vs-CNNs: [NeurIPS 2021] "Are Transformers More Robust Than CNNs?" (PyTorch implementation & checkpoints)
Stars: ✭ 145 (-62.34%)
pybv: A lightweight I/O utility for the BrainVision data format, written in Python.
Stars: ✭ 18 (-95.32%)
Multigraph transformer: Official code for the paper "Multi-Graph Transformer for Free-Hand Sketch Recognition" (graph classification, sketch recognition)
Stars: ✭ 231 (-40%)
Self Attention Cv: Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (-45.71%)
densecap: Dense video captioning in PyTorch
Stars: ✭ 37 (-90.39%)