Odsc 2020 nlp: Repository for an ODSC talk on deep learning for NLP
Stars: ✭ 20 (-94.81%)
GTSRB Keras STN: German Traffic Sign Recognition Benchmark, a Keras implementation with Spatial Transformer Networks
Stars: ✭ 48 (-87.53%)
awesome-transformer-search: A curated list of awesome resources combining Transformers with Neural Architecture Search
Stars: ✭ 194 (-49.61%)
pytorch-gpt-x: Implementation of an autoregressive language model using an improved Transformer and DeepSpeed pipeline parallelism.
Stars: ✭ 21 (-94.55%)
Bert Keras: Keras implementation of BERT with pre-trained weights
Stars: ✭ 820 (+112.99%)
YOLOS: You Only Look at One Sequence (NeurIPS 2021)
Stars: ✭ 612 (+58.96%)
Rasa chatbot cn: Building a Chinese dialogue system based on the newest version of Rasa
Stars: ✭ 723 (+87.79%)
TransBTS: This repo provides the official code for: 1) "TransBTS: Multimodal Brain Tumor Segmentation Using Transformer" (https://arxiv.org/abs/2103.04430), accepted by MICCAI 2021, and 2) "TransBTSV2: Towards Better and More Efficient Volumetric Segmentation of Medical Images" (https://arxiv.org/abs/2201.12785).
Stars: ✭ 254 (-34.03%)
Easyflipviewpager: 📖 A library for creating book and card flip animations in ViewPager on Android
Stars: ✭ 698 (+81.3%)
dodrio: Exploring attention weights in transformer-based models with linguistic knowledge.
Stars: ✭ 233 (-39.48%)
Eeg Dl: A deep learning library for EEG task (signal) classification, based on TensorFlow.
Stars: ✭ 165 (-57.14%)
Conformer: Official code for "Conformer: Local Features Coupling Global Representations for Visual Recognition"
Stars: ✭ 345 (-10.39%)
Laravel Responder: A Laravel Fractal package for building API responses, giving you the power of Fractal with Laravel's elegance.
Stars: ✭ 673 (+74.81%)
Context-Transformer: Tackling Object Confusion for Few-Shot Detection (AAAI 2020)
Stars: ✭ 89 (-76.88%)
frc-score-detection: A program to detect FRC match scores from their livestream.
Stars: ✭ 15 (-96.1%)
set-transformer: A neural network architecture for prediction on sets
Stars: ✭ 18 (-95.32%)
Kevinpro-NLP-demo: All the NLP you need, here. Personal implementations of some fun NLP demos; currently includes PyTorch implementations of 13 NLP applications.
Stars: ✭ 117 (-69.61%)
Medical Transformer: PyTorch code for "Medical Transformer: Gated Axial-Attention for Medical Image Segmentation"
Stars: ✭ 153 (-60.26%)
NLP-paper: 🎨 NLP (natural language processing) tutorial 🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-94.03%)
CrabNet: Predict materials properties using only composition information!
Stars: ✭ 57 (-85.19%)
TDRG: Transformer-based Dual Relation Graph for Multi-label Image Recognition (ICCV 2021)
Stars: ✭ 32 (-91.69%)
Awesome Bert Nlp: A curated list of NLP resources focused on BERT, attention mechanisms, Transformer networks, and transfer learning.
Stars: ✭ 567 (+47.27%)
KitanaQA: Adversarial training and data augmentation for neural question-answering models
Stars: ✭ 58 (-84.94%)
Bert paper chinese translation: Chinese translation of the paper "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding"
Stars: ✭ 564 (+46.49%)
Grocery-Product-Detection: This repository builds a product detection model to recognize products from grocery shelf images.
Stars: ✭ 73 (-81.04%)
php-json-api: JSON API transformer outputting valid (PSR-7) API responses.
Stars: ✭ 68 (-82.34%)
Athena: An open-source implementation of a sequence-to-sequence based speech processing engine
Stars: ✭ 542 (+40.78%)
FasterTransformer: Transformer-related optimizations, including BERT and GPT
Stars: ✭ 1,571 (+308.05%)
TRAR-VQA: Official implementation of "TRAR: Routing the Attention Spans in Transformers for Visual Question Answering" (ICCV 2021)
Stars: ✭ 49 (-87.27%)
Former: A simple transformer implementation from scratch in PyTorch.
Stars: ✭ 500 (+29.87%)
Neural-Machine-Translation: Several basic neural machine translation models implemented in PyTorch and TensorFlow
Stars: ✭ 29 (-92.47%)
Nmt Keras: Neural Machine Translation with Keras
Stars: ✭ 501 (+30.13%)
ClusterTransformer: Topic clustering library built on Transformer embeddings and cosine similarity metrics. Compatible with all BERT base transformers from Hugging Face.
Stars: ✭ 36 (-90.65%)
The Story Of Heads: Code for the ACL 2019 paper "Analyzing Multi-Head Self-Attention: Specialized Heads Do the Heavy Lifting, the Rest Can Be Pruned" and the paper "Analyzing Source and Target Contributions to NMT Predictions".
Stars: ✭ 146 (-62.08%)
En-transformer: Implementation of the E(n)-Transformer, which extends the ideas of Welling's E(n)-Equivariant Graph Neural Network to attention
Stars: ✭ 131 (-65.97%)
SSE-PT: Code and datasets for the RecSys'20 paper "SSE-PT: Sequential Recommendation Via Personalized Transformer" and the NeurIPS'19 paper "Stochastic Shared Embeddings: Data-driven Regularization of Embedding Layers"
Stars: ✭ 103 (-73.25%)
Omninet: Official PyTorch implementation of "OmniNet: A unified architecture for multi-modal multi-task learning" by Subhojeet Pramanik, Priyanka Agrawal, and Aman Hussain
Stars: ✭ 448 (+16.36%)
py-faster-rcnn-imagenet: Train Faster R-CNN on the ImageNet dataset; related blog post: https://andrewliao11.github.io/object/detection/2016/07/23/detection/
Stars: ✭ 133 (-65.45%)
sister: SImple SenTence EmbeddeR
Stars: ✭ 66 (-82.86%)
libai: LiBai (李白), a toolbox for large-scale distributed parallel training
Stars: ✭ 284 (-26.23%)
R-MeN: Transformer-based Memory Networks for Knowledge Graph Embeddings (ACL 2020) (PyTorch and TensorFlow)
Stars: ✭ 74 (-80.78%)
head-network-distillation: [IEEE Access] "Head Network Distillation: Splitting Distilled Deep Neural Networks for Resource-constrained Edge Computing Systems" and [ACM MobiCom HotEdgeVideo 2019] "Distilled Split Deep Neural Networks for Edge-assisted Real-time Systems"
Stars: ✭ 27 (-92.99%)
Sttn: [ECCV 2020] STTN: Learning Joint Spatial-Temporal Transformations for Video Inpainting
Stars: ✭ 211 (-45.19%)
Se3 Transformer Pytorch: Implementation of SE3-Transformers for equivariant self-attention, in PyTorch. This repository is geared towards integration with an eventual AlphaFold2 replication.
Stars: ✭ 73 (-81.04%)
max-deeplab: Unofficial implementation of MaX-DeepLab for instance segmentation
Stars: ✭ 84 (-78.18%)