MGAN - Exploiting Coarse-to-Fine Task Transfer for Aspect-level Sentiment Classification (AAAI'19)
Stars: ✭ 44 (-39.73%)
Attentive Neural Processes - Implementation of "Recurrent Attentive Neural Processes" for forecasting power usage (with an LSTM baseline and MC Dropout)
Stars: ✭ 33 (-54.79%)
AdaptSegNet - Learning to Adapt Structured Output Space for Semantic Segmentation, CVPR 2018 (spotlight)
Stars: ✭ 654 (+795.89%)
TF RNN Attention - TensorFlow implementation of an attention mechanism for text classification tasks.
Stars: ✭ 735 (+906.85%)
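The attention mechanism used for text classification in repositories like the one above typically scores each RNN hidden state against a learned context vector, softmaxes the scores, and pools the states into a single sentence vector. A minimal NumPy sketch (the context vector is random here rather than learned, and `attention_pool` is an illustrative name, not this repo's API):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(H, w):
    """Additive attention pooling: score each timestep against a
    context vector, softmax the scores, and take a weighted sum.

    H: (T, d) RNN hidden states; w: (d,) context vector."""
    scores = np.tanh(H) @ w          # (T,) unnormalized scores
    alpha = softmax(scores)          # (T,) attention weights, sum to 1
    return alpha @ H, alpha          # (d,) pooled sentence vector

rng = np.random.default_rng(0)
H = rng.standard_normal((5, 8))      # 5 timesteps, 8-dim hidden states
w = rng.standard_normal(8)           # stand-in for a learned parameter
vec, alpha = attention_pool(H, w)
```

The pooled vector would then feed a classifier head; `alpha` is what attention-visualization tools plot per token.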
BiBloSA PyTorch - Re-implementation of "Bi-Directional Block Self-Attention for Fast and Memory-Efficient Sequence Modeling" (T. Shen et al., ICLR 2018) in PyTorch.
Stars: ✭ 43 (-41.1%)
YOLOv4 PyTorch - A PyTorch repository of YOLOv4, attentive YOLOv4, and MobileNet YOLOv4, with PASCAL VOC and COCO support.
Stars: ✭ 1,070 (+1365.75%)
Pytorch GatMy implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entropy histograms. I've supported both Cora (transductive) and PPI (inductive) examples!
Stars: ✭ 908 (+1143.84%)
GVB - Code for "Gradually Vanishing Bridge for Adversarial Domain Adaptation" (CVPR 2020)
Stars: ✭ 52 (-28.77%)
Text Classification - Implementations of papers for the text classification task on DBpedia.
Stars: ✭ 682 (+834.25%)
Global Self-Attention Network - A PyTorch implementation of Global Self-Attention Network, a fully attentional backbone for vision tasks.
Stars: ✭ 64 (-12.33%)
VAD - Voice activity detection (VAD) toolkit including DNN-, bDNN-, LSTM-, and ACAM-based VAD. A directly recorded dataset is also provided.
Stars: ✭ 622 (+752.05%)
TF DANN - Domain-Adversarial Neural Network in TensorFlow.
Stars: ✭ 556 (+661.64%)
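At the core of the DANN-style repositories above is the gradient reversal layer: an identity map on the forward pass whose backward pass flips the gradient sign, so the feature extractor learns to confuse the domain classifier. A framework-free sketch of just that behavior (manual forward/backward with no autograd; class and method names are illustrative, not any repo's API):

```python
import numpy as np

class GradReverse:
    """Gradient reversal layer (GRL) from DANN: identity on the
    forward pass, multiply incoming gradients by -lam on the
    backward pass."""
    def __init__(self, lam=1.0):
        self.lam = lam            # reversal strength, often annealed

    def forward(self, x):
        return x                  # identity: features pass through unchanged

    def backward(self, grad_out):
        return -self.lam * grad_out   # flip the gradient sign

grl = GradReverse(lam=0.5)
x = np.array([1.0, -2.0])
y = grl.forward(x)                    # same as x
g = grl.backward(np.ones(2))          # array([-0.5, -0.5])
```

In PyTorch implementations the same effect is achieved with a custom `autograd.Function` whose `backward` negates the gradient.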
ChineseNRE - Chinese entity relation extraction, PyTorch, BiLSTM + attention.
Stars: ✭ 463 (+534.25%)
Generalizing ReID - Repository for the paper "Generalizing Person Re-Identification by Camera-Aware Instance Learning and Cross-Domain Mixup"
Stars: ✭ 28 (-61.64%)
MAC Network - Implementation of the paper "Compositional Attention Networks for Machine Reasoning" (Hudson and Manning, ICLR 2018)
Stars: ✭ 444 (+508.22%)
Transferlearning - Transfer learning / domain adaptation / domain generalization / multi-task learning, etc.: papers, code, datasets, applications, and tutorials.
Stars: ✭ 8,481 (+11517.81%)
Neural sp - End-to-end ASR/LM implementation in PyTorch.
Stars: ✭ 408 (+458.9%)
DANN - PyTorch implementation of "Domain-Adversarial Training of Neural Networks"
Stars: ✭ 400 (+447.95%)
POT - Python Optimal Transport.
Stars: ✭ 929 (+1172.6%)
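Entropy-regularized optimal transport, one of the solvers a library like POT provides, reduces to alternating scaling updates known as Sinkhorn iterations. A plain-NumPy sketch of that classic algorithm (not POT's actual API; `sinkhorn` here is an illustrative re-implementation):

```python
import numpy as np

def sinkhorn(a, b, M, reg=0.1, n_iter=200):
    """Entropy-regularized OT via Sinkhorn iterations.

    a, b: source/target histograms (sum to 1); M: cost matrix.
    Returns the transport plan P with row marginals a and
    column marginals b (up to convergence)."""
    K = np.exp(-M / reg)              # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)             # match column marginals
        u = a / (K @ v)               # match row marginals
    return u[:, None] * K * v[None, :]

a = np.array([0.5, 0.5])
b = np.array([0.5, 0.5])
M = np.array([[0.0, 1.0],
              [1.0, 0.0]])           # cheap to stay, costly to swap
P = sinkhorn(a, b, M)                # mass concentrates on the diagonal
```

With a small `reg`, the plan approaches the exact (unregularized) optimal transport solution.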
Deeplearning NLP Models - A small, interpretable codebase re-implementing a few "deep" NLP models in PyTorch, with Colab notebooks for GPU runs. Models: word2vec, CNNs, transformer, GPT.
Stars: ✭ 64 (-12.33%)
Enjoy Hamburger - [ICLR 2021] Is Attention Better Than Matrix Decomposition?
Stars: ✭ 69 (-5.48%)
Attentions - PyTorch implementations of various attention mechanisms for deep learning researchers.
Stars: ✭ 39 (-46.58%)
SimpleCVReproduction - Reproductions of simple CV projects, including attention modules, classification, object detection, segmentation, keypoint detection, tracking 😄, etc.
Stars: ✭ 602 (+724.66%)
Speech Transformer - A PyTorch implementation of Speech Transformer, an end-to-end ASR system with a Transformer network, on Mandarin Chinese.
Stars: ✭ 565 (+673.97%)
AttentionCluster - TensorFlow implementation of "Attention Clusters: Purely Attention Based Local Feature Integration for Video Classification"
Stars: ✭ 33 (-54.79%)
Performer PyTorch - An implementation of Performer, a linear attention-based Transformer, in PyTorch.
Stars: ✭ 546 (+647.95%)
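The "linear attention" idea behind Performer replaces softmax(QKᵀ)V with φ(Q)(φ(K)ᵀV) for a positive feature map φ, so the T×T attention matrix is never materialized and cost drops from O(T²d) to O(Td²). A single-head NumPy sketch using a simple positive feature map (not Performer's FAVOR+ random features):

```python
import numpy as np

def linear_attention(Q, K, V, phi=lambda x: np.maximum(x, 0.0) + 1e-6):
    """Linear attention: phi(Q) @ (phi(K).T @ V), row-normalized.

    phi is any elementwise positive feature map; here a shifted ReLU
    stands in for Performer's random-feature softmax approximation."""
    Qp, Kp = phi(Q), phi(K)           # (T, d) positive features
    KV = Kp.T @ V                     # (d, d) summary; no T x T matrix
    Z = Qp @ Kp.sum(axis=0)           # (T,) per-row normalizer
    return (Qp @ KV) / Z[:, None]     # (T, d) attention output

rng = np.random.default_rng(0)
T, d = 6, 4
Q, K, V = (rng.standard_normal((T, d)) for _ in range(3))
out = linear_attention(Q, K, V)       # each row is a convex mix of V's rows
```

Because the per-row weights are nonnegative and sum to 1, every output coordinate stays within the range of the corresponding column of V, just as in softmax attention.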
MAN - Multinomial Adversarial Networks for Multi-Domain Text Classification (NAACL 2018)
Stars: ✭ 72 (-1.37%)
Punctuator2 - A bidirectional recurrent neural network model with an attention mechanism for restoring missing punctuation in unsegmented text.
Stars: ✭ 483 (+561.64%)
DeFactoNLP - An automated fact-checking system that uses named entity recognition, TF-IDF vector comparison, and Decomposable Attention models.
Stars: ✭ 30 (-58.9%)
RNN NLU - A TensorFlow implementation of recurrent neural networks for sequence classification and sequence labeling.
Stars: ✭ 463 (+534.25%)
Fluence - A PyTorch-based deep learning library focused on low-resource language research and robustness.
Stars: ✭ 54 (-26.03%)
BAN VQA - Bilinear attention networks for visual question answering.
Stars: ✭ 449 (+515.07%)
DomainAdaptation - Repository for the article "Unsupervised domain adaptation for medical imaging segmentation with self-ensembling"
Stars: ✭ 27 (-63.01%)
GANsformer - Generative Adversarial Transformers.
Stars: ✭ 421 (+476.71%)
Cross-Domain NER - Cross-domain NER using cross-domain language modeling; code for the ACL 2019 paper.
Stars: ✭ 67 (-8.22%)
PyTorch Original Transformer - An implementation of the original Transformer model (Vaswani et al.), with a playground.py file for visualizing otherwise hard-to-grasp concepts. Currently includes IWSLT pretrained models.
Stars: ✭ 411 (+463.01%)
ISAB PyTorch - An implementation of the (Induced) Set Attention Block from the Set Transformers paper.
Stars: ✭ 21 (-71.23%)
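An Induced Set Attention Block cuts the O(n²) cost of self-attention over a set of n elements by routing it through m learned inducing points: the inducing points attend to the set, then the set attends back, for O(nm) cost. A single-head sketch without projections, LayerNorm, or the feed-forward sublayer (inducing points are random here, whereas the paper learns them):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attend(Q, K, V):
    """Scaled dot-product attention, single head."""
    return softmax(Q @ K.T / np.sqrt(Q.shape[-1])) @ V

def isab(X, I):
    """Induced Set Attention Block (bare-bones sketch):
    m inducing points summarize the n elements, then the
    elements read the summary back."""
    H = attend(I, X, X)       # (m, d) compressed view of the set
    return attend(X, H, H)    # (n, d) updated set elements

rng = np.random.default_rng(0)
X = rng.standard_normal((10, 4))   # a set of 10 elements, dim 4
I = rng.standard_normal((3, 4))    # 3 inducing points (learned in the paper)
out = isab(X, I)                   # (10, 4), permutation-equivariant in X
```

Both attention calls cost O(n·m·d), so the block scales linearly in the set size for fixed m.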
Deep learning NLP - Keras, PyTorch, and NumPy implementations of deep learning architectures for NLP.
Stars: ✭ 407 (+457.53%)
NLP Tutorials - Simple implementations of NLP models. Tutorials are written in Chinese on my website https://mofanpy.com
Stars: ✭ 394 (+439.73%)
NLP TensorFlow Project - NLP projects implemented in TensorFlow, e.g. classification, chatbot, NER, attention, QA, etc.
Stars: ✭ 27 (-63.01%)
ABSA PyTorch - Aspect-Based Sentiment Analysis, PyTorch implementations.
Stars: ✭ 1,181 (+1517.81%)
LibTLDA - Library of transfer learners and domain-adaptive classifiers.
Stars: ✭ 71 (-2.74%)
SCL - Implementation of "SCL: Towards Accurate Domain Adaptive Object Detection via Gradient Detach Based Stacked Complementary Losses"
Stars: ✭ 65 (-10.96%)
Time Attention - Implementation of an RNN for time series prediction from the paper https://arxiv.org/abs/1704.02971
Stars: ✭ 52 (-28.77%)