
293 open source projects that are alternatives to or similar to Hatn

MGAN
Exploiting Coarse-to-Fine Task Transfer for Aspect-level Sentiment Classification (AAAI'19)
Stars: ✭ 44 (-39.73%)
Mutual labels:  attention, domain-adaptation
Attentive Neural Processes
Implementation of "Recurrent Attentive Neural Processes" for forecasting power usage (with an LSTM baseline and MC Dropout)
Stars: ✭ 33 (-54.79%)
Mutual labels:  attention
Adaptsegnet
Learning to Adapt Structured Output Space for Semantic Segmentation, CVPR 2018 (spotlight)
Stars: ✭ 654 (+795.89%)
Mutual labels:  domain-adaptation
Residual Attention Network
Residual Attention Network for Image Classification
Stars: ✭ 525 (+619.18%)
Mutual labels:  attention
Tf Rnn Attention
TensorFlow implementation of an attention mechanism for text classification tasks.
Stars: ✭ 735 (+906.85%)
Mutual labels:  attention
Biblosa Pytorch
Re-implementation of Bi-Directional Block Self-Attention for Fast and Memory-Efficient Sequence Modeling (T. Shen et al., ICLR 2018) in PyTorch.
Stars: ✭ 43 (-41.1%)
Mutual labels:  attention
Attention Is All You Need Pytorch
A PyTorch implementation of the Transformer model in "Attention is All You Need".
Stars: ✭ 6,070 (+8215.07%)
Mutual labels:  attention
Yolov4 Pytorch
A PyTorch repository of YOLOv4, attentive YOLOv4, and MobileNet YOLOv4 with PASCAL VOC and COCO
Stars: ✭ 1,070 (+1365.75%)
Mutual labels:  attention
Banglatranslator
Bangla Machine Translator
Stars: ✭ 21 (-71.23%)
Mutual labels:  attention
Structured Self Attention
A Structured Self-attentive Sentence Embedding
Stars: ✭ 459 (+528.77%)
Mutual labels:  attention
Recurrent Visual Attention
A PyTorch Implementation of "Recurrent Models of Visual Attention"
Stars: ✭ 414 (+467.12%)
Mutual labels:  attention
Pytorch Gat
My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entropy histograms. I've supported both Cora (transductive) and PPI (inductive) examples!
Stars: ✭ 908 (+1143.84%)
Mutual labels:  attention
Gvb
Code for Gradually Vanishing Bridge for Adversarial Domain Adaptation (CVPR 2020)
Stars: ✭ 52 (-28.77%)
Mutual labels:  domain-adaptation
Text Classification
Implementations of papers for the text classification task on DBpedia
Stars: ✭ 682 (+834.25%)
Mutual labels:  attention
Global Self Attention Network
A PyTorch implementation of Global Self-Attention Network, a fully attentional backbone for vision tasks
Stars: ✭ 64 (-12.33%)
Mutual labels:  attention
Vad
Voice activity detection (VAD) toolkit including DNN-, bDNN-, LSTM-, and ACAM-based VAD. We also provide our directly recorded dataset.
Stars: ✭ 622 (+752.05%)
Mutual labels:  attention
Self Supervised Da
Self-supervised domain adaptation
Stars: ✭ 36 (-50.68%)
Mutual labels:  domain-adaptation
Tf Dann
Domain-Adversarial Neural Network in TensorFlow
Stars: ✭ 556 (+661.64%)
Mutual labels:  domain-adaptation
Deep Transfer Learning
Deep Transfer Learning Papers
Stars: ✭ 68 (-6.85%)
Mutual labels:  domain-adaptation
Chinesenre
Chinese entity-relation extraction in PyTorch, using a BiLSTM + attention model
Stars: ✭ 463 (+534.25%)
Mutual labels:  attention
Generalizing Reid
Repository of the paper "Generalizing Person Re-Identification by Camera-Aware Instance Learning and Cross-Domain Mixup"
Stars: ✭ 28 (-61.64%)
Mutual labels:  domain-adaptation
Mac Network
Implementation for the paper "Compositional Attention Networks for Machine Reasoning" (Hudson and Manning, ICLR 2018)
Stars: ✭ 444 (+508.22%)
Mutual labels:  attention
Pointer Networks Experiments
Sorting numbers with pointer networks
Stars: ✭ 53 (-27.4%)
Mutual labels:  attention
Transferlearning
Transfer learning / domain adaptation / domain generalization / multi-task learning, etc. Papers, code, datasets, applications, and tutorials.
Stars: ✭ 8,481 (+11517.81%)
Mutual labels:  domain-adaptation
Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+458.9%)
Mutual labels:  attention
Dann
PyTorch implementation of Domain-Adversarial Training of Neural Networks
Stars: ✭ 400 (+447.95%)
Mutual labels:  domain-adaptation
Pot
POT: Python Optimal Transport
Stars: ✭ 929 (+1172.6%)
Mutual labels:  domain-adaptation
Meta Learning Bert
Meta learning with BERT as a learner
Stars: ✭ 52 (-28.77%)
Mutual labels:  domain-adaptation
Spatial Transformer Network
A TensorFlow implementation of Spatial Transformer Networks.
Stars: ✭ 794 (+987.67%)
Mutual labels:  attention
Deeplearning Nlp Models
A small, interpretable codebase containing re-implementations of a few "deep" NLP models in PyTorch, with Colab notebooks to run on GPUs. Models: word2vec, CNNs, transformer, GPT.
Stars: ✭ 64 (-12.33%)
Mutual labels:  attention
Nlp paper study
Close readings of top-conference papers and reproductions of their code
Stars: ✭ 691 (+846.58%)
Mutual labels:  attention
Sentences pair similarity calculation siamese lstm
A Keras implementation of an attention-based Siamese Manhattan LSTM
Stars: ✭ 48 (-34.25%)
Mutual labels:  attention
Transfer Learning Library
Transfer-Learning-Library
Stars: ✭ 678 (+828.77%)
Mutual labels:  domain-adaptation
Enjoy Hamburger
[ICLR 2021] Is Attention Better Than Matrix Decomposition?
Stars: ✭ 69 (-5.48%)
Mutual labels:  attention
Awesome Fast Attention
A list of efficient attention modules
Stars: ✭ 627 (+758.9%)
Mutual labels:  attention
Attentions
PyTorch implementations of several attention mechanisms for deep learning researchers.
Stars: ✭ 39 (-46.58%)
Mutual labels:  attention
Simplecvreproduction
Reproductions of simple CV projects, including attention modules, classification, object detection, segmentation, keypoint detection, tracking, etc. 😄
Stars: ✭ 602 (+724.66%)
Mutual labels:  attention
Attention Over Attention Tf Qa
Implementation of the AoA model from the paper "Attention-over-Attention Neural Networks for Reading Comprehension"
Stars: ✭ 58 (-20.55%)
Mutual labels:  attention
Speech Transformer
A PyTorch implementation of Speech Transformer, an end-to-end ASR system with a Transformer network for Mandarin Chinese.
Stars: ✭ 565 (+673.97%)
Mutual labels:  attention
Attentioncluster
TensorFlow Implementation of "Attention Clusters: Purely Attention Based Local Feature Integration for Video Classification"
Stars: ✭ 33 (-54.79%)
Mutual labels:  attention
Performer Pytorch
An implementation of Performer, a linear attention-based transformer, in PyTorch
Stars: ✭ 546 (+647.95%)
Mutual labels:  attention
Man
Multinomial Adversarial Networks for Multi-Domain Text Classification (NAACL 2018)
Stars: ✭ 72 (-1.37%)
Mutual labels:  domain-adaptation
Punctuator2
A bidirectional recurrent neural network model with attention mechanism for restoring missing punctuation in unsegmented text
Stars: ✭ 483 (+561.64%)
Mutual labels:  attention
Defactonlp
DeFactoNLP: An Automated Fact-checking System that uses Named Entity Recognition, TF-IDF vector comparison and Decomposable Attention models.
Stars: ✭ 30 (-58.9%)
Mutual labels:  attention
Rnn Nlu
A TensorFlow implementation of Recurrent Neural Networks for Sequence Classification and Sequence Labeling
Stars: ✭ 463 (+534.25%)
Mutual labels:  attention
Fluence
A deep learning library based on PyTorch, focused on low-resource language research and robustness
Stars: ✭ 54 (-26.03%)
Mutual labels:  attention
Ban Vqa
Bilinear attention networks for visual question answering
Stars: ✭ 449 (+515.07%)
Mutual labels:  attention
Domainadaptation
Repository for the article "Unsupervised domain adaptation for medical imaging segmentation with self-ensembling".
Stars: ✭ 27 (-63.01%)
Mutual labels:  domain-adaptation
Gansformer
Generative Adversarial Transformers
Stars: ✭ 421 (+476.71%)
Mutual labels:  attention
Cross Domain ner
Cross-domain NER using cross-domain language modeling; code for the ACL 2019 paper
Stars: ✭ 67 (-8.22%)
Mutual labels:  domain-adaptation
Pytorch Original Transformer
My implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. Currently includes IWSLT pretrained models.
Stars: ✭ 411 (+463.01%)
Mutual labels:  attention
Isab Pytorch
An implementation of (Induced) Set Attention Block, from the Set Transformers paper
Stars: ✭ 21 (-71.23%)
Mutual labels:  attention
Deep learning nlp
Keras, PyTorch, and NumPy Implementations of Deep Learning Architectures for NLP
Stars: ✭ 407 (+457.53%)
Mutual labels:  attention
Text Classification Keras
📚 Text classification library with Keras
Stars: ✭ 53 (-27.4%)
Mutual labels:  attention
Nlp Tutorials
Simple implementations of NLP models. Tutorials are written in Chinese on my website https://mofanpy.com
Stars: ✭ 394 (+439.73%)
Mutual labels:  attention
Nlp tensorflow project
Uses TensorFlow to implement several NLP projects, e.g. classification, chatbot, NER, attention, QA, etc.
Stars: ✭ 27 (-63.01%)
Mutual labels:  attention
Absa Pytorch
Aspect-Based Sentiment Analysis, PyTorch implementations.
Stars: ✭ 1,181 (+1517.81%)
Mutual labels:  attention
Libtlda
Library of transfer learners and domain-adaptive classifiers.
Stars: ✭ 71 (-2.74%)
Mutual labels:  domain-adaptation
Scl
Implementation of "SCL: Towards Accurate Domain Adaptive Object Detection via Gradient Detach Based Stacked Complementary Losses"
Stars: ✭ 65 (-10.96%)
Mutual labels:  domain-adaptation
Time Attention
Implementation of an RNN for time-series prediction from the paper https://arxiv.org/abs/1704.02971
Stars: ✭ 52 (-28.77%)
Mutual labels:  attention
1-60 of 293 similar projects