seq3: Source code for the NAACL 2019 paper "SEQ^3: Differentiable Sequence-to-Sequence-to-Sequence Autoencoder for Unsupervised Abstractive Sentence Compression"
Stars: ✭ 121 (+236.11%)
Optic-Disc-Unet: Attention U-Net model with post-processing for retina optic disc segmentation
Stars: ✭ 77 (+113.89%)
Machine-Translation-Hindi-to-english: Machine translation is the task of converting text from one language into another. Unlike traditional phrase-based translation systems, which consist of many small sub-components tuned separately, neural machine translation builds and trains a single, large neural network that reads a sentence and outputs a correct translation.
Stars: ✭ 19 (-47.22%)
Transformer Temporal Tagger: Code and data from the paper "BERT Got a Date: Introducing Transformers to Temporal Tagging"
Stars: ✭ 55 (+52.78%)
CVAE Dial: CVAE_XGate model from the paper "Xu, Dusek, Konstas, Rieser. Better Conversations by Modeling, Filtering, and Optimizing for Coherence and Diversity"
Stars: ✭ 16 (-55.56%)
lang2logic-PyTorch: PyTorch port of the paper "Language to Logical Form with Neural Attention"
Stars: ✭ 34 (-5.56%)
Seq2Seq-chatbot: TensorFlow implementation of a Twitter chatbot
Stars: ✭ 18 (-50%)
bytenet translation: A TensorFlow implementation of machine translation from "Neural Machine Translation in Linear Time"
Stars: ✭ 60 (+66.67%)
visdial: Visual Dialog: Light-weight Transformer for Many Inputs (ECCV 2020)
Stars: ✭ 27 (-25%)
seq2seq-autoencoder: Theano implementation of a sequence-to-sequence autoencoder
Stars: ✭ 12 (-66.67%)
STAM-pytorch: Implementation of STAM (Space Time Attention Model), a pure and simple attention model that reaches SOTA for video classification
Stars: ✭ 109 (+202.78%)
NiuTrans.NMT: A fast neural machine translation system, developed in C++ and relying on NiuTensor for fast tensor APIs
Stars: ✭ 112 (+211.11%)
tensorflow-chatbot-chinese: Web chatbot | TensorFlow implementation of a seq2seq model with Bahdanau attention and pretrained Word2Vec embeddings
Stars: ✭ 50 (+38.89%)
SiGAT: Source code for signed graph attention networks (ICANN 2019) & SDGNN (AAAI 2021)
Stars: ✭ 37 (+2.78%)
Multi-task-Conditional-Attention-Networks: A prototype version of our submitted paper: Conversion Prediction Using Multi-task Conditional Attention Networks to Support the Creation of Effective Ad Creatives
Stars: ✭ 21 (-41.67%)
RETRO-pytorch: Implementation of RETRO, DeepMind's retrieval-based attention net, in PyTorch
Stars: ✭ 473 (+1213.89%)
question-generation: Neural Models for Key Phrase Detection and Question Generation
Stars: ✭ 29 (-19.44%)
Transformers-RL: An easy PyTorch implementation of "Stabilizing Transformers for Reinforcement Learning"
Stars: ✭ 107 (+197.22%)
vat nmt: Implementation of "Effective Adversarial Regularization for Neural Machine Translation" (ACL 2019)
Stars: ✭ 22 (-38.89%)
chatbot: A Chinese chatbot based on deep learning, with detailed tutorials and code; every file is thoroughly commented, making it a great choice for learning
Stars: ✭ 94 (+161.11%)
axial-attention: Implementation of axial attention, which attends to multi-dimensional data efficiently
Stars: ✭ 245 (+580.56%)
DARNN: A Dual-Stage Attention-Based Recurrent Neural Network for Time Series Prediction
Stars: ✭ 90 (+150%)
lstm-attention: Attention-based bidirectional LSTM for classification tasks (ICASSP)
Stars: ✭ 87 (+141.67%)
Neural-Machine-Translation: Several basic neural machine translation models implemented in PyTorch & TensorFlow
Stars: ✭ 29 (-19.44%)
SRB: Code for "Improving Semantic Relevance for Sequence-to-Sequence Learning of Chinese Social Media Text Summarization"
Stars: ✭ 41 (+13.89%)
datastories-semeval2017-task6: Deep-learning model presented in "DataStories at SemEval-2017 Task 6: Siamese LSTM with Attention for Humorous Text Comparison"
Stars: ✭ 20 (-44.44%)
sentence2vec: Deep sentence embedding using sequence-to-sequence learning
Stars: ✭ 23 (-36.11%)
stanford-cs231n-assignments-2020: This repository contains my solutions to the assignments for Stanford's CS231n "Convolutional Neural Networks for Visual Recognition" (Spring 2020)
Stars: ✭ 84 (+133.33%)
sentencepiece-jni: Java JNI wrapper for SentencePiece, an unsupervised text tokenizer for neural-network-based text generation
Stars: ✭ 26 (-27.78%)
skt: Sanskrit compound segmentation using a seq2seq model
Stars: ✭ 21 (-41.67%)
chatbot: Seq2Seq chatbot with an attention mechanism
Stars: ✭ 19 (-47.22%)
rnn-text-classification-tf: TensorFlow implementation of attention-based bidirectional RNN text classification
Stars: ✭ 26 (-27.78%)
Speech recognition with tensorflow: Implementation of a seq2seq model for speech recognition using the latest version of TensorFlow. Architecture similar to Listen, Attend and Spell.
Stars: ✭ 253 (+602.78%)
Mead Baseline: Deep-learning model exploration and development for NLP
Stars: ✭ 238 (+561.11%)
long-short-transformer: Implementation of Long-Short Transformer, combining local and global inductive biases for attention over long sequences, in PyTorch
Stars: ✭ 103 (+186.11%)
CrabNet: Predict materials properties using only the composition information!
Stars: ✭ 57 (+58.33%)
kospeech: Open-source toolkit for end-to-end Korean automatic speech recognition leveraging PyTorch and Hydra
Stars: ✭ 456 (+1166.67%)
Debug seq2seq: [unmaintained] Make seq2seq for Keras work
Stars: ✭ 233 (+547.22%)
Text summarization with tensorflow: Implementation of a seq2seq model for summarization of textual data. Demonstrated on Amazon reviews, GitHub issues, and news articles.
Stars: ✭ 226 (+527.78%)
Nlp Tools: 😋 This project implements Chinese word segmentation, part-of-speech tagging, and named entity recognition (NER) with BiLSTM+CRF in TensorFlow
Stars: ✭ 225 (+525%)
Paddlenlp: NLP core library and model zoo based on PaddlePaddle 2.0
Stars: ✭ 212 (+488.89%)
MT-Preparation: Machine translation (MT) preparation scripts
Stars: ✭ 15 (-58.33%)
En-transformer: Implementation of E(n)-Transformer, which extends the ideas of Welling's E(n)-Equivariant Graph Neural Network to attention
Stars: ✭ 131 (+263.89%)
Headliner: 🏖 Easy training and deployment of seq2seq models
Stars: ✭ 221 (+513.89%)
convolutional seq2seq: fairseq's Convolutional Sequence to Sequence Learning (Gehring et al. 2017) implemented in Chainer
Stars: ✭ 63 (+75%)
Screenshot To Code: A neural network that transforms a design mock-up into a static website
Stars: ✭ 13,561 (+37569.44%)
zero: A neural machine translation system
Stars: ✭ 121 (+236.11%)
Compact-Global-Descriptor: PyTorch implementation of "Compact Global Descriptor for Neural Networks" (CGD)
Stars: ✭ 22 (-38.89%)