Seq2seq Pytorch: Sequence to Sequence Models with PyTorch
Stars: ✭ 678 (+716.87%)
NLP-paper: 🎨 NLP (natural language processing) tutorials 🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-72.29%)
Cs224n: Assignments for CS224n (Natural Language Processing with Deep Learning), Winter 2017
Stars: ✭ 656 (+690.36%)
Skip Thoughts.torch: Porting of Skip-Thoughts pretrained models from Theano to PyTorch & Torch7
Stars: ✭ 146 (+75.9%)
GAN-RNN Timeseries-imputation: Recurrent GAN for imputation of time-series data, implemented in TensorFlow 2 on the Wikipedia Web Traffic Forecast dataset from Kaggle.
Stars: ✭ 107 (+28.92%)
Chinese Chatbot: a Chinese chatbot trained on 100,000 dialogue pairs using an attention mechanism; it generates a meaningful reply to most ordinary questions. The trained model is uploaded and can be run directly (the author jokes they will livestream eating their keyboard if it fails to run).
Stars: ✭ 124 (+49.4%)
theano-recurrence: Recurrent neural networks (RNN, GRU, LSTM) and their bidirectional versions (BiRNN, BiGRU, BiLSTM) for word- and character-level language modelling in Theano
Stars: ✭ 40 (-51.81%)
altair: Assessing Source Code Semantic Similarity with Unsupervised Learning
Stars: ✭ 42 (-49.4%)
Mxnet Seq2seq: Sequence-to-sequence learning with MXNet
Stars: ✭ 51 (-38.55%)
Embedding: a summary of embedding-model code and study notes
Stars: ✭ 25 (-69.88%)
Pytorch Seq2seq: Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+4018.07%)
Chameleon recsys: Source code of CHAMELEON, a deep learning meta-architecture for news recommender systems
Stars: ✭ 202 (+143.37%)
Customer-Feedback-Analysis: Multi-class text (feedback) classification using a CNN, a GRU network, and pretrained Word2Vec word embeddings, on TensorFlow.
Stars: ✭ 18 (-78.31%)
NLP PEMDC: NLP Pretrained Embeddings, Models and Datasets Collections (NLP_PEMDC). The collection is kept up to date.
Stars: ✭ 58 (-30.12%)
wmd4j: a Java library for calculating Word Mover's Distance (WMD)
Stars: ✭ 31 (-62.65%)
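WMD measures document distance as the minimum cost of "moving" one document's words to the other's in embedding space. Solving the full transport problem needs an optimal-transport solver, but the well-known relaxed lower bound (each word simply travels to its nearest counterpart) can be sketched in a few lines of pure Python. The 2-D vectors below are invented for illustration; real use runs over pretrained word2vec embeddings.

```python
import math

# Hypothetical toy embeddings; real WMD uses pretrained word2vec vectors.
vecs = {
    "obama":     (1.0, 0.0),
    "president": (0.9, 0.1),
    "speaks":    (0.0, 1.0),
    "talks":     (0.1, 0.9),
    "media":     (0.5, 0.5),
    "press":     (0.55, 0.45),
}

def euclid(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def relaxed_wmd(doc1, doc2):
    """Lower bound on WMD: every word in doc1 moves all of its mass
    to its nearest word in doc2, skipping the full transport problem."""
    weight = 1.0 / len(doc1)  # uniform bag-of-words weights
    return sum(weight * min(euclid(vecs[w], vecs[v]) for v in doc2)
               for w in doc1)

d_close = relaxed_wmd(["obama", "speaks", "media"],
                      ["president", "talks", "press"])
d_far = relaxed_wmd(["obama", "speaks", "media"],
                    ["media", "media", "media"])
```

Semantically close paraphrases yield a smaller distance than unrelated text, which is the property the full WMD (and libraries like wmd4j) exploit.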
dynmt-py: Neural machine translation implementation using DyNet's Python bindings
Stars: ✭ 17 (-79.52%)
gonnp: 📉 Deep learning from scratch in Go, specializing in natural language processing
Stars: ✭ 26 (-68.67%)
seq3: Source code for the NAACL 2019 paper "SEQ^3: Differentiable Sequence-to-Sequence-to-Sequence Autoencoder for Unsupervised Abstractive Sentence Compression"
Stars: ✭ 121 (+45.78%)
FARED for Anomaly Detection: Official source code of "Fast Adaptive RNN Encoder-Decoder for Anomaly Detection in SMD Assembly Machine"
Stars: ✭ 14 (-83.13%)
YodaSpeak: Translating English to Yoda English using sequence-to-sequence with TensorFlow.
Stars: ✭ 25 (-69.88%)
test word2vec uyghur: an example testing Uyghur script with the word2vec algorithm from Python's gensim library.
Stars: ✭ 15 (-81.93%)
modules: The official repository for our paper "Are Neural Nets Modular? Inspecting Functional Modularity Through Differentiable Weight Masks". We develop a method for analyzing emerging functional modularity in neural networks based on differentiable weight masks and use it to point out important issues in current-day neural networks.
Stars: ✭ 25 (-69.88%)
navec: Compact, high-quality word embeddings for the Russian language
Stars: ✭ 118 (+42.17%)
word2vec: Rust interface to word2vec.
Stars: ✭ 22 (-73.49%)
cnn-rnn-classifier: A practical example of combining a CNN and an RNN to classify images.
Stars: ✭ 47 (-43.37%)
Solar-Rad-Forecasting: Notebooks capturing the entire research and implementation process of building various neural-network-based machine learning models that predict solar radiation levels from historical data collected by meteorological stations.
Stars: ✭ 24 (-71.08%)
adversarial-code-generation: Source code for the ICLR 2021 paper "Generating Adversarial Computer Programs using Optimized Obfuscations"
Stars: ✭ 16 (-80.72%)
doc2vec-golang: doc2vec and word2vec implemented in Go; word-embedding representations.
Stars: ✭ 33 (-60.24%)
chatbot: A Chinese chatbot based on deep learning, with detailed tutorials and thoroughly commented code; a good choice for learning.
Stars: ✭ 94 (+13.25%)
sentence2vec: Deep sentence embedding using sequence-to-sequence learning
Stars: ✭ 23 (-72.29%)
dnn-lstm-word-segment: Chinese word segmentation based on deep learning and an LSTM neural network
Stars: ✭ 24 (-71.08%)
Shakespearizing-Modern-English: Code for "Jhamtani H.*, Gangal V.*, Hovy E. and Nyberg E. Shakespearizing Modern Language Using Copy-Enriched Sequence to Sequence Models", Workshop on Stylistic Variation, EMNLP 2017
Stars: ✭ 64 (-22.89%)
chatbot: a task-oriented QA and KBQA chatbot built on seq2seq (TensorFlow) and information retrieval, using Neo4j and Jena.
Stars: ✭ 32 (-61.45%)
revery: A personal semantic search engine capable of surfacing relevant bookmarks, journal entries, notes, blogs, contacts, and more, built on an efficient document-embedding algorithm and Monocle's personal search index.
Stars: ✭ 200 (+140.96%)
sp2cp: Imageboard bot with a recurrent neural network (RNN, GRU)
Stars: ✭ 23 (-72.29%)
cnn-rnn-bitcoin: Reusable CNN and RNN models for time-series binary classification
Stars: ✭ 28 (-66.27%)
transformer: A PyTorch implementation of "Attention Is All You Need"
Stars: ✭ 28 (-66.27%)
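The core operation of "Attention Is All You Need" is scaled dot-product attention, softmax(QKᵀ/√d_k)·V. A minimal NumPy sketch (the shapes below are illustrative, not from any particular repo):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    scores -= scores.max(axis=-1, keepdims=True)  # stabilize the softmax
    w = np.exp(scores)
    w /= w.sum(axis=-1, keepdims=True)            # rows are attention weights
    return w @ V, w

rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))   # 3 query positions, d_k = 4
K = rng.normal(size=(5, 4))   # 5 key positions
V = rng.normal(size=(5, 2))   # values with d_v = 2
out, weights = scaled_dot_product_attention(Q, K, V)
```

Each output row is a convex combination of the value rows, weighted by how strongly the corresponding query attends to each key.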
solar-forecasting-RNN: Multi-time-horizon solar forecasting using recurrent neural networks
Stars: ✭ 29 (-65.06%)
ArrayLSTM: GPU/CPU (CUDA) implementation of "Recurrent Memory Array Structures": simple RNN, LSTM, Array LSTM.
Stars: ✭ 21 (-74.7%)
word-benchmarks: Benchmarks for intrinsic word-embedding evaluation.
Stars: ✭ 45 (-45.78%)
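Intrinsic word-similarity benchmarks typically rank a model's cosine similarities against human judgments using Spearman correlation. A minimal pure-Python sketch of that evaluation loop, using hypothetical toy vectors and gold scores (real benchmarks such as WordSim-353 ship thousands of scored pairs):

```python
import math

# Hypothetical toy vectors and human judgments, invented for illustration.
vecs = {
    "cat": (1.0, 0.1), "dog": (0.9, 0.2),
    "car": (0.1, 1.0), "truck": (0.25, 0.9),
}
gold = [(("cat", "dog"), 0.95), (("car", "truck"), 0.90),
        (("cat", "car"), 0.10), (("dog", "truck"), 0.15)]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) *
                  math.sqrt(sum(y * y for y in b)))

def spearman(x, y):
    """Spearman rho as the Pearson correlation of rank vectors (no ties)."""
    def rank(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for pos, i in enumerate(order):
            r[i] = pos
        return r
    rx, ry = rank(x), rank(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = math.sqrt(sum((a - mx) ** 2 for a in rx))
    sy = math.sqrt(sum((b - my) ** 2 for b in ry))
    return cov / (sx * sy)

model = [cosine(vecs[w1], vecs[w2]) for (w1, w2), _ in gold]
human = [score for _, score in gold]
rho = spearman(model, human)
```

Because rank correlation ignores the absolute similarity values, embeddings only need to order the pairs the way humans do to score well on these benchmarks.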