Deeplearning Nlp Models: A small, interpretable codebase containing re-implementations of a few "deep" NLP models in PyTorch, with Colab notebooks for running on GPUs. Models: word2vec, CNNs, transformer, GPT.
Stars: ✭ 64 (+120.69%)
Mutual labels: word2vec, transformer, attention
Numpy Ml: Machine learning, in NumPy
Stars: ✭ 11,100 (+38175.86%)
Mutual labels: word2vec, lstm, attention
Image Caption Generator: A neural network to generate captions for an image using a CNN and an RNN with beam search.
Stars: ✭ 126 (+334.48%)
Mutual labels: lstm, attention, attention-model
Pytorch Seq2seq: Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+11686.21%)
Mutual labels: transformer, lstm, attention
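Attention is the label these seq2seq and transformer repositories share. As a rough illustration of the common building block (a minimal NumPy sketch, not code from any of the listed projects), scaled dot-product attention computes softmax(QKᵀ/√d_k)·V:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # (n_q, n_k) similarity scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.standard_normal((2, 4))  # 2 queries of dimension d_k = 4
K = rng.standard_normal((3, 4))  # 3 keys
V = rng.standard_normal((3, 4))  # 3 values
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4); each row of w sums to 1
```

The listed implementations add masking, multiple heads, and learned projections on top of this core operation.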
Nlp Journey: Documents, papers and code related to Natural Language Processing, including topic models, word embeddings, named entity recognition, text classification, text generation, text similarity, machine translation, etc. All code is implemented in TensorFlow 2.0.
Stars: ✭ 1,290 (+4348.28%)
Mutual labels: word2vec, attention
Neural Networks: All about Neural Networks!
Stars: ✭ 34 (+17.24%)
Mutual labels: word2vec, lstm
Repo 2016: R, Python and Mathematica codes in Machine Learning, Deep Learning, Artificial Intelligence, NLP and Geolocation
Stars: ✭ 103 (+255.17%)
Mutual labels: word2vec, lstm
Nlp research: TensorFlow-based NLP deep-learning projects supporting four tasks: text classification, sentence matching, sequence labeling, and text generation.
Stars: ✭ 141 (+386.21%)
Mutual labels: word2vec, transformer
Embedding As Service: One-stop solution to encode sentences into fixed-length vectors using various embedding techniques.
Stars: ✭ 151 (+420.69%)
Mutual labels: word2vec, transformer
Chameleon recsys: Source code of CHAMELEON - A Deep Learning Meta-Architecture for News Recommender Systems
Stars: ✭ 202 (+596.55%)
Mutual labels: word2vec, lstm
reasoning attention: Unofficial implementations of attention models on the SNLI dataset
Stars: ✭ 34 (+17.24%)
Mutual labels: attention, attention-model
Deep learning nlp: Keras, PyTorch, and NumPy Implementations of Deep Learning Architectures for NLP
Stars: ✭ 407 (+1303.45%)
Mutual labels: word2vec, attention
seq2seq-pytorch: Sequence-to-Sequence Models in PyTorch
Stars: ✭ 41 (+41.38%)
Mutual labels: transformer, attention
EBIM-NLI: Enhanced BiLSTM Inference Model for Natural Language Inference
Stars: ✭ 24 (-17.24%)
Mutual labels: lstm, attention
h-transformer-1d: Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (+317.24%)
Mutual labels: transformer, attention
Attentionwalk: A PyTorch Implementation of "Watch Your Step: Learning Node Embeddings via Graph Attention" (NeurIPS 2018).
Stars: ✭ 266 (+817.24%)
Mutual labels: word2vec, attention
game2vec: TensorFlow implementation of word2vec applied to the https://www.kaggle.com/tamber/steam-video-games dataset, using both CBOW and Skip-gram.
Stars: ✭ 62 (+113.79%)
Mutual labels: word2vec, cbow
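The CBOW and Skip-gram variants mentioned above differ only in how training pairs are built from a window around each token: Skip-gram predicts each context word from the center word, while CBOW predicts the center word from its context. A minimal sketch of that pair generation (illustrative only; `training_pairs` is a hypothetical helper, not from game2vec):

```python
def training_pairs(tokens, window=2, mode="skipgram"):
    """Generate (input, target) word2vec training pairs.

    skipgram: one (center, context_word) pair per context word.
    cbow:     one (context_words, center) pair per position.
    """
    pairs = []
    for i, center in enumerate(tokens):
        # All tokens within `window` positions of i, excluding i itself.
        context = [tokens[j]
                   for j in range(max(0, i - window),
                                  min(len(tokens), i + window + 1))
                   if j != i]
        if mode == "skipgram":
            pairs.extend((center, c) for c in context)
        else:  # cbow
            pairs.append((context, center))
    return pairs

sentence = "the cat sat on the mat".split()
print(training_pairs(sentence, window=1, mode="skipgram")[:3])
# → [('the', 'cat'), ('cat', 'the'), ('cat', 'sat')]
```

Either pair stream is then fed to a shallow network whose hidden layer becomes the word embeddings.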
Neural-Machine-Translation: Several basic neural machine translation models implemented in PyTorch & TensorFlow
Stars: ✭ 29 (+0%)
Mutual labels: transformer, seq2seq-model
Persian-Sentiment-Analyzer: Persian sentiment analysis
Stars: ✭ 30 (+3.45%)
Mutual labels: word2vec, lstm
Text-Analysis: Explaining textual-analysis tools in Python, covering preprocessing, Skip-gram (word2vec), and topic modelling.
Stars: ✭ 48 (+65.52%)
Mutual labels: word2vec, lstm