Pytorch Sentiment Analysis: Tutorials on getting started with PyTorch and TorchText for sentiment analysis.
Stars: ✭ 3,209 (+9624.24%)
Spanish Word Embeddings: Spanish word embeddings computed with different methods and from different corpora
Stars: ✭ 236 (+615.15%)
Koan: A word2vec negative-sampling implementation with a correct CBOW update.
Stars: ✭ 232 (+603.03%)
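The CBOW subtlety Koan's tagline refers to: when the hidden layer is the *average* of the context vectors, the gradient flowing back to each context vector should be divided by the context size; the original word2vec.c applies the full gradient to every context word. A minimal pure-Python sketch of one negative-sampling step illustrating the divided update (toy vectors and function name are illustrative, not Koan's actual code):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def cbow_ns_step(context_vecs, target_vec, negative_vecs, lr=0.1):
    """One CBOW negative-sampling training step.

    The hidden layer is the mean of the context vectors, so the
    gradient w.r.t. that mean is split evenly (divided by the
    context size) before being applied to each context vector --
    the 'correct' CBOW update.
    """
    dim = len(target_vec)
    n = len(context_vecs)
    # Hidden layer: element-wise average of the context vectors.
    h = [sum(v[d] for v in context_vecs) / n for d in range(dim)]
    grad_h = [0.0] * dim
    # Positive sample (label 1) plus negative samples (label 0).
    for vec, label in [(target_vec, 1.0)] + [(v, 0.0) for v in negative_vecs]:
        score = sigmoid(sum(h[d] * vec[d] for d in range(dim)))
        g = lr * (label - score)
        for d in range(dim):
            grad_h[d] += g * vec[d]  # accumulate gradient for the context side
            vec[d] += g * h[d]       # update the output (target/negative) vector
    # Divide by the context size before applying to each context vector.
    for v in context_vecs:
        for d in range(dim):
            v[d] += grad_h[d] / n
```

Omitting the `/ n` (as word2vec.c does) effectively scales the context-side learning rate by the window size, which appears to be the discrepancy the "correct CBOW update" description points at.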
Wordgcn: Incorporating Syntactic and Semantic Information in Word Embeddings using Graph Convolutional Networks (ACL 2019)
Stars: ✭ 230 (+596.97%)
Question Generation: Generating multiple-choice questions from text using machine learning.
Stars: ✭ 227 (+587.88%)
Chameleon recsys: Source code of CHAMELEON, a deep learning meta-architecture for news recommender systems
Stars: ✭ 202 (+512.12%)
Shallowlearn: An experiment in re-implementing supervised learning models based on shallow neural-network approaches (e.g. fastText), with some additional exclusive features and a nice API. Written in Python and fully compatible with scikit-learn.
Stars: ✭ 196 (+493.94%)
Jfasttext: Java interface for fastText
Stars: ✭ 193 (+484.85%)
Germanwordembeddings: Toolkit to obtain and preprocess German corpora, train models using word2vec (gensim), and evaluate them with generated test sets
Stars: ✭ 189 (+472.73%)
Vec4ir: Word Embeddings for Information Retrieval
Stars: ✭ 188 (+469.7%)
Datastories Semeval2017 Task4: Deep-learning model presented in "DataStories at SemEval-2017 Task 4: Deep LSTM with Attention for Message-level and Topic-based Sentiment Analysis".
Stars: ✭ 184 (+457.58%)
Texthero: Text preprocessing, representation and visualization from zero to hero.
Stars: ✭ 2,407 (+7193.94%)
Debiaswe: Remove problematic gender bias from word embeddings.
Stars: ✭ 175 (+430.3%)
Sifrank zh: Chinese keyphrase extraction based on pre-trained language models (Chinese-version code for the paper "SIFRank: A New Baseline for Unsupervised Keyphrase Extraction Based on Pre-trained Language Model")
Stars: ✭ 175 (+430.3%)
Lftm: Improving the LDA and DMM topic models (a one-topic-per-document model for short texts) with word embeddings (TACL 2015)
Stars: ✭ 168 (+409.09%)
Gensim: Topic Modelling for Humans
Stars: ✭ 12,763 (+38575.76%)
Mimick: Code for Mimicking Word Embeddings using Subword RNNs (EMNLP 2017)
Stars: ✭ 152 (+360.61%)
Elmo Tutorial: A short tutorial on ELMo training (pre-trained, training on new data, incremental training)
Stars: ✭ 145 (+339.39%)
Fasttext.js: FastText for Node.js
Stars: ✭ 127 (+284.85%)
Hash Embeddings: PyTorch implementation of Hash Embeddings (NIPS 2017). Submission to the NIPS Implementation Challenge.
Stars: ✭ 126 (+281.82%)
Scattertext: Beautiful visualizations of how language differs among document types.
Stars: ✭ 1,722 (+5118.18%)
Dna2vec: Consistent vector representations of variable-length k-mers
Stars: ✭ 117 (+254.55%)
Flair: A very simple framework for state-of-the-art Natural Language Processing (NLP)
Stars: ✭ 11,065 (+33430.3%)
Danlp: DaNLP is a repository of Natural Language Processing resources for the Danish language.
Stars: ✭ 111 (+236.36%)
Kadot: An unsupervised natural language processing library.
Stars: ✭ 108 (+227.27%)
Easy Bert: A Dead Simple BERT API for Python and Java (https://github.com/google-research/bert)
Stars: ✭ 106 (+221.21%)
Magnitude: A fast, efficient universal vector embedding utility package.
Stars: ✭ 1,394 (+4124.24%)
Fastrtext: R wrapper for fastText
Stars: ✭ 103 (+212.12%)
Text Summarizer: Python Framework for Extractive Text Summarization
Stars: ✭ 96 (+190.91%)
Postgres Word2vec: Utilities for using word embeddings (e.g. word2vec vectors) in a Postgres database
Stars: ✭ 96 (+190.91%)
Dict2vec: A framework to learn word embeddings using lexical dictionaries.
Stars: ✭ 91 (+175.76%)
Glove As A Tensorflow Embedding Layer: Takes a pretrained GloVe model and uses it as a TensorFlow embedding weight layer **inside the GPU**, so only word indices need to be sent over the GPU data-transfer bus, reducing transfer overhead.
Stars: ✭ 85 (+157.58%)
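The idea behind that repo: keep the full vocab-by-dimension embedding matrix resident on the device and ship only integer indices across the transfer bus. A toy pure-Python sketch of the lookup and the bandwidth saving (function names and byte sizes are illustrative assumptions, not the repo's code):

```python
def embedding_lookup(matrix, indices):
    """Device-side lookup: rows come from the resident embedding
    matrix, so only the integer indices cross the host-device bus."""
    return [matrix[i] for i in indices]

def transfer_savings(dim, n_tokens, bytes_per_float=4, bytes_per_index=4):
    """Ratio of bytes sent if full vectors were shipped per token
    versus sending only one index per token."""
    vector_bytes = n_tokens * dim * bytes_per_float
    index_bytes = n_tokens * bytes_per_index
    return vector_bytes / index_bytes

# For 300-d float32 GloVe vectors, sending indices moves 300x less data.
```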
Clustercat: Fast Word Clustering Software
Stars: ✭ 65 (+96.97%)
Textblob Ar: Arabic support for TextBlob
Stars: ✭ 60 (+81.82%)
Nlp overview: Overview of Modern Deep Learning Techniques Applied to Natural Language Processing
Stars: ✭ 1,104 (+3245.45%)
Lstm Context Embeddings: Augmenting word embeddings with their surrounding context using a bidirectional RNN
Stars: ✭ 57 (+72.73%)
Average Word2vec: 🔤 Calculate average word embeddings (word2vec) from documents for transfer learning
Stars: ✭ 52 (+57.58%)
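Averaging pretrained word vectors is one of the simplest ways to get a fixed-size document embedding for transfer learning. A minimal pure-Python sketch of the technique that repo implements (the toy 3-d vectors below are made up for illustration, not real pretrained values):

```python
def average_embedding(tokens, vectors):
    """Average the vectors of all in-vocabulary tokens into one
    fixed-size document embedding; out-of-vocabulary tokens are
    skipped. Returns None if no token is in the vocabulary."""
    known = [vectors[t] for t in tokens if t in vectors]
    if not known:
        return None
    dim = len(known[0])
    return [sum(v[d] for v in known) / len(known) for d in range(dim)]

# Toy 3-d "word2vec" table (illustrative values only).
toy = {"good": [1.0, 0.0, 2.0], "movie": [0.0, 2.0, 0.0]}
doc_vec = average_embedding(["good", "movie", "unk"], toy)  # → [0.5, 1.0, 1.0]
```

In practice the same function would be fed vectors loaded from a real pretrained model (e.g. via gensim), and the resulting document vectors used as features for a downstream classifier.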
Embeddingsviz: Visualize word embeddings of a vocabulary in TensorBoard, including their neighbors
Stars: ✭ 40 (+21.21%)
Top2vec: Top2Vec learns jointly embedded topic, document and word vectors.
Stars: ✭ 972 (+2845.45%)