datastories-semeval2017-task6: Deep-learning model presented in "DataStories at SemEval-2017 Task 6: Siamese LSTM with Attention for Humorous Text Comparison".
Stars: ✭ 20 (-66.1%)
SentimentAnalysis: Sentiment analysis with a deep Bi-LSTM + attention model
Stars: ✭ 32 (-45.76%)
lda2vec: Mixing Dirichlet topic models and word embeddings to make lda2vec, from the paper https://arxiv.org/abs/1605.02019
Stars: ✭ 27 (-54.24%)
Dict2vec: A framework to learn word embeddings using lexical dictionaries.
Stars: ✭ 91 (+54.24%)
Fasttext.js: FastText for Node.js
Stars: ✭ 127 (+115.25%)
Datastories Semeval2017 Task4: Deep-learning model presented in "DataStories at SemEval-2017 Task 4: Deep LSTM with Attention for Message-level and Topic-based Sentiment Analysis".
Stars: ✭ 184 (+211.86%)
Magnitude: A fast, efficient universal vector embedding utility package.
Stars: ✭ 1,394 (+2262.71%)
NTUA-slp-nlp: 💻 Speech and Natural Language Processing (SLP & NLP) lab assignments for ECE NTUA
Stars: ✭ 19 (-67.8%)
dna2vec: Consistent vector representations of variable-length k-mers
Stars: ✭ 117 (+98.31%)
biovec: ProtVec can be used in protein interaction predictions, structure prediction, and protein data visualization.
Stars: ✭ 23 (-61.02%)
Text-Analysis: Explains textual analysis tools in Python, including preprocessing, skip-gram (word2vec), and topic modelling.
Stars: ✭ 48 (-18.64%)
game2vec: TensorFlow implementation of word2vec applied to the https://www.kaggle.com/tamber/steam-video-games dataset, using both CBOW and skip-gram.
Stars: ✭ 62 (+5.08%)
go2vec: Read and use word2vec vectors in Go
Stars: ✭ 44 (-25.42%)
Wego: Word embeddings (e.g. Word2Vec) in Go!
Stars: ✭ 336 (+469.49%)
SentimentAnalysis: (BOW, TF-IDF, Word2Vec, BERT) word embeddings + (SVM, Naive Bayes, Decision Tree, Random Forest) base classifiers + pre-trained BERT on TensorFlow Hub + 1-D CNN and bidirectional LSTM, on the IMDB movie reviews dataset
Stars: ✭ 40 (-32.2%)
Deeplearning Nlp Models: A small, interpretable codebase containing re-implementations of a few "deep" NLP models in PyTorch, with Colab notebooks to run on GPUs. Models: word2vec, CNNs, transformer, GPT.
Stars: ✭ 64 (+8.47%)
Russian news corpus: Corpus of stemmed, lemmatized (morphologically normalized) texts from Russian mass media
Stars: ✭ 76 (+28.81%)
Text Summarizer: Python framework for extractive text summarization
Stars: ✭ 96 (+62.71%)
Deep learning nlp: Keras, PyTorch, and NumPy implementations of deep learning architectures for NLP
Stars: ✭ 407 (+589.83%)
Glove As A Tensorflow Embedding Layer: Takes a pretrained GloVe model and uses it as a TensorFlow embedding weight layer inside the GPU, so only word indices travel over the GPU data-transfer bus, reducing transfer overhead.
Stars: ✭ 85 (+44.07%)
Repo 2016: R, Python, and Mathematica code for machine learning, deep learning, artificial intelligence, NLP, and geolocation
Stars: ✭ 103 (+74.58%)
Awesome Embedding Models: A curated list of awesome embedding-model tutorials, projects, and communities.
Stars: ✭ 1,486 (+2418.64%)
Entity2rec: Generates item recommendations using property-specific knowledge graph embeddings
Stars: ✭ 159 (+169.49%)
sent2vec: How to encode sentences in a high-dimensional vector space, a.k.a. sentence embedding.
Stars: ✭ 99 (+67.8%)
cade: Compass-aligned Distributional Embeddings, for aligning embeddings from different corpora
Stars: ✭ 29 (-50.85%)
reach: Load embeddings and featurize your sentences.
Stars: ✭ 17 (-71.19%)
Vectorhub: Vector Hub, a library for easy discovery and consumption of state-of-the-art models to turn data into vectors (text2vec, image2vec, video2vec, graph2vec, BERT, Inception, etc.)
Stars: ✭ 317 (+437.29%)
Debiaswe: Remove problematic gender bias from word embeddings.
Stars: ✭ 175 (+196.61%)
Shallowlearn: An experiment in re-implementing supervised learning models based on shallow neural-network approaches (e.g. fastText), with some additional exclusive features and a nice API. Written in Python and fully compatible with scikit-learn.
Stars: ✭ 196 (+232.2%)
Lmdb Embeddings: Fast word vectors with little memory usage in Python
Stars: ✭ 404 (+584.75%)
SWDM: SIGIR 2017: Embedding-based query expansion for the weighted sequential dependence retrieval model
Stars: ✭ 35 (-40.68%)
Philo2vec: An implementation of word2vec applied to the [Stanford Encyclopedia of Philosophy](http://plato.stanford.edu/)
Stars: ✭ 33 (-44.07%)
navec: Compact, high-quality word embeddings for the Russian language
Stars: ✭ 118 (+100%)
Text2vec: Fast vectorization, topic modeling, distances, and GloVe word embeddings in R.
Stars: ✭ 715 (+1111.86%)
Postgres Word2vec: Utilities for using word embeddings, such as word2vec vectors, in a PostgreSQL database
Stars: ✭ 96 (+62.71%)
Koan: A word2vec negative-sampling implementation with a correct CBOW update.
Stars: ✭ 232 (+293.22%)
word embedding: Sample code for training Word2Vec and FastText on a wiki corpus, plus their pretrained word embeddings.
Stars: ✭ 21 (-64.41%)
Gensim: Topic Modelling for Humans
Stars: ✭ 12,763 (+21532.2%)
Embedding As Service: One-stop solution to encode sentences into fixed-length vectors using various embedding techniques
Stars: ✭ 151 (+155.93%)
Germanwordembeddings: Toolkit to obtain and preprocess German corpora, train models using word2vec (gensim), and evaluate them with generated test sets
Stars: ✭ 189 (+220.34%)
Chameleon recsys: Source code of CHAMELEON, a deep learning meta-architecture for news recommender systems
Stars: ✭ 202 (+242.37%)
Word2VecAndTsne: Scripts demonstrating how to train a Word2Vec model and reduce its vector space
Stars: ✭ 45 (-23.73%)
cw2vec: Learning Chinese Word Embeddings with Stroke n-gram Information
Stars: ✭ 224 (+279.66%)
Sensegram: Making sense embeddings out of word embeddings using graph-based word sense induction
Stars: ✭ 209 (+254.24%)
word-embeddings-from-scratch: Creating word embeddings from scratch, visualizing them in TensorBoard, and using the trained embeddings in Keras.
Stars: ✭ 22 (-62.71%)
Scattertext: Beautiful visualizations of how language differs among document types.
Stars: ✭ 1,722 (+2818.64%)
DeepLearningReading: Deep learning and machine learning mini-projects. Current project: DeepMind Attentive Reader (rc-data)
Stars: ✭ 78 (+32.2%)
Entity Embedding: Reference implementation of the paper "Word Embeddings for Entity-annotated Texts"
Stars: ✭ 19 (-67.8%)
word2vec-on-wikipedia: A pipeline for training word embeddings with word2vec on a Wikipedia corpus.
Stars: ✭ 68 (+15.25%)
empythy: Automated NLP sentiment predictions; batteries included, or use your own data
Stars: ✭ 17 (-71.19%)
PersianNER: Named-entity recognition for the Persian language
Stars: ✭ 48 (-18.64%)
word-benchmarks: Benchmarks for intrinsic evaluation of word embeddings.
Stars: ✭ 45 (-23.73%)
codenames: Codenames AI using word vectors
Stars: ✭ 41 (-30.51%)
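Many of the projects above (e.g. game2vec, Koan, Gensim) implement word2vec's two training objectives, CBOW and skip-gram. As a minimal, dependency-free sketch of how the two objectives slice a context window differently, the following hypothetical helpers (names and window size are illustrative, not taken from any listed project) generate the training pairs each objective consumes:

```python
def skipgram_pairs(tokens, window=2):
    """Skip-gram: the target word predicts each context word separately,
    so every (target, context word) combination is one training pair."""
    pairs = []
    for i, target in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((target, tokens[j]))
    return pairs

def cbow_pairs(tokens, window=2):
    """CBOW: all context words jointly predict the target,
    so each position yields one (context list, target) pair."""
    return [
        ([tokens[j]
          for j in range(max(0, i - window), min(len(tokens), i + window + 1))
          if j != i],
         target)
        for i, target in enumerate(tokens)
    ]

sentence = "the cat sat on the mat".split()
print(skipgram_pairs(sentence, window=1)[:3])
# → [('the', 'cat'), ('cat', 'the'), ('cat', 'sat')]
```

A real implementation (as in word2vec or Gensim) would feed these pairs into a shallow network trained with negative sampling or hierarchical softmax; the sketch only shows how the raw training pairs are derived from a sentence.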