lda2vec: Mixing Dirichlet Topic Models and Word Embeddings to Make lda2vec, from the paper https://arxiv.org/abs/1605.02019
Stars: ✭ 27 (-70.33%)
word2vec-tsne: Google News and Leo Tolstoy: Visualizing Word2Vec Word Embeddings using t-SNE.
Stars: ✭ 59 (-35.16%)
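Projects like word2vec-tsne reduce high-dimensional vectors to 2-D for plotting; the structure such plots reveal is nearest-neighbor geometry under cosine similarity. A minimal, dependency-free sketch of that neighbor lookup, using made-up 3-dimensional toy vectors rather than real Word2Vec output:

```python
import math

# Made-up 3-D toy vectors; real Word2Vec embeddings have 100-300 dimensions.
embeddings = {
    "king":  [0.90, 0.80, 0.10],
    "queen": [0.85, 0.75, 0.20],
    "apple": [0.10, 0.20, 0.90],
    "pear":  [0.15, 0.10, 0.85],
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def nearest(word, k=2):
    """The k words closest to `word` under cosine similarity."""
    sims = [(other, cosine(embeddings[word], vec))
            for other, vec in embeddings.items() if other != word]
    return sorted(sims, key=lambda pair: -pair[1])[:k]

print(nearest("king", k=1))  # "queen" ranks first
```

t-SNE then maps these similarity relationships into two dimensions so that nearby points on screen correspond to high-cosine neighbors like these.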
dna2vec: Consistent vector representations of variable-length k-mers
Stars: ✭ 117 (+28.57%)
Magnitude: A fast, efficient universal vector embedding utility package.
Stars: ✭ 1,394 (+1431.87%)
Embeddingsviz: Visualize word embeddings of a vocabulary in TensorBoard, including their neighbors.
Stars: ✭ 40 (-56.04%)
Embedding As Service: One-stop solution to encode sentences into fixed-length vectors using various embedding techniques.
Stars: ✭ 151 (+65.93%)
Glove As A Tensorflow Embedding Layer: Takes a pretrained GloVe model and uses it as a TensorFlow embedding weight layer **inside the GPU**, so only the word indices need to travel over the GPU data transfer bus, reducing data transfer overhead.
Stars: ✭ 85 (-6.59%)
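The trick described above is an embedding lookup: the large weight matrix stays resident on the device, and each batch transfers only small integer indices. A NumPy stand-in for the idea (the actual repo wires this up as a TensorFlow variable; the sizes and names here are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, dim = 10_000, 300          # illustrative sizes, not from the repo

# Stand-in for a pretrained GloVe matrix, loaded once (in the repo: onto the GPU).
glove_weights = rng.standard_normal((vocab_size, dim)).astype(np.float32)

# Per batch, only small integer token ids cross the host-device boundary...
token_ids = np.array([5, 42, 7], dtype=np.int32)

# ...and the lookup happens where the weight matrix already lives.
batch_vectors = glove_weights[token_ids]        # shape (3, 300)

print(f"{token_ids.nbytes} bytes of indices vs {batch_vectors.nbytes} bytes of vectors")
```

The byte counts make the saving concrete: three 4-byte indices instead of three dense 300-float vectors per batch.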
Chameleon recsys: Source code of CHAMELEON - A Deep Learning Meta-Architecture for News Recommender Systems
Stars: ✭ 202 (+121.98%)
Sensegram: Making sense embeddings out of word embeddings using graph-based word sense induction.
Stars: ✭ 209 (+129.67%)
word-embeddings-from-scratch: Creating word embeddings from scratch and visualizing them in TensorBoard; using the trained embeddings in Keras.
Stars: ✭ 22 (-75.82%)
Text Summarizer: Python framework for extractive text summarization.
Stars: ✭ 96 (+5.49%)
Awesome Embedding Models: A curated list of awesome embedding model tutorials, projects, and communities.
Stars: ✭ 1,486 (+1532.97%)
Philo2vec: An implementation of word2vec applied to the [Stanford Encyclopedia of Philosophy](http://plato.stanford.edu/).
Stars: ✭ 33 (-63.74%)
Germanwordembeddings: Toolkit to obtain and preprocess German corpora, train models using word2vec (gensim), and evaluate them with generated test sets.
Stars: ✭ 189 (+107.69%)
Gensim: Topic Modelling for Humans
Stars: ✭ 12,763 (+13925.27%)
datastories-semeval2017-task6: Deep learning model presented in "DataStories at SemEval-2017 Task 6: Siamese LSTM with Attention for Humorous Text Comparison".
Stars: ✭ 20 (-78.02%)
word-benchmarks: Benchmarks for intrinsic word embedding evaluation.
Stars: ✭ 45 (-50.55%)
SentimentAnalysis: (BOW, TF-IDF, Word2Vec, BERT) word embeddings + (SVM, Naive Bayes, Decision Tree, Random Forest) base classifiers + pre-trained BERT from TensorFlow Hub + a 1-D CNN and bidirectional LSTM, on the IMDB movie reviews dataset.
Stars: ✭ 40 (-56.04%)
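Of the featurizations this entry lists, TF-IDF is simple enough to sketch from scratch: a term's frequency within a document, scaled down by how many documents contain it. A toy, unsmoothed variant with a made-up three-document corpus (real libraries such as scikit-learn add smoothing and normalization):

```python
import math
from collections import Counter

docs = [
    "the movie was great great fun",
    "the movie was dull",
    "a dull plot and dull acting",
]
tokenized = [d.split() for d in docs]

def tf_idf(term, doc_tokens, corpus):
    """Term frequency in one document, scaled by inverse document frequency."""
    tf = Counter(doc_tokens)[term] / len(doc_tokens)
    df = sum(1 for doc in corpus if term in doc)       # documents containing term
    idf = math.log(len(corpus) / df)                   # unsmoothed idf
    return tf * idf

# "great" is frequent in doc 0 and appears nowhere else, so it outscores
# "the", which occurs in two of the three documents and is down-weighted.
print(round(tf_idf("great", tokenized[0], tokenized), 3))
print(round(tf_idf("the", tokenized[0], tokenized), 3))
```

The resulting per-term scores form the feature vectors fed to base classifiers like the SVM or Naive Bayes models mentioned above.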
SentimentAnalysis: Sentiment analysis with a deep Bi-LSTM + attention model.
Stars: ✭ 32 (-64.84%)
wikidata-corpus: Train word2vec on a Wikidata corpus for word embedding tasks.
Stars: ✭ 109 (+19.78%)
codenames: Codenames AI using word vectors.
Stars: ✭ 41 (-54.95%)
SWDM: Embedding-based query expansion for the weighted sequential dependence retrieval model (SIGIR 2017).
Stars: ✭ 35 (-61.54%)
Postgres Word2vec: Utilities for using word embeddings, such as word2vec vectors, in a Postgres database.
Stars: ✭ 96 (+5.49%)
Vec4ir: Word Embeddings for Information Retrieval
Stars: ✭ 188 (+106.59%)
Datastories Semeval2017 Task4: Deep learning model presented in "DataStories at SemEval-2017 Task 4: Deep LSTM with Attention for Message-level and Topic-based Sentiment Analysis".
Stars: ✭ 184 (+102.2%)
Scattertext: Beautiful visualizations of how language differs among document types.
Stars: ✭ 1,722 (+1792.31%)
cade: Compass-aligned Distributional Embeddings; aligns embeddings from different corpora.
Stars: ✭ 29 (-68.13%)
Hash Embeddings: PyTorch implementation of Hash Embeddings (NIPS 2017); a submission to the NIPS Implementation Challenge.
Stars: ✭ 126 (+38.46%)
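The idea behind hash embeddings (Svenstrup et al., NIPS 2017) is that each token is mapped by k hash functions into a shared pool of component vectors, and its embedding is a weighted sum of the selected rows, so no explicit vocabulary is needed. A NumPy sketch with fixed, uniform importance weights (the paper learns them per token) and CRC32 standing in for the hash functions; all sizes are toy values:

```python
import zlib
import numpy as np

rng = np.random.default_rng(1)
num_buckets, dim, k = 1000, 8, 2       # toy sizes; real pools are far larger

shared_pool = rng.standard_normal((num_buckets, dim))
# The paper learns per-token importance weights; here they are fixed and uniform.
importance = np.full(k, 1.0 / k)

def hash_embedding(token):
    """k hash functions select k rows of the shared pool; return their weighted sum."""
    bucket_ids = [zlib.crc32(f"{seed}:{token}".encode()) % num_buckets
                  for seed in range(k)]
    return importance @ shared_pool[bucket_ids]        # shape (dim,)

vec = hash_embedding("embedding")
print(vec.shape)   # (8,)
```

Because the lookup is purely hash-based, unseen tokens still get a deterministic vector, at the cost of occasional bucket collisions.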
Deeplearning Nlp Models: A small, interpretable codebase re-implementing a few "deep" NLP models in PyTorch, with Colab notebooks to run on GPUs. Models: word2vec, CNNs, Transformer, GPT.
Stars: ✭ 64 (-29.67%)
Debiaswe: Remove problematic gender bias from word embeddings.
Stars: ✭ 175 (+92.31%)
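The core "neutralize" step in this line of debiasing work removes a vector's component along an identified bias direction: v' = v - (v · g) g for a unit vector g. A small NumPy sketch with a made-up 3-D "gender" axis (real bias directions are estimated from word pairs like he/she):

```python
import numpy as np

def neutralize(v, bias_direction):
    """Remove the component of v along a bias direction (hard-debias 'neutralize')."""
    g = bias_direction / np.linalg.norm(bias_direction)
    return v - (v @ g) * g

# Made-up example: pretend axis 0 is the learned gender direction.
g = np.array([1.0, 0.0, 0.0])
doctor = np.array([0.3, 0.7, 0.2])

debiased = neutralize(doctor, g)
print(debiased)        # component along g is now zero
print(debiased @ g)
```

After this step the word is exactly orthogonal to the bias direction, so it is equidistant from both poles of the bias pair.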
Shallowlearn: An experiment in re-implementing supervised learning models based on shallow neural network approaches (e.g. fastText), with some additional exclusive features and a nice API. Written in Python and fully compatible with scikit-learn.
Stars: ✭ 196 (+115.38%)
Entity2rec: Generates item recommendations using property-specific knowledge graph embeddings.
Stars: ✭ 159 (+74.73%)
Koan: A word2vec negative-sampling implementation with a correct CBOW update.
Stars: ✭ 232 (+154.95%)
cw2vec: Learning Chinese Word Embeddings with Stroke n-gram Information
Stars: ✭ 224 (+146.15%)
two-stream-cnn: A two-stream convolutional neural network for learning arbitrary similarity functions over two sets of training data.
Stars: ✭ 24 (-73.63%)
Fastrtext: R wrapper for fastText.
Stars: ✭ 103 (+13.19%)
navec: Compact, high-quality word embeddings for the Russian language.
Stars: ✭ 118 (+29.67%)
go2vec: Read and use word2vec vectors in Go.
Stars: ✭ 44 (-51.65%)
Wego: Word Embeddings (e.g. Word2Vec) in Go!
Stars: ✭ 336 (+269.23%)
NTUA-slp-nlp: 💻 Speech and Natural Language Processing (SLP & NLP) lab assignments for ECE NTUA.
Stars: ✭ 19 (-79.12%)
Deep learning nlp: Keras, PyTorch, and NumPy implementations of deep learning architectures for NLP.
Stars: ✭ 407 (+347.25%)
word embedding: Sample code for training Word2Vec and FastText on a wiki corpus, along with their pretrained word embeddings.
Stars: ✭ 21 (-76.92%)
word2vec-on-wikipedia: A pipeline for training word embeddings using word2vec on a Wikipedia corpus.
Stars: ✭ 68 (-25.27%)
game2vec: TensorFlow implementation of word2vec applied to the https://www.kaggle.com/tamber/steam-video-games dataset, using both CBOW and skip-gram.
Stars: ✭ 62 (-31.87%)
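CBOW and skip-gram differ only in how training examples are framed: skip-gram predicts each context word from the center word, while CBOW predicts the center from its surrounding context. A small, dependency-free sketch of how both kinds of examples are generated from a token stream (the sample "games played" sequence is made up):

```python
def skipgram_pairs(tokens, window=2):
    """(center, context) pairs: skip-gram predicts each context word from the center."""
    pairs = []
    for i, center in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

def cbow_examples(tokens, window=2):
    """(context, center) examples: CBOW predicts the center from its context."""
    examples = []
    for i, center in enumerate(tokens):
        context = [tokens[j]
                   for j in range(max(0, i - window), min(len(tokens), i + window + 1))
                   if j != i]
        examples.append((context, center))
    return examples

toks = "user played dota then played skyrim".split()
print(skipgram_pairs(toks, window=1)[:3])
```

In an item2vec-style setup like game2vec, the "tokens" are games in a user's library rather than words in a sentence, but the example generation is identical.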
Text-Analysis: Explaining text analysis tools in Python, including preprocessing, skip-gram (word2vec), and topic modelling.
Stars: ✭ 48 (-47.25%)
Vectorhub: Vector Hub - library for easy discovery and consumption of state-of-the-art models to turn data into vectors (text2vec, image2vec, video2vec, graph2vec, BERT, Inception, etc.).
Stars: ✭ 317 (+248.35%)
PersianNER: Named-entity recognition for the Persian language.
Stars: ✭ 48 (-47.25%)
reach: Load embeddings and featurize your sentences.
Stars: ✭ 17 (-81.32%)
Lmdb Embeddings: Fast word vectors with low memory usage in Python.
Stars: ✭ 404 (+343.96%)
Text2vec: Fast vectorization, topic modeling, distances, and GloVe word embeddings in R.
Stars: ✭ 715 (+685.71%)
Graph 2d cnn: Code and data for the paper 'Classifying Graphs as Images with Convolutional Neural Networks' (new title: 'Graph Classification with 2D Convolutional Neural Networks').
Stars: ✭ 67 (-26.37%)
Tadw: An implementation of "Network Representation Learning with Rich Text Information" (IJCAI '15).
Stars: ✭ 43 (-52.75%)