lda2vec: Mixing Dirichlet Topic Models and Word Embeddings to Make lda2vec, from the paper https://arxiv.org/abs/1605.02019
Stars: ✭ 27 (-76.92%)
word2vec-tsne: Google News and Leo Tolstoy: visualizing Word2Vec word embeddings using t-SNE.
Stars: ✭ 59 (-49.57%)
Magnitude: A fast, efficient universal vector embedding utility package.
Stars: ✭ 1,394 (+1091.45%)
Dict2vec: A framework to learn word embeddings using lexical dictionaries.
Stars: ✭ 91 (-22.22%)
Sensegram: Making sense embeddings out of word embeddings using graph-based word sense induction.
Stars: ✭ 209 (+78.63%)
two-stream-cnn: A two-stream convolutional neural network for learning arbitrary similarity functions over two sets of training data.
Stars: ✭ 24 (-79.49%)
PersianNER: Named-entity recognition in the Persian language.
Stars: ✭ 48 (-58.97%)
SentimentAnalysis: Word embeddings (BOW, TF-IDF, Word2Vec, BERT) + base classifiers (SVM, Naive Bayes, decision tree, random forest) + pre-trained BERT from TensorFlow Hub + 1-D CNN and bidirectional LSTM on the IMDB movie reviews dataset.
Stars: ✭ 40 (-65.81%)
SentimentAnalysis: Sentiment analysis with a deep Bi-LSTM + attention model.
Stars: ✭ 32 (-72.65%)
NTUA-slp-nlp: 💻 Speech and Natural Language Processing (SLP & NLP) lab assignments for ECE NTUA.
Stars: ✭ 19 (-83.76%)
codenames: Codenames AI using word vectors.
Stars: ✭ 41 (-64.96%)
Germanwordembeddings: Toolkit to obtain and preprocess German corpora, train models using word2vec (gensim), and evaluate them with generated test sets.
Stars: ✭ 189 (+61.54%)
Shallowlearn: An experiment in re-implementing supervised learning models based on shallow neural network approaches (e.g. fastText), with some additional exclusive features and a nice API. Written in Python and fully compatible with scikit-learn.
Stars: ✭ 196 (+67.52%)
word-embeddings-from-scratch: Creating word embeddings from scratch, visualizing them on TensorBoard, and using the trained embeddings in Keras.
Stars: ✭ 22 (-81.2%)
Cw2vec: Learning Chinese word embeddings with stroke n-gram information.
Stars: ✭ 224 (+91.45%)
Fastrtext: R wrapper for fastText.
Stars: ✭ 103 (-11.97%)
datastories-semeval2017-task6: Deep-learning model presented in "DataStories at SemEval-2017 Task 6: Siamese LSTM with Attention for Humorous Text Comparison".
Stars: ✭ 20 (-82.91%)
wikidata-corpus: Train word2vec on a Wikidata corpus for word-embedding tasks.
Stars: ✭ 109 (-6.84%)
word2vec-on-wikipedia: A pipeline for training word embeddings with word2vec on the Wikipedia corpus.
Stars: ✭ 68 (-41.88%)
game2vec: TensorFlow implementation of word2vec applied to the https://www.kaggle.com/tamber/steam-video-games dataset, using both CBOW and skip-gram.
Stars: ✭ 62 (-47.01%)
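CBOW and skip-gram differ only in how training pairs are built from a token sequence: skip-gram predicts each context word from the center word, while CBOW predicts the center word from its whole context window. A minimal plain-Python sketch of the pair generation (illustrative function names, not taken from the game2vec code):

```python
def skipgram_pairs(tokens, window=2):
    """Skip-gram: each center word is paired with every word in its window."""
    pairs = []
    for i, center in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

def cbow_pairs(tokens, window=2):
    """CBOW: the full context window jointly predicts the center word."""
    pairs = []
    for i, center in enumerate(tokens):
        context = [tokens[j]
                   for j in range(max(0, i - window),
                                  min(len(tokens), i + window + 1))
                   if j != i]
        if context:
            pairs.append((context, center))
    return pairs

# In game2vec's setting the "tokens" would be the games a user owns.
tokens = ["user", "plays", "dota", "and", "skyrim"]
print(skipgram_pairs(tokens, window=1))
print(cbow_pairs(tokens, window=1))
```

Everything downstream (negative sampling, the embedding matrices) is shared between the two modes; only this pairing step changes.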
cade: Compass-aligned Distributional Embeddings; aligns embeddings from different corpora.
Stars: ✭ 29 (-75.21%)
Hub: A library for transfer learning by reusing parts of TensorFlow models.
Stars: ✭ 3,007 (+2470.09%)
Lmdb Embeddings: Fast word vectors with little memory usage in Python.
Stars: ✭ 404 (+245.3%)
Cleora: Cleora AI is a general-purpose model for efficient, scalable learning of stable and inductive entity embeddings for heterogeneous relational data.
Stars: ✭ 303 (+158.97%)
Deep learning nlp: Keras, PyTorch, and NumPy implementations of deep learning architectures for NLP.
Stars: ✭ 407 (+247.86%)
Philo2vec: An implementation of word2vec applied to the [Stanford Encyclopedia of Philosophy](http://plato.stanford.edu/).
Stars: ✭ 33 (-71.79%)
Debiaswe: Remove problematic gender bias from word embeddings.
Stars: ✭ 175 (+49.57%)
Entity2rec: entity2rec generates item recommendations using property-specific knowledge graph embeddings.
Stars: ✭ 159 (+35.9%)
Chameleon recsys: Source code of CHAMELEON, a deep learning meta-architecture for news recommender systems.
Stars: ✭ 202 (+72.65%)
Gensim: Topic modelling for humans.
Stars: ✭ 12,763 (+10808.55%)
Koan: A word2vec negative-sampling implementation with a correct CBOW update.
Stars: ✭ 232 (+98.29%)
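The "correct CBOW update" refers to how the gradient is propagated back to the context vectors: since the CBOW hidden vector is the *mean* of the context vectors, the backpropagated gradient should be scaled by 1/|context|, a detail some word2vec ports drop. A rough NumPy sketch of one negative-sampling CBOW step, assuming that scaling is the fix in question (an illustration, not koan's actual code):

```python
import numpy as np

rng = np.random.default_rng(0)
V, D = 10, 4                         # toy vocabulary size and embedding dim
W_in = rng.normal(0, 0.1, (V, D))    # input (context) vectors
W_out = rng.normal(0, 0.1, (V, D))   # output vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cbow_step(context_ids, center_id, neg_ids, lr=0.05):
    """One CBOW negative-sampling update on the toy parameters."""
    h = W_in[context_ids].mean(axis=0)           # hidden layer = context mean
    grad_h = np.zeros(D)
    for wid, label in [(center_id, 1.0)] + [(n, 0.0) for n in neg_ids]:
        g = sigmoid(W_out[wid] @ h) - label      # logistic-loss gradient
        grad_h += g * W_out[wid]
        W_out[wid] -= lr * g * h
    # Because h is a mean, each context vector receives the gradient
    # divided by the context size.
    W_in[context_ids] -= lr * grad_h / len(context_ids)

cbow_step([1, 2, 4], center_id=3, neg_ids=[7, 8])
```

Omitting the final division still trains, but effectively multiplies the context-side learning rate by the window size, which is why the distinction matters in practice.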
Embedding As Service: One-stop solution to encode sentences into fixed-length vectors using various embedding techniques.
Stars: ✭ 151 (+29.06%)
navec: Compact, high-quality word embeddings for the Russian language.
Stars: ✭ 118 (+0.85%)
word-benchmarks: Benchmarks for intrinsic word-embedding evaluation.
Stars: ✭ 45 (-61.54%)
Awesome Embedding Models: A curated list of awesome embedding-model tutorials, projects, and communities.
Stars: ✭ 1,486 (+1170.09%)
Keras-Application-Zoo: Reference implementations of popular DL models missing from keras-applications & keras-contrib.
Stars: ✭ 31 (-73.5%)
Deeplearning Nlp Models: A small, interpretable codebase containing re-implementations of a few "deep" NLP models in PyTorch, with Colab notebooks to run them on GPUs. Models: word2vec, CNNs, transformer, GPT.
Stars: ✭ 64 (-45.3%)
Text-Analysis: Explaining textual-analysis tools in Python, including preprocessing, skip-gram (word2vec), and topic modelling.
Stars: ✭ 48 (-58.97%)
go2vec: Read and use word2vec vectors in Go.
Stars: ✭ 44 (-62.39%)
reach: Load embeddings and featurize your sentences.
Stars: ✭ 17 (-85.47%)
Wego: Word embeddings (e.g. Word2Vec) in Go!
Stars: ✭ 336 (+187.18%)
Vectorhub: Vector Hub, a library for easy discovery and consumption of state-of-the-art models to turn data into vectors (text2vec, image2vec, video2vec, graph2vec, BERT, Inception, etc.).
Stars: ✭ 317 (+170.94%)
Text2vec: Fast vectorization, topic modeling, distances, and GloVe word embeddings in R.
Stars: ✭ 715 (+511.11%)
SWDM: Embedding-based query expansion for the weighted sequential dependence retrieval model (SIGIR 2017).
Stars: ✭ 35 (-70.09%)
Text Summarizer: Python framework for extractive text summarization.
Stars: ✭ 96 (-17.95%)
Postgres Word2vec: Utilities for using word embeddings such as word2vec vectors in a Postgres database.
Stars: ✭ 96 (-17.95%)
Embeddingsviz: Visualize the word embeddings of a vocabulary in TensorBoard, including their neighbors.
Stars: ✭ 40 (-65.81%)
Vec4ir: Word embeddings for information retrieval.
Stars: ✭ 188 (+60.68%)
Scattertext: Beautiful visualizations of how language differs among document types.
Stars: ✭ 1,722 (+1371.79%)
word embedding: Sample code for training Word2Vec and FastText on a wiki corpus, along with their pretrained word embeddings.
Stars: ✭ 21 (-82.05%)
Russian news corpus: Russian mass-media stemmed-text corpus (corpus of lemmatized, i.e. morphologically normalized, texts from Russian media).
Stars: ✭ 76 (-35.04%)
Glove As A Tensorflow Embedding Layer: Takes a pretrained GloVe model and uses it as a TensorFlow embedding weight layer **inside the GPU**, so only word indices are sent over the GPU data-transfer bus, reducing transfer overhead.
Stars: ✭ 85 (-27.35%)
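The saving can be sketched with NumPy standing in for the GPU-resident matrix: when the embedding table already lives on the device, only small integer indices cross the bus, and the lookup (what `tf.nn.embedding_lookup` does in the repo's TensorFlow setting) happens on-device. Sizes below are illustrative, not taken from the repo:

```python
import numpy as np

# Random matrix standing in for a pretrained GloVe table (50k words, 300-d).
vocab_size, dim = 50_000, 300
embedding = np.random.rand(vocab_size, dim).astype(np.float32)

# A batch of token indices: this is all that would cross the host-to-GPU
# bus when the embedding matrix is resident in GPU memory.
batch_ids = np.array([[3, 17, 256], [9, 0, 42]], dtype=np.int32)

# On-device lookup: plain integer indexing into the table.
vectors = embedding[batch_ids]
print(vectors.shape)       # (2, 3, 300)

ids_bytes = batch_ids.nbytes   # 6 indices * 4 bytes = 24 bytes transferred
vec_bytes = vectors.nbytes     # 6 * 300 * 4 = 7200 bytes kept on-device
```

For this tiny batch the transferred payload shrinks from 7200 bytes of float vectors to 24 bytes of indices; the ratio grows linearly with the embedding dimension.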