GermanWordEmbeddings - Toolkit to obtain and preprocess German corpora, train models using word2vec (gensim), and evaluate them with generated test sets
Stars: ✭ 189 (+8%)
Deep Learning NLP - Keras, PyTorch, and NumPy implementations of deep learning architectures for NLP
Stars: ✭ 407 (+132.57%)
GloVe As A TensorFlow Embedding Layer - Taking a pretrained GloVe model and using it as a TensorFlow embedding weight layer **inside the GPU**. Only the word indices need to be sent over the GPU data transfer bus, reducing transfer overhead.
Stars: ✭ 85 (-51.43%)
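The saving described above comes from doing the embedding lookup on the device: only integer word indices cross the transfer bus, never the full float vectors. The lookup itself is just a row-gather, sketched here in plain NumPy (the toy matrix and dimensions are illustrative, not the project's actual code):

```python
import numpy as np

# Stand-in for a pretrained GloVe matrix: vocabulary of 5 words,
# 3-dimensional vectors (real GloVe vectors are 50-300 dimensions).
embedding_matrix = np.arange(15, dtype=np.float32).reshape(5, 3)

# Instead of transferring 3 floats per word, we transfer one int per word.
word_indices = np.array([0, 3, 3, 1])

# On-device lookup: a row-gather recovers the dense vectors.
vectors = embedding_matrix[word_indices]  # shape (4, 3)
```

In TensorFlow the same gather is what `tf.nn.embedding_lookup` performs on a weight variable kept in GPU memory.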
lda2vec - Mixing Dirichlet topic models and word embeddings to make lda2vec, from the paper https://arxiv.org/abs/1605.02019
Stars: ✭ 27 (-84.57%)
PyTorch Sentiment Analysis - Tutorials on getting started with PyTorch and TorchText for sentiment analysis.
Stars: ✭ 3,209 (+1733.71%)
BioSentVec - BioWordVec & BioSentVec: pre-trained embeddings for biomedical words and sentences
Stars: ✭ 308 (+76%)
NLP Notebooks - A collection of notebooks for Natural Language Processing from NLP Town
Stars: ✭ 513 (+193.14%)
Text2vec - Fast vectorization, topic modeling, distances, and GloVe word embeddings in R.
Stars: ✭ 715 (+308.57%)
Question Generation - Generating multiple-choice questions from text using machine learning.
Stars: ✭ 227 (+29.71%)
word-benchmarks - Benchmarks for intrinsic word embedding evaluation.
Stars: ✭ 45 (-74.29%)
word embedding - Sample code for training Word2Vec and FastText on a wiki corpus, plus their pretrained word embeddings.
Stars: ✭ 21 (-88%)
SWDM - SIGIR 2017: Embedding-based query expansion for the weighted sequential dependence retrieval model
Stars: ✭ 35 (-80%)
wikidata-corpus - Train word2vec on a Wikidata corpus for word embedding tasks
Stars: ✭ 109 (-37.71%)
Word2vec - Training Chinese word vectors with word2vec. Word2vec was created by a team of researchers led by Tomas Mikolov at Google.
Stars: ✭ 48 (-72.57%)
Deep Learning NLP Models - A small, interpretable codebase re-implementing a few "deep" NLP models in PyTorch, with Colab notebooks to run on GPUs. Models: word2vec, CNNs, transformers, GPT.
Stars: ✭ 64 (-63.43%)
Experiments - Some research experiments
Stars: ✭ 95 (-45.71%)
dna2vec - Consistent vector representations of variable-length k-mers
Stars: ✭ 117 (-33.14%)
Practical 1 - Oxford Deep NLP 2017 course, Practical 1: word2vec
Stars: ✭ 220 (+25.71%)
Scattertext - Beautiful visualizations of how language differs among document types.
Stars: ✭ 1,722 (+884%)
AraVec - A pre-trained distributed word representation (word embedding) open-source project that aims to provide the Arabic NLP research community with free, powerful word embedding models.
Stars: ✭ 239 (+36.57%)
Koan - A word2vec negative-sampling implementation with the correct CBOW update.
Stars: ✭ 232 (+32.57%)
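The "correct CBOW update" refers to the fact that CBOW's forward pass averages the context vectors, so the gradient flowing back to each context vector should also be divided by the context size, a step many word2vec implementations skip. A toy NumPy sketch of one negative-sampling step with that correction applied (all names, sizes, and the learning rate are illustrative assumptions, not Koan's actual API):

```python
import numpy as np

rng = np.random.default_rng(0)
dim, lr = 8, 0.05
ctx_vecs = rng.normal(scale=0.1, size=(4, dim))  # input vectors of 4 context words
out_vecs = rng.normal(scale=0.1, size=(3, dim))  # output vectors: 1 target + 2 negatives
labels = np.array([1.0, 0.0, 0.0])               # target word first, then negatives

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Forward: the CBOW hidden state is the *mean* of the context vectors.
h = ctx_vecs.mean(axis=0)
scores = sigmoid(out_vecs @ h)

# Backward: standard gradients for the output vectors...
g = scores - labels          # shape (3,)
grad_h = g @ out_vecs        # gradient w.r.t. the hidden state
out_vecs -= lr * np.outer(g, h)

# ...and the corrected context update: since h averaged over
# len(ctx_vecs) words, each context vector receives grad_h / len(ctx_vecs).
ctx_vecs -= lr * grad_h / len(ctx_vecs)
```

Without the final division by `len(ctx_vecs)`, context vectors get updated as if each had contributed the full hidden state rather than a fraction of it.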
word2vec-on-wikipedia - A pipeline for training word embeddings with word2vec on a Wikipedia corpus.
Stars: ✭ 68 (-61.14%)
two-stream-cnn - A two-stream convolutional neural network for learning arbitrary similarity functions over two sets of training data
Stars: ✭ 24 (-86.29%)
word2vec-tsne - Google News and Leo Tolstoy: visualizing word2vec word embeddings using t-SNE.
Stars: ✭ 59 (-66.29%)
Chameleon RecSys - Source code of CHAMELEON, a deep learning meta-architecture for news recommender systems
Stars: ✭ 202 (+15.43%)
codenames - Codenames AI using word vectors
Stars: ✭ 41 (-76.57%)
NTUA-slp-nlp - 💻 Speech and Natural Language Processing (SLP & NLP) lab assignments for ECE NTUA
Stars: ✭ 19 (-89.14%)
Text-Analysis - Explaining textual analysis tools in Python, including preprocessing, skip-gram (word2vec), and topic modelling.
Stars: ✭ 48 (-72.57%)
Deep Math Machine Learning.ai - A blog about machine learning, deep learning algorithms, and the math behind them, with machine learning algorithms written from scratch.
Stars: ✭ 173 (-1.14%)
Wego - Word embeddings (e.g. Word2Vec) in Go!
Stars: ✭ 336 (+92%)
ShallowLearn - An experiment in re-implementing supervised learning models based on shallow neural network approaches (e.g. fastText), with some additional exclusive features and a nice API. Written in Python and fully compatible with scikit-learn.
Stars: ✭ 196 (+12%)
ServeNet - Service classification based on service descriptions
Stars: ✭ 21 (-88%)
Syntree2vec - An algorithm to embed syntactic hierarchy into word embeddings
Stars: ✭ 9 (-94.86%)
Word2vec Russian Novels - Inspired by word2vec-pride-vis: replacing words in the text of celebrated Russian novels with the closest word2vec model words. By Boris Orekhov.
Stars: ✭ 39 (-77.71%)
Average Word2vec - 🔤 Calculate average word embeddings (word2vec) from documents for transfer learning
Stars: ✭ 52 (-70.29%)
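Averaging word vectors is a common way to get a single document-level embedding for transfer learning: take the mean of the vectors of the words that appear in the document. A minimal sketch, where the tiny hand-made "model" is an assumption standing in for real word2vec vectors:

```python
import numpy as np

# Stand-in for a trained word2vec model: word -> vector.
toy_model = {
    "cats": np.array([1.0, 0.0]),
    "dogs": np.array([0.0, 1.0]),
    "purr": np.array([1.0, 1.0]),
}

def average_embedding(doc, model):
    """Mean of the vectors of in-vocabulary words; zeros if none match."""
    vecs = [model[w] for w in doc.split() if w in model]
    if not vecs:
        dim = len(next(iter(model.values())))
        return np.zeros(dim)
    return np.mean(vecs, axis=0)

doc_vec = average_embedding("cats purr loudly", toy_model)  # mean of [1,0] and [1,1]
```

Out-of-vocabulary words ("loudly" here) are simply skipped, which is the usual convention when averaging pretrained embeddings.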
Gensim - Topic Modelling for Humans
Stars: ✭ 12,763 (+7193.14%)
Dict2vec - A framework to learn word embeddings using lexical dictionaries.
Stars: ✭ 91 (-48%)
Concise IPython Notebooks For Deep Learning - IPython notebooks for solving problems like classification, segmentation, and generation using the latest deep learning algorithms on publicly available text and image datasets.
Stars: ✭ 23 (-86.86%)
Awesome Embedding Models - A curated list of awesome embedding model tutorials, projects, and communities.
Stars: ✭ 1,486 (+749.14%)
Magnitude - A fast, efficient universal vector embedding utility package.
Stars: ✭ 1,394 (+696.57%)
Text Summarizer - Python framework for extractive text summarization
Stars: ✭ 96 (-45.14%)
NLP In Practice - Starter code for solving real-world text data problems. Includes: gensim Word2Vec, phrase embeddings, text classification with logistic regression, word count with PySpark, simple text preprocessing, pre-trained embeddings, and more.
Stars: ✭ 790 (+351.43%)
Postgres Word2vec - Utilities for using word embeddings such as word2vec vectors in a Postgres database
Stars: ✭ 96 (-45.14%)
ELMo Tutorial - A short tutorial on ELMo training (pre-trained models, training on new data, incremental training)
Stars: ✭ 145 (-17.14%)
Log Anomaly Detector - Machine learning to detect abnormal event logs
Stars: ✭ 169 (-3.43%)
PyTorch SAC - PyTorch implementation of Soft Actor-Critic (SAC)
Stars: ✭ 174 (-0.57%)
ITWMM - In-the-Wild 3D Morphable Models
Stars: ✭ 175 (+0%)