Spark Nlp - State of the Art Natural Language Processing
Stars: ✭ 2,518 (+10847.83%)
Nlp Journey - Documents, papers, and code related to Natural Language Processing, including topic models, word embeddings, named entity recognition, text classification, text generation, text similarity, machine translation, etc. All code is implemented in TensorFlow 2.0.
Stars: ✭ 1,290 (+5508.7%)
Embedding As Service - One-stop solution to encode sentences into fixed-length vectors using various embedding techniques
Stars: ✭ 151 (+556.52%)
Embedding - A summary of embedding model code and study notes
Stars: ✭ 25 (+8.7%)
Neural sp - End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+1673.91%)
Lmdb Embeddings - Fast word vectors with low memory usage in Python
Stars: ✭ 404 (+1656.52%)
Magnitude - A fast, efficient universal vector embedding utility package.
Stars: ✭ 1,394 (+5960.87%)
CLUE pytorch - PyTorch version of the CLUE baselines
Stars: ✭ 72 (+213.04%)
Nlp research - TensorFlow-based NLP deep learning project supporting four major tasks: text classification, sentence matching, sequence labeling, and text generation
Stars: ✭ 141 (+513.04%)
bert in a flask - A Dockerized Flask API serving ALBERT and BERT predictions using TensorFlow 2.0.
Stars: ✭ 32 (+39.13%)
TabFormer - Code & data for "Tabular Transformers for Modeling Multivariate Time Series" (ICASSP 2021)
Stars: ✭ 209 (+808.7%)
semantic-document-relations - Implementation, trained models, and result data for the paper "Pairwise Multi-Class Document Classification for Semantic Relations between Wikipedia Articles"
Stars: ✭ 21 (-8.7%)
ADL2019 - Applied Deep Learning (Spring 2019) @ NTU
Stars: ✭ 20 (-13.04%)
Text classification - All kinds of text classification models, and more, with deep learning
Stars: ✭ 7,179 (+31113.04%)
Sockeye - Sequence-to-sequence framework with a focus on neural machine translation, based on Apache MXNet
Stars: ✭ 990 (+4204.35%)
FasterTransformer - Transformer-related optimizations, including BERT and GPT
Stars: ✭ 1,571 (+6730.43%)
Albert zh - A Lite BERT for Self-Supervised Learning of Language Representations; large-scale Chinese pretrained ALBERT models
Stars: ✭ 3,500 (+15117.39%)
Transformers - 🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Stars: ✭ 55,742 (+242256.52%)
keras-bert-ner - Keras solution for the Chinese NER task using BiLSTM-CRF/BiGRU-CRF/IDCNN-CRF models with a pretrained language model, supporting BERT/RoBERTa/ALBERT
Stars: ✭ 7 (-69.57%)
Natural Language Processing - Programming assignments and lectures for Stanford's CS 224N: Natural Language Processing with Deep Learning
Stars: ✭ 377 (+1539.13%)
Ngram2vec - Four word embedding models implemented in Python, supporting arbitrary context features
Stars: ✭ 703 (+2956.52%)
h-transformer-1d - Implementation of H-Transformer-1D, hierarchical attention for sequence learning
Stars: ✭ 121 (+426.09%)
sticker2 - Further developed as SyntaxDot: https://github.com/tensordot/syntaxdot
Stars: ✭ 14 (-39.13%)
Pycadl - Python package with source code from the course "Creative Applications of Deep Learning w/ TensorFlow"
Stars: ✭ 356 (+1447.83%)
Nlp chinese corpus - Large-scale Chinese corpus for NLP
Stars: ✭ 6,656 (+28839.13%)
Servenet - Service classification based on service descriptions
Stars: ✭ 21 (-8.7%)
Wego - Word embeddings (e.g. Word2Vec) in Go!
Stars: ✭ 336 (+1360.87%)
Text2vec - Fast vectorization, topic modeling, distances, and GloVe word embeddings in R.
Stars: ✭ 715 (+3008.7%)
german-sentiment - A dataset and model for German sentiment classification.
Stars: ✭ 37 (+60.87%)
Glove As A Tensorflow Embedding Layer - Takes a pretrained GloVe model and uses it as a TensorFlow embedding weight layer inside the GPU, so only word indices need to be sent over the GPU data transfer bus, reducing data transfer overhead.
Stars: ✭ 85 (+269.57%)
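The trick that entry describes can be sketched in a few lines of Keras: load the pretrained matrix as the frozen weight of an `Embedding` layer, then feed only integer word indices. The tiny vocabulary and random vectors below are made-up stand-ins for a real GloVe file, not that repo's actual code:

```python
import numpy as np
import tensorflow as tf

# Hypothetical stand-ins for a real GloVe vocabulary and its pretrained vectors.
vocab = {"<pad>": 0, "hello": 1, "world": 2}
embedding_dim = 4
pretrained = np.random.rand(len(vocab), embedding_dim).astype("float32")

# The pretrained matrix becomes the layer's weight and stays resident on the
# device, so only integer indices cross the host-to-device transfer bus.
embedding_layer = tf.keras.layers.Embedding(
    input_dim=len(vocab),
    output_dim=embedding_dim,
    embeddings_initializer=tf.keras.initializers.Constant(pretrained),
    trainable=False,  # keep the GloVe vectors fixed
)

indices = tf.constant([[1, 2]])     # "hello world" encoded as word indices
vectors = embedding_layer(indices)  # lookup runs where the weights live
```

With a real 400k-word GloVe file the same pattern applies; only the step that builds `vocab` and `pretrained` from the text file changes.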
Vectorsinsearch - Dice.com repo to accompany the 'Vectors in Search' talk by Simon Hughes from the Activate 2018 search conference, and the 'Searching with Vectors' talk from Haystack 2019 (US). Builds upon my conceptual search and semantic search work from 2015.
Stars: ✭ 71 (+208.7%)
Deeplearning Nlp Models - A small, interpretable codebase containing re-implementations of a few "deep" NLP models in PyTorch, with Colab notebooks to run on GPUs. Models: word2vec, CNNs, transformer, gpt.
Stars: ✭ 64 (+178.26%)
Textclf - TextClf: a text classification framework based on PyTorch/Scikit-learn, including logistic regression, SVM, TextCNN, TextRNN, TextRCNN, DRNN, DPCNN, BERT, and other models; data processing, model training, and testing can all be done through simple configuration.
Stars: ✭ 105 (+356.52%)
Fasttext.js - FastText for Node.js
Stars: ✭ 127 (+452.17%)
wordfish-python - Extract relationships between standardized terms from a corpus of interest with deep learning 🐟
Stars: ✭ 19 (-17.39%)
Repo 2017 - Python code for machine learning, NLP, deep learning, and reinforcement learning with Keras and Theano
Stars: ✭ 1,123 (+4782.61%)
Nlp - An open-source introductory NLP book, by 兜哥 (Dou Ge)
Stars: ✭ 1,677 (+7191.3%)
Fasttext4j - Implementation of Facebook's FastText in Java
Stars: ✭ 148 (+543.48%)
CrabNet - Predict materials properties using only composition information!
Stars: ✭ 57 (+147.83%)
Shallowlearn - An experiment in re-implementing supervised learning models based on shallow neural network approaches (e.g. fastText), with some additional exclusive features and a nice API. Written in Python and fully compatible with Scikit-learn.
Stars: ✭ 196 (+752.17%)
Wordvectors - Pre-trained word vectors for 30+ languages
Stars: ✭ 2,043 (+8782.61%)
Cw2vec - cw2vec: Learning Chinese Word Embeddings with Stroke n-gram Information
Stars: ✭ 224 (+873.91%)
AiSpace - Better practices for deep learning model development and deployment for TensorFlow 2.0
Stars: ✭ 28 (+21.74%)
KitanaQA - Adversarial training and data augmentation for neural question-answering models
Stars: ✭ 58 (+152.17%)
vietnamese-roberta - A Robustly Optimized BERT Pretraining Approach for Vietnamese
Stars: ✭ 22 (-4.35%)
SequenceToSequence - A seq2seq-with-attention dialogue/MT model implemented in TensorFlow.
Stars: ✭ 11 (-52.17%)
Gensim - Topic Modelling for Humans
Stars: ✭ 12,763 (+55391.3%)
Transformers-RL - An easy PyTorch implementation of "Stabilizing Transformers for Reinforcement Learning"
Stars: ✭ 107 (+365.22%)