Top 1096 natural-language-processing open source projects

Deep Learning Interview Book
A deep learning interview handbook covering mathematics, machine learning, deep learning, computer vision, natural language processing, SLAM, and related areas.
Prose
📖 A Golang library for text processing, including tokenization, part-of-speech tagging, and named-entity extraction.
Book Nlp
Natural language processing pipeline for book-length documents
Good Papers
I do my best to keep up with cutting-edge work in Machine Learning/Deep Learning and Natural Language Processing. These are my notes on some good papers.
Low Resource Languages
Resources for conservation, development, and documentation of low resource (human) languages.
Awesome Grounding
awesome grounding: A curated list of research papers in visual grounding
Soulvercore
A powerful Swift framework for evaluating mathematical expressions
Dont Stop Pretraining
Code associated with the Don't Stop Pretraining ACL 2020 paper
Bertviz
Tool for visualizing attention in the Transformer model (BERT, GPT-2, ALBERT, XLNet, RoBERTa, CTRL, etc.)
Tensorflow qrnn
QRNN implementation for TensorFlow
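The core of a QRNN is its element-wise "pooling" recurrence: gates are precomputed by convolutions over the input, so only a cheap scan remains. A scalar sketch of f-pooling (h_t = f_t · h_{t−1} + (1 − f_t) · z_t, following Bradbury et al.), purely illustrative and not this repo's TensorFlow code:

```python
def qrnn_f_pooling(z_seq, f_seq):
    """QRNN f-pooling: h_t = f_t * h_{t-1} + (1 - f_t) * z_t.

    z_seq: candidate activations (tanh of a convolution, assumed precomputed);
    f_seq: forget-gate activations (sigmoid of a convolution, precomputed).
    Scalar per step for clarity; the real model applies this element-wise
    over feature vectors.
    """
    h = 0.0
    hidden = []
    for z, f in zip(z_seq, f_seq):
        h = f * h + (1.0 - f) * z
        hidden.append(h)
    return hidden

# Toy example: a strong forget gate (f close to 1) preserves early state.
z = [1.0, -1.0, 0.5]
f = [0.0, 0.9, 0.9]
states = qrnn_f_pooling(z, f)
```

Because the gates do not depend on previous hidden states, the convolutions parallelize across time; only this lightweight scan is sequential.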
Cmrc2018
A Span-Extraction Dataset for Chinese Machine Reading Comprehension (CMRC 2018)
Malaya
Natural Language Toolkit for bahasa Malaysia, https://malaya.readthedocs.io/
Pykakasi
NLP: converts Japanese kana-kanji sentences into kana or romaji using a simple algorithm.
Chazutsu
A tool that makes NLP datasets ready to use.
Pytorch Bert Crf Ner
A Korean named-entity recognizer built with KoBERT and CRF (BERT+CRF based Named Entity Recognition model for Korean)
Mitie
MITIE: library and tools for information extraction
Spacy Services
💫 REST microservices for various spaCy-related tasks
Wordgcn
ACL 2019: Incorporating Syntactic and Semantic Information in Word Embeddings using Graph Convolutional Networks
Pytorch Transformers Classification
Based on the Pytorch-Transformers library by HuggingFace; intended as a starting point for applying Transformer models to text classification tasks. Contains code to easily train BERT, XLNet, RoBERTa, and XLM models.
Catalyst
🚀 Catalyst is a C# Natural Language Processing library built for speed. Inspired by spaCy's design, it brings pre-trained models, out-of-the-box support for training word and document embeddings, and flexible entity recognition models.
Text summarization with tensorflow
Implementation of a seq2seq model for summarization of textual data. Demonstrated on Amazon reviews, GitHub issues, and news articles.
Reside
EMNLP 2018: RESIDE: Improving Distantly-Supervised Neural Relation Extraction using Side Information
Bert4doc Classification
Code and source for the paper "How to Fine-Tune BERT for Text Classification?"
Practical 1
Oxford Deep NLP 2017 course - Practical 1: word2vec
Awesome Nlprojects
List of Natural Language Processing (NLP) projects that make a geek smile because they exist.
Awesome Financial Nlp
Research on natural language processing for the financial domain.
Lit
The Language Interpretability Tool: interactively analyze NLP models for model understanding in an extensible, framework-agnostic interface.
Visdial
[CVPR 2017] Torch code for Visual Dialog
Spacy Lookup
Named Entity Recognition based on dictionaries
Neat Vision
Neat (Neural Attention) Vision is a framework-agnostic visualization tool for the attention mechanisms of deep-learning models on Natural Language Processing (NLP) tasks.
Nlp Roadmap
A roadmap (mind map) and keyword list for students interested in learning NLP.
Shifterator
Interpretable data visualizations for understanding how texts differ at the word level
Kagnet
Knowledge-Aware Graph Networks for Commonsense Reasoning (EMNLP-IJCNLP 19)
Conllu
A CoNLL-U parser that takes a CoNLL-U formatted string and turns it into a nested python dictionary.
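To illustrate the kind of nested structure such a parser produces, here is a minimal hand-rolled parse of CoNLL-U's tab-separated columns into token dictionaries. This is a sketch of the format only, not the `conllu` library's actual API, which additionally handles metadata, multiword tokens, and dependency trees:

```python
# The ten CoNLL-U columns, as named in the CoNLL-U specification.
FIELDS = ["id", "form", "lemma", "upos", "xpos", "feats",
          "head", "deprel", "deps", "misc"]

def parse_conllu(text):
    """Turn one CoNLL-U sentence block into a list of token dicts."""
    tokens = []
    for line in text.strip().splitlines():
        if not line or line.startswith("#"):  # skip comment/metadata lines
            continue
        cols = line.split("\t")
        tokens.append(dict(zip(FIELDS, cols)))
    return tokens

sample = ("1\tThe\tthe\tDET\tDT\t_\t2\tdet\t_\t_\n"
          "2\tdog\tdog\tNOUN\tNN\t_\t0\troot\t_\t_")
tokens = parse_conllu(sample)
for tok in tokens:
    print(tok["id"], tok["form"], tok["upos"], tok["deprel"])
```

Each token becomes a dictionary keyed by column name, which is what makes downstream access (`tok["upos"]`, `tok["head"]`) convenient.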
Hardware Aware Transformers
[ACL 2020] HAT: Hardware-Aware Transformers for Efficient Natural Language Processing
Minerva
Meandering In Networks of Entities to Reach Verisimilar Answers
Pytorch Beam Search Decoding
PyTorch implementation of beam search decoding for seq2seq models
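The idea behind beam search decoding can be sketched without any framework: at each step, every surviving partial sequence is extended by each candidate token, and only the `beam_width` highest-scoring extensions are kept. A toy pure-Python version (the `step_fn` "model" and all names here are hypothetical, not this repo's PyTorch code):

```python
import math

def beam_search(step_fn, start_token, beam_width=3, max_len=5, eos=None):
    """Generic beam search: step_fn(seq) -> {token: log_prob}.

    Keeps the `beam_width` partial sequences with the highest
    cumulative log-probability at every step.
    """
    beams = [([start_token], 0.0)]  # (sequence, cumulative log-prob)
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            if eos is not None and seq[-1] == eos:
                candidates.append((seq, score))  # finished beams carry over
                continue
            for tok, logp in step_fn(seq).items():
                candidates.append((seq + [tok], score + logp))
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    return beams

# Toy "model": slightly prefers repeating the last token.
def toy_step(seq):
    last = seq[-1]
    return {last: math.log(0.6), (last + 1) % 3: math.log(0.4)}

best_seq, best_score = beam_search(toy_step, start_token=0,
                                   beam_width=2, max_len=3)[0]
```

With `beam_width=1` this degenerates to greedy decoding; widening the beam trades compute for a better chance of finding a higher-probability sequence.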
Aind Nlp
Coding exercises for the Natural Language Processing concentration, part of Udacity's AIND program.
Attention Mechanisms
Implementations for a family of attention mechanisms, suitable for all kinds of natural language processing tasks and compatible with TensorFlow 2.0 and Keras.
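The common building block behind such attention families is scaled dot-product attention: scores between a query and each key are softmax-normalized and used to take a weighted average of the values. A dependency-free single-query sketch (illustrative only, not this repo's TensorFlow/Keras code):

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]  # subtract max for numerical stability
    s = sum(exps)
    return [e / s for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def scaled_dot_product_attention(query, keys, values):
    """attention(q, K, V) = softmax(q·K^T / sqrt(d)) · V for one query vector."""
    d = len(query)
    scores = [dot(query, k) / math.sqrt(d) for k in keys]
    weights = softmax(scores)
    context = [sum(w * v[i] for w, v in zip(weights, values))
               for i in range(len(values[0]))]
    return context, weights

q = [1.0, 0.0]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[10.0, 0.0], [0.0, 10.0]]
context, weights = scaled_dot_product_attention(q, K, V)
# q aligns with the first key, so the weights (and `context`) lean toward V[0]
```

Variants in such libraries (additive/Bahdanau, multiplicative/Luong, self- and multi-head attention) mostly differ in how the scores are computed, not in this normalize-and-average core.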