
microsoft / Msr Nlp Projects

Licence: other
This is a list of open-source projects from the Microsoft Research NLP Group.

Projects that are alternatives of or similar to Msr Nlp Projects

Nndial
NNDial is an open source toolkit for building end-to-end trainable task-oriented dialogue models. It is released by Tsung-Hsien (Shawn) Wen from Cambridge Dialogue Systems Group under Apache License 2.0.
Stars: ✭ 332 (+260.87%)
Mutual labels:  dialogue, natural-language-processing
Tod Bert
Pre-Trained Models for ToD-BERT
Stars: ✭ 143 (+55.43%)
Mutual labels:  dialogue, natural-language-processing
Trade Dst
Source code for transferable dialogue state generator (TRADE, Wu et al., 2019). https://arxiv.org/abs/1905.08743
Stars: ✭ 287 (+211.96%)
Mutual labels:  dialogue, natural-language-processing
Nlp Progress
Repository to track the progress in Natural Language Processing (NLP), including the datasets and the current state-of-the-art for the most common NLP tasks.
Stars: ✭ 19,518 (+21115.22%)
Mutual labels:  dialogue, natural-language-processing
Multiwoz
Source code for end-to-end dialogue model from the MultiWOZ paper (Budzianowski et al. 2018, EMNLP)
Stars: ✭ 384 (+317.39%)
Mutual labels:  dialogue, natural-language-processing
Nlg Eval
Evaluation code for various unsupervised automated metrics for Natural Language Generation.
Stars: ✭ 822 (+793.48%)
Mutual labels:  dialogue, natural-language-processing
Rnnlg
RNNLG is an open source benchmark toolkit for Natural Language Generation (NLG) in spoken dialogue system application domains. It is released by Tsung-Hsien (Shawn) Wen from Cambridge Dialogue Systems Group under Apache License 2.0.
Stars: ✭ 487 (+429.35%)
Mutual labels:  dialogue, natural-language-processing
Dialogue Understanding
This repository contains PyTorch implementation for the baseline models from the paper Utterance-level Dialogue Understanding: An Empirical Study
Stars: ✭ 77 (-16.3%)
Mutual labels:  dialogue, natural-language-processing
Neural kbqa
Knowledge Base Question Answering using memory networks
Stars: ✭ 87 (-5.43%)
Mutual labels:  natural-language-processing
Meprop
meProp: Sparsified Back Propagation for Accelerated Deep Learning (ICML 2017)
Stars: ✭ 90 (-2.17%)
Mutual labels:  natural-language-processing
Semantic Texual Similarity Toolkits
Semantic Textual Similarity (STS) measures the degree of equivalence in the underlying semantics of paired snippets of text.
Stars: ✭ 87 (-5.43%)
Mutual labels:  natural-language-processing
Spark Nlp Models
Models and Pipelines for the Spark NLP library
Stars: ✭ 88 (-4.35%)
Mutual labels:  natural-language-processing
Uer Py
Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
Stars: ✭ 1,295 (+1307.61%)
Mutual labels:  natural-language-processing
Spf
Cornell Semantic Parsing Framework
Stars: ✭ 87 (-5.43%)
Mutual labels:  natural-language-processing
Deep Learning Drizzle
Drench yourself in Deep Learning, Reinforcement Learning, Machine Learning, Computer Vision, and NLP by learning from these exciting lectures!!
Stars: ✭ 9,717 (+10461.96%)
Mutual labels:  natural-language-processing
Ml
A high-level machine learning and deep learning library for the PHP language.
Stars: ✭ 1,270 (+1280.43%)
Mutual labels:  natural-language-processing
Turkish Bert Nlp Pipeline
BERT-based NLP pipeline for Turkish: NER, sentiment analysis, question answering, etc.
Stars: ✭ 85 (-7.61%)
Mutual labels:  natural-language-processing
Geotext
Geotext extracts country and city mentions from text
Stars: ✭ 91 (-1.09%)
Mutual labels:  natural-language-processing
Applied Ml
📚 Papers & tech blogs by companies sharing their work on data science & machine learning in production.
Stars: ✭ 17,824 (+19273.91%)
Mutual labels:  natural-language-processing
Bible text gcn
Pytorch implementation of "Graph Convolutional Networks for Text Classification"
Stars: ✭ 90 (-2.17%)
Mutual labels:  natural-language-processing

Microsoft Research NLP Projects

This is a list of open-source projects that the Microsoft Research NLP Group has been involved in, listed in chronological order.

Datasets

| Title | Description | Related projects |
| --- | --- | --- |
| Dialogue Feedback Dataset | 100M+ dialogues with corresponding human feedback, used to learn which responses receive better feedback | DialogRPT |
| Grounded Dialogue Dataset | Dialogues grounded in external knowledge, e.g., Wikipedia pages | DSTC7, CMR |
| Reddit Dialogue Dataset | 147M conversation-like exchanges extracted from Reddit comment chains spanning 2005 through 2017 | DialoGPT |
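The Reddit extraction can be illustrated with a small sketch: each root-to-leaf path through a comment tree yields one conversation-like exchange. The dict schema and function name here are hypothetical simplifications, not the actual DialoGPT data pipeline.

```python
# Hypothetical sketch: turn a Reddit-style comment tree into
# conversation-like exchanges, one per root-to-leaf path.

def dialogue_chains(comment, path=()):
    """Yield each root-to-leaf path through a comment tree as a tuple of texts.

    `comment` is a dict with keys "text" and "replies" (a list of child
    comments in the same format) -- an assumed, simplified schema.
    """
    path = path + (comment["text"],)
    if not comment["replies"]:
        yield path
    else:
        for child in comment["replies"]:
            yield from dialogue_chains(child, path)

# Usage: a tiny tree with one root comment and two reply branches.
tree = {
    "text": "What is your favorite NLP paper?",
    "replies": [
        {"text": "DialoGPT, easily.", "replies": []},
        {
            "text": "Attention Is All You Need.",
            "replies": [{"text": "Agreed, a classic.", "replies": []}],
        },
    ],
}
chains = list(dialogue_chains(tree))  # two chains, one per leaf
```

Applied at Reddit scale, this kind of traversal is what turns threaded comments into the multi-turn exchanges the dataset description refers to.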

Papers

| Title | Links | Notes | Tags |
| --- | --- | --- | --- |
| Dialogue Response Ranking Training with Large-Scale Human Feedback Data | code/model/data, demo | EMNLP 2020 | dialog, ranking |
| POINTER: Constrained Text Generation via Insertion-based Generative Pre-training | code, demo | EMNLP 2020 | generation |
| Optimus: Organizing Sentences via Pre-trained Modeling of a Latent Space | code, demo | EMNLP 2020 | generation |
| RAT-SQL: Relation-Aware Schema Encoding and Linking for Text-to-SQL Parsers | code | ACL 2020 | parsing, sql |
| A Recipe for Creating Multimodal Aligned Datasets for Sequential Tasks | code | ACL 2020 | multimodal |
| INSET: Sentence Infilling with INter-SEntential Transformer | code/demo | ACL 2020 | generation |
| DialoGPT: Large-Scale Generative Pre-training for Conversational Response Generation | code/model/data | ACL 2020 | dialog, generation |
| MixingBoard: a Knowledgeable Stylized Integrated Text Generation Platform | code | ACL 2020 | dialog, generation, framework, knowledge, style |
| Vision-based Navigation with Language-based Assistance via Imitation Learning with Indirect Intervention | code/data | CVPR 2019 | navigation, imitation learning |
| Conversing by Reading: Contentful Neural Conversation with On-demand Machine Reading | code/model/data | ACL 2019 | knowledge, dialog, generation |
| Microsoft Icecaps: An Open-Source Toolkit for Conversation Modeling | code | ACL 2019 | dialog, generation, framework |
| Structuring Latent Spaces for Stylized Response Generation | code/data | EMNLP 2019 | style, dialog, generation |
| Jointly Optimizing Diversity and Relevance in Neural Response Generation | code/data | NAACL 2019 | dialog, generation |
| Towards Content Transfer through Grounded Text Generation | code/data | NAACL 2019 | generation, knowledge |

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.

When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information, see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.

Legal Notices

Microsoft and any contributors grant you a license to the Microsoft documentation and other content in this repository under the Creative Commons Attribution 4.0 International Public License (see the LICENSE file), and grant you a license to any code in the repository under the MIT License (see the LICENSE-CODE file).

Microsoft, Windows, Microsoft Azure and/or other Microsoft products and services referenced in the documentation may be either trademarks or registered trademarks of Microsoft in the United States and/or other countries. The licenses for this project do not grant you rights to use any Microsoft names, logos, or trademarks. Microsoft's general trademark guidelines can be found at http://go.microsoft.com/fwlink/?LinkID=254653.

Privacy information can be found at https://privacy.microsoft.com/en-us/

Microsoft and any contributors reserve all other rights, whether under their respective copyrights, patents, or trademarks, whether by implication, estoppel or otherwise.

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].