
huggingface / Torchmoji

License: MIT
😇 A PyTorch implementation of the DeepMoji model: a state-of-the-art deep learning model for analyzing sentiment, emotion, sarcasm, etc.

Programming Languages

Python
139,335 projects - #7 most used programming language

Projects that are alternatives to or similar to Torchmoji

Cs224n
Assignments for CS224n: Natural Language Processing with Deep Learning, Winter 2017
Stars: ✭ 656 (-17.48%)
Mutual labels:  natural-language-processing
Bert
TensorFlow code and pre-trained models for BERT
Stars: ✭ 29,971 (+3669.94%)
Mutual labels:  natural-language-processing
Ecco
Visualize and explore NLP language models. Ecco creates interactive visualizations directly in Jupyter notebooks explaining the behavior of Transformer-based language models (like GPT-2).
Stars: ✭ 723 (-9.06%)
Mutual labels:  natural-language-processing
Language Detection
A language detection library for PHP. Detects the language from a given text string.
Stars: ✭ 665 (-16.35%)
Mutual labels:  natural-language-processing
Bertsearch
Elasticsearch with BERT for advanced document search.
Stars: ✭ 684 (-13.96%)
Mutual labels:  natural-language-processing
Ai Series
📚 [.md & .ipynb] A series on artificial intelligence and deep learning, including mathematics fundamentals, Python practice, NLP applications, etc. 💫 Hands-on AI and deep learning: mathematical statistics | machine learning | deep learning | natural language processing | tooling practice with Scikit, TensorFlow & PyTorch | industry applications & course notes
Stars: ✭ 702 (-11.7%)
Mutual labels:  natural-language-processing
Awesome Relation Extraction
📖 A curated list of awesome resources dedicated to Relation Extraction, one of the most important tasks in Natural Language Processing (NLP).
Stars: ✭ 656 (-17.48%)
Mutual labels:  natural-language-processing
Coursera
Quizzes & assignments from Coursera courses
Stars: ✭ 774 (-2.64%)
Mutual labels:  natural-language-processing
Madewithml
Learn how to responsibly deliver value with ML.
Stars: ✭ 29,253 (+3579.62%)
Mutual labels:  natural-language-processing
Text2vec
Fast vectorization, topic modeling, distances and GloVe word embeddings in R.
Stars: ✭ 715 (-10.06%)
Mutual labels:  natural-language-processing
Pplm
Plug and Play Language Model implementation. Allows steering the topic and attributes of GPT-2 models.
Stars: ✭ 674 (-15.22%)
Mutual labels:  natural-language-processing
Ai Job Recommend
Job postings for artificial intelligence roles (covering machine learning, deep learning, computer vision, and natural language processing) at companies in China, including full-time, internship, and campus recruiting positions
Stars: ✭ 679 (-14.59%)
Mutual labels:  natural-language-processing
Machine Learning
A repository created to help machine learning beginners and those preparing for study groups. (This repository is intended to help anyone interested in studying machine learning.)
Stars: ✭ 705 (-11.32%)
Mutual labels:  natural-language-processing
Wikipedia2vec
A tool for learning vector representations of words and entities from Wikipedia
Stars: ✭ 655 (-17.61%)
Mutual labels:  natural-language-processing
Youtokentome
Unsupervised text tokenizer focused on computational efficiency
Stars: ✭ 728 (-8.43%)
Mutual labels:  natural-language-processing
Dl Nlp Readings
My Reading Lists of Deep Learning and Natural Language Processing
Stars: ✭ 656 (-17.48%)
Mutual labels:  natural-language-processing
Keras Attention
Visualizing RNNs using the attention mechanism
Stars: ✭ 697 (-12.33%)
Mutual labels:  natural-language-processing
Nlp In Practice
Starter code to solve real world text data problems. Includes: Gensim Word2Vec, phrase embeddings, Text Classification with Logistic Regression, word count with pyspark, simple text preprocessing, pre-trained embeddings and more.
Stars: ✭ 790 (-0.63%)
Mutual labels:  natural-language-processing
Jcseg
Jcseg is a lightweight NLP framework developed in Java. It provides CJK and English segmentation based on the MMSEG algorithm, along with keyword extraction, key sentence extraction, and summary extraction based on the TEXTRANK algorithm. Jcseg has a built-in HTTP server and search modules for the latest Lucene, Solr, and Elasticsearch.
Stars: ✭ 754 (-5.16%)
Mutual labels:  natural-language-processing
Machine learning examples
A collection of machine learning examples and tutorials.
Stars: ✭ 6,466 (+713.33%)
Mutual labels:  natural-language-processing

------ Update September 2018 ------

It's been a year since TorchMoji and DeepMoji were released. We're trying to understand how they're being used so that we can make improvements and design better models in the future.

You can help us achieve this by answering this 4-question Google Form. Thanks for your support!

😇 TorchMoji

Read our blog post about the implementation process here.

TorchMoji is a PyTorch implementation of the DeepMoji model developed by Bjarke Felbo, Alan Mislove, Anders Søgaard, Iyad Rahwan and Sune Lehmann.

The model was trained on 1.2 billion tweets with emojis to understand how language is used to express emotions. Through transfer learning, it can obtain state-of-the-art performance on many emotion-related text modeling tasks.

Try the online demo of DeepMoji at http://deepmoji.mit.edu! See the paper, blog post or FAQ for more details.

Overview

  • torchmoji/ contains all the underlying code needed to convert a dataset to the vocabulary and use the model.
  • examples/ contains short code snippets showing how to convert a dataset to the vocabulary, load up the model and run it on that dataset.
  • scripts/ contains code for processing and analysing datasets to reproduce results in the paper.
  • model/ contains the pretrained model and vocabulary.
  • data/ contains raw and processed datasets that we include in this repository for testing.
  • tests/ contains unit tests for the codebase.

To start out with, have a look inside the examples/ directory. See score_texts_emojis.py for how to use DeepMoji to extract emoji predictions, encode_texts.py for how to convert text into 2304-dimensional emotional feature vectors, or finetune_youtube_last.py for how to use the model for transfer learning on a new dataset.
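For a concrete feel of the pipeline, here is a minimal sketch along the lines of examples/score_texts_emojis.py: load the vocabulary, tokenize a few sentences, and print the indices of the five most likely emojis for each. The module paths and function names below reflect the repository layout but should be verified against the actual example script.

import json
import numpy as np
from torchmoji.sentence_tokenizer import SentenceTokenizer
from torchmoji.model_def import torchmoji_emojis
from torchmoji.global_variables import PRETRAINED_PATH, VOCAB_PATH

sentences = ["I love mom's cooking", "Today was terrible"]
maxlen = 30  # maximum number of tokens per sentence

# Load the vocabulary shipped with the pretrained model.
with open(VOCAB_PATH, 'r') as f:
    vocabulary = json.load(f)

st = SentenceTokenizer(vocabulary, maxlen)
tokenized, _, _ = st.tokenize_sentences(sentences)

# torchmoji_emojis loads the pretrained weights with the emoji softmax head.
model = torchmoji_emojis(PRETRAINED_PATH)
prob = model(tokenized)  # one distribution over the 64 emoji classes per sentence

for sentence, distribution in zip(sentences, prob):
    top5 = np.argsort(distribution)[::-1][:5]  # indices of the 5 most likely emojis
    print(sentence, top5)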

Please consider citing the paper of DeepMoji if you use the model or code (see below for citation).

Installation

We assume that you're using Python 2.7-3.5 with pip installed.

First, you need to install PyTorch (version 0.2+), currently via:

conda install pytorch -c pytorch

At the present stage the model can't make efficient use of CUDA. See details in the Hugging Face blog post.

Once PyTorch is installed, run the following in the root directory to install the remaining dependencies:

pip install -e .

This installs the package in editable mode together with its remaining dependencies.

Then, run the download script to download the pretrained TorchMoji weights (~85MB) from here and put them in the model/ directory:

python scripts/download_weights.py
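Once the weights are in model/, a quick sanity check, in the spirit of examples/encode_texts.py, is to encode a sentence into the 2304-dimensional feature vector mentioned above (again, treat the exact names as assumptions to verify against the example script):

import json
from torchmoji.sentence_tokenizer import SentenceTokenizer
from torchmoji.model_def import torchmoji_feature_encoding
from torchmoji.global_variables import PRETRAINED_PATH, VOCAB_PATH

with open(VOCAB_PATH, 'r') as f:
    vocabulary = json.load(f)

st = SentenceTokenizer(vocabulary, 30)
tokenized, _, _ = st.tokenize_sentences(['Today was awesome'])

# torchmoji_feature_encoding loads the model without the emoji softmax head,
# so its output is the penultimate-layer representation of each sentence.
model = torchmoji_feature_encoding(PRETRAINED_PATH)
encoding = model(tokenized)
print(encoding.shape)  # expected: (1, 2304)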

Testing

To run the tests, install nose. After installing, navigate to the tests/ directory and run:

cd tests
nosetests -v

By default, this will also run finetuning tests. These tests train the model for one epoch and then check the resulting accuracy, which may take several minutes to finish. If you'd prefer to exclude those, run the following instead:

cd tests
nosetests -v -a '!slow'
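Those finetuning tests exercise the same transfer-learning path as examples/finetune_youtube_last.py. A condensed, hedged sketch of that flow is below; the dataset path and helper names are drawn from the repository layout and should be checked against the example, and method='last' retrains only the final layer:

import json
from torchmoji.finetuning import load_benchmark, finetune
from torchmoji.model_def import torchmoji_transfer
from torchmoji.global_variables import PRETRAINED_PATH, VOCAB_PATH

DATASET_PATH = 'data/SS-Youtube/raw.pickle'  # binary sentiment benchmark from data/
nb_classes = 2

with open(VOCAB_PATH, 'r') as f:
    vocab = json.load(f)

# Splits the dataset into train/val/test and extends the vocabulary
# with words specific to this dataset.
data = load_benchmark(DATASET_PATH, vocab)

# Pretrained DeepMoji body with a fresh classification head for nb_classes outputs.
model = torchmoji_transfer(nb_classes, PRETRAINED_PATH)

# method='last' freezes everything except the new output layer.
model, acc = finetune(model, data['texts'], data['labels'],
                      nb_classes, data['batch_size'], method='last')
print('Accuracy: {}'.format(acc))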

Disclaimer

This code has been tested to work with Python 2.7 and 3.5 on Ubuntu 16.04 and macOS Sierra machines. It has not been optimized for efficiency, but should be fast enough for most purposes. We make no guarantee that the code is bug-free - use it at your own risk!

Contributions

We welcome pull requests if you feel like something could be improved. You can also greatly help us by telling us how you felt when writing your most recent tweets. Just click here to contribute.

License

This code and the pretrained model are licensed under the MIT license.

Benchmark datasets

The benchmark datasets are uploaded to this repository for convenience purposes only. They were not released by us and we do not claim any rights to them. Use the datasets at your own risk and make sure you comply with the licenses under which they were released. If you use any of the benchmark datasets, please consider citing the original authors.

Citation

@inproceedings{felbo2017,
  title={Using millions of emoji occurrences to learn any-domain representations for detecting sentiment, emotion and sarcasm},
  author={Felbo, Bjarke and Mislove, Alan and S{\o}gaard, Anders and Rahwan, Iyad and Lehmann, Sune},
  booktitle={Conference on Empirical Methods in Natural Language Processing (EMNLP)},
  year={2017}
}