polaroidz / Multitask_sentiment_analysis

Multitask Deep Learning for Sentiment Analysis using a Character-Level Language Model and Bi-LSTMs for POS Tagging, Chunking and Unsupervised Dependency Parsing. Inspired by this great paper: https://arxiv.org/abs/1611.01587

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Multitask sentiment analysis

Pytorch Sentiment Analysis
Tutorials on getting started with PyTorch and TorchText for sentiment analysis.
Stars: ✭ 3,209 (+3350.54%)
Mutual labels:  natural-language-processing, lstm, recurrent-neural-networks
Sangita
A Natural Language Toolkit for Indian Languages
Stars: ✭ 43 (-53.76%)
Mutual labels:  natural-language-processing, lstm, recurrent-neural-networks
Ner Lstm
Named Entity Recognition using multilayered bidirectional LSTM
Stars: ✭ 532 (+472.04%)
Mutual labels:  natural-language-processing, lstm, recurrent-neural-networks
Pytorch Pos Tagging
A tutorial on how to implement models for part-of-speech tagging using PyTorch and TorchText.
Stars: ✭ 96 (+3.23%)
Mutual labels:  natural-language-processing, lstm, recurrent-neural-networks
Chicksexer
A Python package for gender classification.
Stars: ✭ 64 (-31.18%)
Mutual labels:  natural-language-processing, lstm, recurrent-neural-networks
Language Modelling
Generating Text using Deep Learning in Python - LSTM, RNN, Keras
Stars: ✭ 38 (-59.14%)
Mutual labels:  natural-language-processing, lstm
Char Rnn Keras
TensorFlow implementation of multi-layer recurrent neural networks for training and sampling from texts
Stars: ✭ 40 (-56.99%)
Mutual labels:  lstm, recurrent-neural-networks
Deepseqslam
The Official Deep Learning Framework for Route-based Place Recognition
Stars: ✭ 49 (-47.31%)
Mutual labels:  lstm, recurrent-neural-networks
Sentiment Analysis Nltk Ml Lstm
Sentiment Analysis on the First Republic Party debate in 2016 based on Python, NLTK and ML.
Stars: ✭ 61 (-34.41%)
Mutual labels:  lstm, recurrent-neural-networks
Spago
Self-contained Machine Learning and Natural Language Processing library in Go
Stars: ✭ 854 (+818.28%)
Mutual labels:  natural-language-processing, lstm
Tensorflow Lstm Sin
TensorFlow 1.3 experiment with LSTM (and GRU) RNNs for sine prediction
Stars: ✭ 52 (-44.09%)
Mutual labels:  lstm, recurrent-neural-networks
Gdax Orderbook Ml
Application of machine learning to the Coinbase (GDAX) orderbook
Stars: ✭ 60 (-35.48%)
Mutual labels:  lstm, recurrent-neural-networks
Reading comprehension tf
Machine Reading Comprehension in Tensorflow
Stars: ✭ 37 (-60.22%)
Mutual labels:  natural-language-processing, recurrent-neural-networks
Lstmvis
Visualization Toolbox for Long Short Term Memory networks (LSTMs)
Stars: ✭ 959 (+931.18%)
Mutual labels:  lstm, recurrent-neural-networks
Named Entity Recognition
Named entity recognition with a recurrent neural network (RNN) in TensorFlow
Stars: ✭ 20 (-78.49%)
Mutual labels:  natural-language-processing, recurrent-neural-networks
Image Captioning
Image Captioning: Implementing the Neural Image Caption Generator with python
Stars: ✭ 52 (-44.09%)
Mutual labels:  lstm, recurrent-neural-networks
Lstm Ctc Ocr
using rnn (lstm or gru) and ctc to convert line image into text, based on torch7 and warp-ctc
Stars: ✭ 70 (-24.73%)
Mutual labels:  lstm, recurrent-neural-networks
Bitcoin Price Prediction Using Lstm
Bitcoin price Prediction ( Time Series ) using LSTM Recurrent neural network
Stars: ✭ 67 (-27.96%)
Mutual labels:  lstm, recurrent-neural-networks
Dialogue Understanding
This repository contains PyTorch implementation for the baseline models from the paper Utterance-level Dialogue Understanding: An Empirical Study
Stars: ✭ 77 (-17.2%)
Mutual labels:  natural-language-processing, lstm
Deep Learning Time Series
List of papers, code and experiments using deep learning for time series forecasting
Stars: ✭ 796 (+755.91%)
Mutual labels:  lstm, recurrent-neural-networks

Deep Multi-Task Sentiment Analysis

Why Multi-Task?

Recent years have shown amazing results in supervised learning thanks to the advent of deep neural networks and practical gradient descent implementations. However, most of these models were limited to single-task learning, where the model specializes in only one objective during training.

Because most real-world problems are composed of various sub-tasks, it makes sense to reflect this structure in the model itself.

However, methods for effectively training a single model on several tasks at once are not yet well consolidated, making this an active area of research and the main motivation of the paper that inspired this post.

The paper A Joint Many-Task Model: Growing a Neural Network for Multiple NLP Tasks is a groundbreaking proposal for unifying and jointly training many common Natural Language Processing tasks in a single deep learning model.
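
To make the joint-training idea concrete, here is a minimal PyTorch sketch (not the code from this repository) of how several task-specific heads can share a single Bi-LSTM encoder. The tag-set sizes, hidden dimensions, and the `train_step` helper are illustrative assumptions.

```python
# Hypothetical sketch of joint multi-task training: each task has its own
# head and loss, but all tasks share the same encoder parameters.
# Names and sizes below are illustrative, not taken from this repository.
import torch
import torch.nn as nn

shared_encoder = nn.LSTM(input_size=128, hidden_size=256,
                         bidirectional=True, batch_first=True)
task_heads = nn.ModuleDict({
    "pos":       nn.Linear(512, 45),   # assumed POS tag set size
    "chunk":     nn.Linear(512, 23),   # assumed chunk label set size
    "sentiment": nn.Linear(512, 2),    # binary sentiment
})
params = list(shared_encoder.parameters()) + list(task_heads.parameters())
optimizer = torch.optim.Adam(params, lr=1e-3)
criterion = nn.CrossEntropyLoss()

def train_step(task, embeddings, labels):
    """One gradient step on a single task; the encoder is shared by all tasks."""
    outputs, _ = shared_encoder(embeddings)            # (batch, seq, 512)
    if task == "sentiment":
        logits = task_heads[task](outputs[:, -1])      # sentence-level label
        loss = criterion(logits, labels)
    else:
        logits = task_heads[task](outputs)             # token-level labels
        loss = criterion(logits.reshape(-1, logits.size(-1)),
                         labels.reshape(-1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this setup, training alternates between batches of different tasks, so every task's gradient updates the shared encoder.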

Architecture

Architecture Diagram
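
Below is a rough PyTorch sketch of the cascaded layout the JMT paper describes, where higher-level tasks read the hidden states of the layers below. The layer sizes, the plain word-embedding input (the paper additionally uses character n-gram and label embeddings), and the mean-pooled sentiment head are assumptions rather than this repository's actual implementation.

```python
# Rough sketch of the cascaded joint many-task architecture: one Bi-LSTM
# layer per task, with higher layers receiving the word embeddings
# concatenated with the hidden states of the layer below. All names and
# dimensions are illustrative assumptions.
import torch
import torch.nn as nn

class JointManyTaskSketch(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hidden=100,
                 n_pos=45, n_chunk=23, n_sent=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Lowest layer: POS tagging over word embeddings.
        self.pos_lstm = nn.LSTM(emb_dim, hidden, bidirectional=True,
                                batch_first=True)
        self.pos_out = nn.Linear(2 * hidden, n_pos)
        # Chunking layer sees the embeddings plus the POS layer's states.
        self.chunk_lstm = nn.LSTM(emb_dim + 2 * hidden, hidden,
                                  bidirectional=True, batch_first=True)
        self.chunk_out = nn.Linear(2 * hidden, n_chunk)
        # Sentence-level sentiment head pools the top layer's states.
        self.sent_out = nn.Linear(2 * hidden, n_sent)

    def forward(self, tokens):
        emb = self.embed(tokens)                          # (B, T, emb_dim)
        pos_h, _ = self.pos_lstm(emb)                     # (B, T, 2*hidden)
        pos_logits = self.pos_out(pos_h)
        chunk_h, _ = self.chunk_lstm(torch.cat([emb, pos_h], dim=-1))
        chunk_logits = self.chunk_out(chunk_h)
        sent_logits = self.sent_out(chunk_h.mean(dim=1))  # sentence pooling
        return pos_logits, chunk_logits, sent_logits
```

The paper also adds shortcut connections and successive regularization so that higher-level training does not destroy the lower-level tasks; both are omitted here for brevity.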
