
Tandon-A / emotic

Licence: MIT License
PyTorch implementation of Emotic CNN methodology to recognize emotions in images using context information.

Programming Languages

Python
Jupyter Notebook

Projects that are alternatives of or similar to emotic

emotic
Code repo for the EMOTIC dataset
Stars: ✭ 93 (+63.16%)
Mutual labels:  emotion-analysis, emotion-recognition, emotic
hfusion
Multimodal sentiment analysis using hierarchical fusion with context modeling
Stars: ✭ 42 (-26.32%)
Mutual labels:  emotion-analysis, emotion-detection, emotion-recognition
AIML-Human-Attributes-Detection-with-Facial-Feature-Extraction
This is a Human Attributes Detection program with facial features extraction. It detects facial coordinates using FaceNet model and uses MXNet facial attribute extraction model for extracting 40 types of facial attributes. This solution also detects Emotion, Age and Gender along with facial attributes.
Stars: ✭ 48 (-15.79%)
Mutual labels:  emotion-detection, emotion-recognition
sklearn-audio-classification
An in-depth analysis of audio classification on the RAVDESS dataset. Feature engineering, hyperparameter optimization, model evaluation, and cross-validation with a variety of ML techniques and MLP
Stars: ✭ 31 (-45.61%)
Mutual labels:  emotion-detection, emotion-recognition
Hemuer
An AI Tool to record expressions of users as they watch a video and then visualize the funniest parts of it!
Stars: ✭ 22 (-61.4%)
Mutual labels:  emotion-detection, emotion-recognition
EmotiW2018
No description or website provided.
Stars: ✭ 83 (+45.61%)
Mutual labels:  emotion-analysis, emotion-recognition
Text tone analyzer
A system that analyzes the sentiment of texts and statements.
Stars: ✭ 15 (-73.68%)
Mutual labels:  emotion-analysis, emotion-detection
Emotion and Polarity SO
An emotion classifier of text containing technical content from the SE domain
Stars: ✭ 74 (+29.82%)
Mutual labels:  emotion-detection, emotion-recognition
STEP
Spatial Temporal Graph Convolutional Networks for Emotion Perception from Gaits
Stars: ✭ 39 (-31.58%)
Mutual labels:  emotion-detection, emotion-recognition
XED
XED multilingual emotion datasets
Stars: ✭ 34 (-40.35%)
Mutual labels:  emotion-detection, emotion-recognition
homebridge-yeelight-platform
Homebridge plugin for Yeelight Lights supporting Scenes/Moods/Color Flow/Custom Presets/Music Flow/Night Mode
Stars: ✭ 53 (-7.02%)
Mutual labels:  scene
soxan
Wav2Vec for speech recognition, classification, and audio classification
Stars: ✭ 113 (+98.25%)
Mutual labels:  emotion-recognition
EmotionChallenge
Source code for 1st winner of face micro-emotion competition, FG 2017.
Stars: ✭ 37 (-35.09%)
Mutual labels:  emotion-recognition
Age-gender-and-emotion-recognition
3 networks to recognize age, gender and emotion
Stars: ✭ 29 (-49.12%)
Mutual labels:  emotion-recognition
WarezBot
Public Version of Discord bot for scene release
Stars: ✭ 30 (-47.37%)
Mutual labels:  scene
OpnEco
OpnEco is a Python3 project developed to aid content writers throughout the content writing process. By content writers, for content writers.
Stars: ✭ 18 (-68.42%)
Mutual labels:  emotion-detection
NeewerLite
NeewerLite is an un-official Neewer LED light control app for macOS.
Stars: ✭ 54 (-5.26%)
Mutual labels:  scene
dissertation
🎓 📜 This repository holds my final year and dissertation project during my time at the University of Lincoln titled 'Deep Learning for Emotion Recognition in Cartoons'.
Stars: ✭ 22 (-61.4%)
Mutual labels:  emotion-recognition
HiGRUs
Implementation of the paper "Hierarchical GRU for Utterance-level Emotion Recognition" in NAACL-2019.
Stars: ✭ 60 (+5.26%)
Mutual labels:  emotion-recognition
ntua-slp-semeval2018
Deep-learning models of NTUA-SLP team submitted in SemEval 2018 tasks 1, 2 and 3.
Stars: ✭ 79 (+38.6%)
Mutual labels:  emotion-recognition

Emotic

Humans use their facial features or expressions to convey how they feel: a person may smile when happy and scowl when angry. Historically, computer vision research has focused on analyzing and learning these facial features to recognize emotions. However, facial features are not universal and vary extensively across cultures and situations.

Fig 1: a) (Facial feature) The person looks angry or in pain b) (Whole scene) The person looks elated.

Scene context, as shown in the figure above, can provide additional information about the situation. This project explores the use of context in recognizing emotions in images.

Pipeline

The project uses the EMOTIC dataset and follows the methodology as introduced in the paper 'Context based emotion recognition using EMOTIC dataset'.

Pipeline

Fig 2: Model Pipeline (Image source)

Two feature extraction modules first extract features from an image: one over the target person's body and one over the entire image, which provides the scene context. A third fusion module then uses these features to predict the continuous dimensions (valence, arousal and dominance) and the discrete emotion categories.
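As a rough illustration of the fusion step, here is a minimal PyTorch sketch. The feature dimensions, hidden size and layer choices are illustrative assumptions, not the exact architecture from the paper; only the output sizes (26 discrete EMOTIC categories and 3 continuous dimensions) follow the methodology described above.

```python
import torch
import torch.nn as nn

class FusionHead(nn.Module):
    """Illustrative fusion module: combines body and context features to
    predict 26 discrete emotion categories and 3 continuous dimensions
    (valence, arousal, dominance). Sizes are assumptions, not the paper's."""

    def __init__(self, body_dim=2048, context_dim=2048, hidden_dim=256):
        super().__init__()
        self.fuse = nn.Sequential(
            nn.Linear(body_dim + context_dim, hidden_dim),
            nn.ReLU(),
        )
        self.cat_head = nn.Linear(hidden_dim, 26)   # discrete categories
        self.cont_head = nn.Linear(hidden_dim, 3)   # valence, arousal, dominance

    def forward(self, body_feat, context_feat):
        x = self.fuse(torch.cat([body_feat, context_feat], dim=1))
        return self.cat_head(x), self.cont_head(x)

# body_feat / context_feat would come from the two CNN feature extractors.
body_feat = torch.randn(4, 2048)
context_feat = torch.randn(4, 2048)
cat_logits, vad = FusionHead()(body_feat, context_feat)
print(cat_logits.shape, vad.shape)  # torch.Size([4, 26]) torch.Size([4, 3])
```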

Usage

Download the Emotic dataset and annotations and prepare the data directory with the following structure:

├── ...
│   ├── emotic
│   |    ├── ade20k
│   |    ├── emodb_small
│   |    ├── framesdb
│   |    ├── mscoco 
│   ├── Annotations
│   |    ├── Annotations.mat
  1. To convert the annotations from the mat object to CSV files and preprocess the data:
> python mat2py.py --data_dir proj/data/emotic19 --generate_npy
  • data_dir: Path of the directory containing the emotic and Annotations folders, as shown in the directory structure above.
  • generate_npy: If passed, npy files (later used for training and testing) are generated along with the CSV files; otherwise only CSV files are generated. A quick check of the generated files is sketched below.
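As a quick sanity check after this step, the generated arrays can be loaded directly. The file names below are assumptions about what mat2py.py writes; inspect the output directory (e.g. proj/data/emotic_pre) for the actual names.

```python
import numpy as np

# Hypothetical output names; check proj/data/emotic_pre for the files
# actually written by mat2py.py.
train_context = np.load('proj/data/emotic_pre/train_context_arr.npy')  # whole images
train_body = np.load('proj/data/emotic_pre/train_body_arr.npy')        # person crops
train_cat = np.load('proj/data/emotic_pre/train_cat_arr.npy')          # discrete labels
train_cont = np.load('proj/data/emotic_pre/train_cont_arr.npy')        # VAD labels

print(train_context.shape, train_body.shape)
print(train_cat.shape, train_cont.shape)
```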
  2. To train the model:
> python main.py --mode train --data_path proj/data/emotic_pre --experiment_path proj/debug_exp
  • mode: Mode to run the main file.
  • data_path: Path of the directory which contains the preprocessed data and CSV files generated in the first step.
  • experiment_path: Path of the experiment directory, where results, models and logs are saved.
  3. To test the model:
> python main.py --mode test --data_path proj/data/emotic_pre --experiment_path proj/debug_exp
  • mode: Mode to run the main file.
  • data_path: Path of the directory which contains the preprocessed data and CSV files generated in the first step.
  • experiment_path: Path of the experiment directory. Models stored in this directory are used for testing.
  4. To perform inference:
> python main.py --mode inference --inference_file proj/debug_exp/inference_file.txt --experiment_path proj/debug_exp
  • mode: Mode to run the main file.
  • inference_file: Text file listing the images on which to perform inference. Each row has the format 'full_path_of_image x1 y1 x2 y2', where (x1,y1) and (x2,y2) specify the person's bounding box. Refer to sample_inference_list.txt; a sketch for generating such a file follows this list.
  • experiment_path: Path of the experiment directory. Models stored in this directory are used for inference.
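A minimal sketch for writing an inference file in the documented format; the image paths and box coordinates below are placeholders, not files shipped with the project.

```python
# Build an inference list in the documented 'full_path_of_image x1 y1 x2 y2' format.
rows = [
    ('proj/data/images/person1.jpg', 24, 30, 180, 400),  # placeholder path and box
    ('proj/data/images/person2.jpg', 60, 12, 220, 350),
]

with open('proj/debug_exp/inference_file.txt', 'w') as f:
    for path, x1, y1, x2, y2 in rows:
        f.write(f'{path} {x1} {y1} {x2} {y2}\n')
```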

You can also train and test models on the Emotic dataset using the Colab_train_emotic notebook.

The trained models and the thresholds to use for inference are available here.
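The thresholds are per-category cutoffs for turning the model's 26 continuous category scores into discrete predictions. A hedged sketch of how such thresholds are typically applied; the scores, cutoff values and truncated category list below are illustrative, not the released values.

```python
import numpy as np

# First three of the 26 EMOTIC categories, for illustration only.
cat_names = ['Affection', 'Anger', 'Annoyance']
cat_scores = np.array([0.41, 0.08, 0.33])   # model outputs for one image
thresholds = np.array([0.25, 0.30, 0.40])   # per-category cutoffs

predicted = [n for n, s, t in zip(cat_names, cat_scores, thresholds) if s > t]
print(predicted)  # ['Affection']
```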

Results

Result GIF 1

Acknowledgements

Context Based Emotion Recognition using Emotic Dataset

Ronak Kosti, Jose Alvarez, Adria Recasens, Agata Lapedriza
[Paper] [Project Webpage] [Authors' Implementation]

@article{kosti2020context,
  title={Context based emotion recognition using emotic dataset},
  author={Kosti, Ronak and Alvarez, Jose M and Recasens, Adria and Lapedriza, Agata},
  journal={arXiv preprint arXiv:2003.13401},
  year={2020}
}

Author

Abhishek Tandon
