
30stomercury / Interaction-Aware-Attention-Network

Licence: other
[ICASSP19] An Interaction-aware Attention Network for Speech Emotion Recognition in Spoken Dialogs

Programming Languages

Python, Dockerfile, Shell

Projects that are alternatives of or similar to Interaction-Aware-Attention-Network

icassp2019-latex-template
ICASSP 2019 official LaTeX template
Stars: ✭ 21 (-34.37%)
Mutual labels:  icassp, icassp-2019
soxan
Wav2Vec for speech recognition, classification, and audio classification
Stars: ✭ 113 (+253.13%)
Mutual labels:  emotion-recognition, speech-emotion-recognition
speech-emotion-recognition
Speaker independent emotion recognition
Stars: ✭ 269 (+740.63%)
Mutual labels:  emotion-recognition, speech-emotion-recognition
EmotionalConversionStarGAN
This repository contains code to replicate results from the ICASSP 2020 paper "StarGAN for Emotional Speech Conversion: Validated by Data Augmentation of End-to-End Emotion Recognition".
Stars: ✭ 92 (+187.5%)
Mutual labels:  emotion-recognition, icassp
RECCON
This repository contains the dataset and the PyTorch implementations of the models from the paper Recognizing Emotion Cause in Conversations.
Stars: ✭ 126 (+293.75%)
Mutual labels:  emotion-recognition
Emotion-Recognition
Emotion recognition from EEG and physiological signals using deep neural networks
Stars: ✭ 35 (+9.38%)
Mutual labels:  emotion-recognition
facial-expression-recognition
Facial Expression Recognition Using CNN and Haar-Cascade
Stars: ✭ 44 (+37.5%)
Mutual labels:  emotion-recognition
Mimic Me CV Game
This repo holds the code of a simple, fun game built using Affectiva's Emotion-as-a-Service API. An emoji is shown on the screen and one has to mimic the emoji to score points.
Stars: ✭ 20 (-37.5%)
Mutual labels:  emotion-recognition
AGHMN
Implementation of the paper "Real-Time Emotion Recognition via Attention Gated Hierarchical Memory Network" in AAAI-2020.
Stars: ✭ 25 (-21.87%)
Mutual labels:  emotion-recognition
Resnet-Emotion-Recognition
Identifies emotion(s) from user facial expressions
Stars: ✭ 21 (-34.37%)
Mutual labels:  emotion-recognition
Openpose-based-GUI-for-Realtime-Pose-Estimate-and-Action-Recognition
GUI based on the Python API of OpenPose on Windows using CUDA 10 and cuDNN 7. Supports body, hand, and face keypoint estimation and data saving. Real-time gesture recognition is realized through a two-layer neural network based on the skeleton collected from the GUI.
Stars: ✭ 69 (+115.63%)
Mutual labels:  emotion-recognition
attentive-modality-hopping-for-SER
TensorFlow implementation of "Attentive Modality Hopping for Speech Emotion Recognition," ICASSP-20
Stars: ✭ 25 (-21.87%)
Mutual labels:  speech-emotion-recognition
TabFormer
Code & Data for "Tabular Transformers for Modeling Multivariate Time Series" (ICASSP, 2021)
Stars: ✭ 209 (+553.13%)
Mutual labels:  icassp
lstm-attention
Attention-based bidirectional LSTM for Classification Task (ICASSP)
Stars: ✭ 87 (+171.88%)
Mutual labels:  emotion-recognition
wavenet-classifier
Keras Implementation of Deepmind's WaveNet for Supervised Learning Tasks
Stars: ✭ 54 (+68.75%)
Mutual labels:  speech-emotion-recognition
emotic
Code repo for the EMOTIC dataset
Stars: ✭ 93 (+190.63%)
Mutual labels:  emotion-recognition
fer
Facial Expression Recognition
Stars: ✭ 32 (+0%)
Mutual labels:  emotion-recognition
emotion-and-gender-classification
Two networks to recognize gender and emotion; face detection using OpenCV or MTCNN
Stars: ✭ 21 (-34.37%)
Mutual labels:  emotion-recognition
emotion-recognition-GAN
This project is a semi-supervised approach to detect emotions on faces in-the-wild using GAN
Stars: ✭ 20 (-37.5%)
Mutual labels:  emotion-recognition
open-speech-corpora
💎 A list of accessible speech corpora for ASR, TTS, and other Speech Technologies
Stars: ✭ 841 (+2528.13%)
Mutual labels:  speech-emotion-recognition

Interaction-aware Attention Network

Model Overview

[Model overview figure]

Data:

For data descriptions of IEMOCAP, please refer to here.

Requirements

Some required libraries:

python                   >=3.6
tensorflow-gpu           1.11.0
joblib                   0.13.0
pandas                   0.22.0
scikit-learn             0.19.1
numpy                    1.15.3

Code:

Code              Description
data.py           Batch generator & data generator (a minimal sketch follows this table).
model.py          Main model code.
hyparams.py       Hyperparameters.
script_train.py   Training script.
script_test.py    Testing script.
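
For illustration, here is a minimal sketch of what a padded batch generator like the one in data.py could look like. The function name and arguments below are hypothetical and are not taken from the repository; it simply zero-pads variable-length utterance features within each batch:

import numpy as np

def batch_generator(features, labels, batch_size=32, shuffle=True):
    """Hypothetical batch generator: yields zero-padded batches of
    variable-length utterance features with their sequence lengths and labels."""
    idx = np.arange(len(features))
    if shuffle:
        np.random.shuffle(idx)
    for start in range(0, len(idx), batch_size):
        batch_idx = idx[start:start + batch_size]
        batch_feats = [features[i] for i in batch_idx]
        max_len = max(f.shape[0] for f in batch_feats)
        feat_dim = batch_feats[0].shape[1]
        # Zero-pad every utterance in the batch to the same length.
        padded = np.zeros((len(batch_feats), max_len, feat_dim), dtype=np.float32)
        seq_lens = np.zeros(len(batch_feats), dtype=np.int32)
        for j, f in enumerate(batch_feats):
            padded[j, :f.shape[0], :] = f
            seq_lens[j] = f.shape[0]
        yield padded, seq_lens, np.asarray([labels[i] for i in batch_idx])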

To evaluate our model under realistic scenarios, we adopt leave-one-session-out cross-validation.
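
IEMOCAP consists of five recorded sessions, so leave-one-session-out cross-validation trains on four sessions and tests on the held-out one, cycling through all five folds. Below is a minimal sketch of the split, assuming each utterance ID carries its session prefix (e.g. Ses01...); the helper is illustrative and not the repository's code:

SESSIONS = ["Ses01", "Ses02", "Ses03", "Ses04", "Ses05"]  # the five IEMOCAP sessions

def leave_one_session_out(utterance_ids):
    """Yield (train_ids, test_ids) pairs, holding out one session per fold."""
    for held_out in SESSIONS:
        test_ids = [u for u in utterance_ids if u.startswith(held_out)]
        train_ids = [u for u in utterance_ids if not u.startswith(held_out)]
        yield train_ids, test_ids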

Run:

For feature pooling to reduce computational cost:

python3 pool_feats.py --input_file INPUT_FILE --output_file OUTPUT_FILE --feat_dim FEAT_DIM --step STEP --max_size MAX_SIZE
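
The pooling step is essentially a strided average over consecutive frames, shortening each utterance's frame sequence before it reaches the recurrent layers. A minimal sketch of that idea follows, with step and max_size loosely mirroring the --step and --max_size arguments above; this helper is illustrative and does not reproduce pool_feats.py:

import numpy as np

def mean_pool(frames, step=5, max_size=2500):
    """Average every `step` consecutive frames of a (T, feat_dim) array,
    then truncate to at most `max_size` pooled frames."""
    n_pooled = int(np.ceil(frames.shape[0] / step))
    pooled = np.stack([frames[i * step:(i + 1) * step].mean(axis=0)
                       for i in range(n_pooled)])
    return pooled[:max_size]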

For training:

python3 script_train.py --seq_dim SEQ_DIM \
                        --atten_size ATTEN_SIZE \
                        --batch_size BATCH_SIZE \
                        --model_dir MODEL_DIR \
                        --record_file outputs/RECORD_FILE.json \
                        --feat_dir data/XXX.pkl

For testing:

python3 script_test.py  --result_file outputs/RECORD_FILE.json --feat_dir data/XXX.pkl

I include the whole process and hyperparameters in a single script:

sh run.sh

Citation

@inproceedings{yeh2019interaction,
  title={An interaction-aware attention network for speech emotion recognition in spoken dialogs},
  author={Yeh, Sung-Lin and Lin, Yun-Shao and Lee, Chi-Chun},
  booktitle={ICASSP 2019-2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)},
  pages={6685--6689},
  year={2019},
  organization={IEEE}
}