UttaranB127 / STEP

License: MIT
Spatial Temporal Graph Convolutional Networks for Emotion Perception from Gaits

Programming Languages

Python

Projects that are alternatives of or similar to STEP

hfusion
Multimodal sentiment analysis using hierarchical fusion with context modeling
Stars: ✭ 42 (+7.69%)
Mutual labels:  emotion-detection, emotion-recognition
Emotion and Polarity SO
An emotion classifier of text containing technical content from the SE domain
Stars: ✭ 74 (+89.74%)
Mutual labels:  emotion-detection, emotion-recognition
GaitRecognition
Gait demo for tutorial of ICPR 2016
Stars: ✭ 61 (+56.41%)
Mutual labels:  gait, gait-recognition
XED
XED multilingual emotion datasets
Stars: ✭ 34 (-12.82%)
Mutual labels:  emotion-detection, emotion-recognition
Hemuer
An AI Tool to record expressions of users as they watch a video and then visualize the funniest parts of it!
Stars: ✭ 22 (-43.59%)
Mutual labels:  emotion-detection, emotion-recognition
pytorch-GaitGAN
GaitGAN: Invariant Gait Feature Extraction Using Generative Adversarial Networks
Stars: ✭ 45 (+15.38%)
Mutual labels:  gait, gait-analysis
AIML-Human-Attributes-Detection-with-Facial-Feature-Extraction
This is a Human Attributes Detection program with facial features extraction. It detects facial coordinates using FaceNet model and uses MXNet facial attribute extraction model for extracting 40 types of facial attributes. This solution also detects Emotion, Age and Gender along with facial attributes.
Stars: ✭ 48 (+23.08%)
Mutual labels:  emotion-detection, emotion-recognition
sklearn-audio-classification
An in-depth analysis of audio classification on the RAVDESS dataset. Feature engineering, hyperparameter optimization, model evaluation, and cross-validation with a variety of ML techniques and MLP
Stars: ✭ 31 (-20.51%)
Mutual labels:  emotion-detection, emotion-recognition
emotic
PyTorch implementation of Emotic CNN methodology to recognize emotions in images using context information.
Stars: ✭ 57 (+46.15%)
Mutual labels:  emotion-detection, emotion-recognition
m3f.pytorch
PyTorch code for "M³T: Multi-Modal Multi-Task Learning for Continuous Valence-Arousal Estimation"
Stars: ✭ 20 (-48.72%)
Mutual labels:  affective-computing, emotion-recognition
TraND
This is the code for the paper "Jinkai Zheng, Xinchen Liu, Chenggang Yan, Jiyong Zhang, Wu Liu, Xiaoping Zhang and Tao Mei: TraND: Transferable Neighborhood Discovery for Unsupervised Cross-domain Gait Recognition. ISCAS 2021" (Best Paper Award - Honorable Mention)
Stars: ✭ 32 (-17.95%)
Mutual labels:  gait, gait-recognition
Representation Learning on Graphs with Jumping Knowledge Networks
Stars: ✭ 31 (-20.51%)
Mutual labels:  graph-convolutional-networks
GaitAnalysisToolKit
Tools for the Cleveland State Human Motion and Control Lab
Stars: ✭ 85 (+117.95%)
Mutual labels:  gait
Twords
Twitter Word Frequency Analysis
Stars: ✭ 17 (-56.41%)
Mutual labels:  dataset-generation
Resnet-Emotion-Recognition
Identifies emotion(s) from user facial expressions
Stars: ✭ 21 (-46.15%)
Mutual labels:  emotion-recognition
linguistic-style-transfer-pytorch
Implementation of "Disentangled Representation Learning for Non-Parallel Text Style Transfer(ACL 2019)" in Pytorch
Stars: ✭ 55 (+41.03%)
Mutual labels:  variational-autoencoder
AGHMN
Implementation of the paper "Real-Time Emotion Recognition via Attention Gated Hierarchical Memory Network" in AAAI-2020.
Stars: ✭ 25 (-35.9%)
Mutual labels:  emotion-recognition
graphml-tutorials
Tutorials for Machine Learning on Graphs
Stars: ✭ 125 (+220.51%)
Mutual labels:  graph-convolutional-networks
PaiConvMesh
Official repository for the paper "Learning Local Neighboring Structure for Robust 3D Shape Representation"
Stars: ✭ 19 (-51.28%)
Mutual labels:  graph-convolutional-networks
continuous Bernoulli
There are C language computer programs about the simulator, transformation, and test statistic of continuous Bernoulli distribution. More than that, the book contains continuous Binomial distribution and continuous Trinomial distribution.
Stars: ✭ 22 (-43.59%)
Mutual labels:  variational-autoencoder

This is the official implementation of the paper STEP: Spatial Temporal Graph Convolutional Networks for Emotion Perception from Gaits. Please use the following citation if you find our work useful:

@inproceedings{bhattacharya2020step,
  author = {Bhattacharya, Uttaran and Mittal, Trisha and Chandra, Rohan and Randhavane, Tanmay and Bera, Aniket and Manocha, Dinesh},
  title = {STEP: Spatial Temporal Graph Convolutional Networks for Emotion Perception from Gaits},
  year = {2020},
  publisher = {AAAI Press},
  booktitle = {Proceedings of the Thirty-Fourth AAAI Conference on Artificial Intelligence},
  pages = {1342--1350},
  numpages = {9},
  series = {AAAI'20}
}

We have also released the Emotion-Gait dataset with this code, which is available for download here: https://go.umd.edu/emotion-gait.

The repository is organized into the following components:

  1. generator_cvae is the generator.

  2. classifier_stgcn_real_only is the baseline classifier trained on only the 342 real gaits.

  3. classifier_stgcn_real_and_synth is the baseline classifier trained on both the 342 real gaits and the N synthetic gaits.

  4. classifier_hybrid is the hybrid classifier using both deep and physiologically-motivated features.

  5. compute_aff_features consists of the set of scripts that compute the affective features from 16-joint pose sequences. Calling main.py with the correct data path computes the features and saves them in the affectiveFeatures<f_type>.h5 file, where f_type is the desired type of features:

    • '' — the original data (default)
    • '4DCVAEGCN' — data generated by the CVAE
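As a rough illustration of the physiologically-motivated affective features mentioned above, the sketch below computes one feature commonly derived from pose sequences: the angle at a joint given three 3D joint positions. The function name and the choice of feature are illustrative assumptions, not the repository's actual implementation.

```python
import math

def joint_angle(a, b, c):
    """Angle (radians) at joint b formed by the segments b->a and b->c.

    In a 16-joint pose, angles such as those at the neck or knees are
    typical inputs to physiologically-motivated affective features.
    """
    v1 = [ai - bi for ai, bi in zip(a, b)]
    v2 = [ci - bi for ci, bi in zip(c, b)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    # Clamp to [-1, 1] to guard against floating-point drift.
    return math.acos(max(-1.0, min(1.0, dot / (n1 * n2))))

# Perpendicular segments along the x- and y-axes give a right angle.
angle = joint_angle((1, 0, 0), (0, 0, 0), (0, 1, 0))  # pi / 2
```

Features like this would then be computed per frame over a pose sequence before being collected into the output .h5 file.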