
zengqunzhao / MA-Net

License: MIT
“Learning Deep Global Multi-scale and Local Attention Features for Facial Expression Recognition in the Wild”, IEEE T-IP, 2021.

Programming Languages

python

Projects that are alternatives to or similar to MA-Net

Action-Units-Heatmaps
Code for BMVC paper "Joint Action Unit localisation and intensity estimation through heatmap regression"
Stars: ✭ 80 (+50.94%)
Mutual labels:  facial-expression-recognition
webCamEmocognizer
A cool emotion detector using your laptop/desktop webcam
Stars: ✭ 57 (+7.55%)
Mutual labels:  facial-expression-recognition
facial-expression-recognition
Facial Expression Recognition Using CNN and Haar-Cascade
Stars: ✭ 44 (-16.98%)
Mutual labels:  facial-expression-recognition
Amend-Representation-Module
ARM - Official PyTorch Implementation
Stars: ✭ 53 (+0%)
Mutual labels:  facial-expression-recognition
Facial-Expression-Recognition
A PyTorch implementation of FER (facial expression recognition)
Stars: ✭ 27 (-49.06%)
Mutual labels:  facial-expression-recognition
ferattention
FERAtt: Facial Expression Recognition with Attention Net
Stars: ✭ 69 (+30.19%)
Mutual labels:  facial-expression-recognition
Facial-Expression-Recognition
Facial-Expression-Recognition using TensorFlow
Stars: ✭ 19 (-64.15%)
Mutual labels:  facial-expression-recognition
Hemuer
An AI tool that records users' expressions as they watch a video and then visualizes its funniest parts!
Stars: ✭ 22 (-58.49%)
Mutual labels:  facial-expression-recognition
FMPN-FER
Official PyTorch Implementation of 'Facial Motion Prior Networks for Facial Expression Recognition', VCIP 2019, Oral
Stars: ✭ 76 (+43.4%)
Mutual labels:  facial-expression-recognition
AIML-Human-Attributes-Detection-with-Facial-Feature-Extraction
This is a human-attributes detection program with facial-feature extraction. It detects facial coordinates using the FaceNet model and uses an MXNet facial-attribute extraction model to extract 40 types of facial attributes. The solution also detects emotion, age, and gender alongside the facial attributes.
Stars: ✭ 48 (-9.43%)
Mutual labels:  facial-expression-recognition
Emotion-Investigator
An Exciting Deep Learning-based Flask web app that predicts the Facial Expressions of users and also does Graphical Visualization of the Expressions.
Stars: ✭ 44 (-16.98%)
Mutual labels:  facial-expression-recognition
fer
Facial Expression Recognition
Stars: ✭ 32 (-39.62%)
Mutual labels:  facial-expression-recognition
facial-expression-recognition
The main purpose of the project is recognition of emotions based on facial expressions. The Cohn-Kanade dataset (http://www.pitt.edu/~emotion/ck-spread.htm) is used for exploration and training.
Stars: ✭ 60 (+13.21%)
Mutual labels:  facial-expression-recognition

MA-Net

Zengqun Zhao, Qingshan Liu, Shanmin Wang. "Learning Deep Global Multi-scale and Local Attention Features for Facial Expression Recognition in the Wild". IEEE Transactions on Image Processing.

Requirements

  • Python >= 3.6
  • PyTorch >= 1.2
  • torchvision >= 0.4.0

Training

  • Step 1: download the basic-emotion dataset of RAF-DB and make sure it has the following directory structure:
./RAF-DB/
         train/
               0/
                 train_09748.jpg
                 ...
                 train_12271.jpg
               1/
               ...
               6/
         test/
              0/
              ...
              6/

[Note] 0: Neutral; 1: Happiness; 2: Sadness; 3: Surprise; 4: Fear; 5: Disgust; 6: Anger
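Before training, it can be useful to verify that the dataset was unpacked into the expected layout. The following is a minimal, standard-library-only sketch (the helper name check_rafdb_layout is mine, not part of this repository); the class-index-to-emotion mapping follows the note above.

```python
from pathlib import Path

# Class-index to emotion-name mapping, as stated in the README note.
EMOTIONS = {0: "Neutral", 1: "Happiness", 2: "Sadness",
            3: "Surprise", 4: "Fear", 5: "Disgust", 6: "Anger"}

def check_rafdb_layout(root):
    """Return a list of missing split/class directories under root.

    Expects root to contain train/ and test/, each with
    subdirectories 0/ through 6/ holding the images.
    """
    root = Path(root)
    missing = []
    for split in ("train", "test"):
        for label in EMOTIONS:
            class_dir = root / split / str(label)
            if not class_dir.is_dir():
                missing.append(str(class_dir))
    return missing
```

An empty return value means all 14 split/class directories are present; otherwise the list names the missing ones.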
  • Step 2: download the pre-trained model from Google Drive and put it into ./checkpoint.

  • Step 3: change data_path in main.py to your dataset path.

  • Step 4: run python main.py
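Step 3 can also be scripted. The sketch below is an assumption about how main.py is written: it presumes the file contains a single assignment of the form data_path = '...', which a regular expression rewrites in place. The helper name set_data_path is hypothetical, not part of this repository.

```python
import re
from pathlib import Path

def set_data_path(main_py, new_path):
    """Rewrite the first `data_path = '...'` assignment in main.py.

    Assumes main.py assigns data_path as a quoted string literal;
    raises ValueError if no such assignment is found.
    """
    main_py = Path(main_py)
    src = main_py.read_text()
    updated, count = re.subn(r"data_path\s*=\s*['\"][^'\"]*['\"]",
                             f"data_path = '{new_path}'", src, count=1)
    if count == 0:
        raise ValueError("no data_path assignment found in main.py")
    main_py.write_text(updated)
    return updated
```

After the rewrite, training proceeds as in Step 4 with python main.py.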

Citation

@article{zhao2021learning,
  title={Learning Deep Global Multi-scale and Local Attention Features for Facial Expression Recognition in the Wild},
  author={Zhao, Zengqun and Liu, Qingshan and Wang, Shanmin},
  journal={IEEE Transactions on Image Processing},
  volume={30},
  pages={6544-6556},
  year={2021},
  publisher={IEEE}
}

Note

The sample counts of the CAER-S dataset employed in our work should be: 69,982 samples in total, with 48,995 in the training set and 20,987 in the test set. We apologize for the typos in our paper.
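The corrected counts are internally consistent, as a one-line check confirms:

```python
# CAER-S sample counts from the note above: train + test should equal the total.
train, test, total = 48_995, 20_987, 69_982
assert train + test == total
```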
