
ESanchezLozano / Action-Units-Heatmaps

Licence: other
Code for BMVC paper "Joint Action Unit localisation and intensity estimation through heatmap regression"

Programming Languages

python

Projects that are alternatives of or similar to Action-Units-Heatmaps

webCamEmocognizer
A cool emotion detector using your laptop/desktop webcam
Stars: ✭ 57 (-28.75%)
Mutual labels:  facial-expression-recognition
facial-expression-recognition
Facial Expression Recognition Using CNN and Haar-Cascade
Stars: ✭ 44 (-45%)
Mutual labels:  facial-expression-recognition
Amend-Representation-Module
ARM - Official PyTorch Implementation
Stars: ✭ 53 (-33.75%)
Mutual labels:  facial-expression-recognition
Mathai
A photo-based maths solver: given an image containing a mathematical calculation, it outputs the recognised expression and the computed result. This is a mathematical expression recognition project.
Stars: ✭ 1,965 (+2356.25%)
Mutual labels:  expression-recognition
3D-facial-reconstruction
3D facial reconstruction, expression recognition and transfer from monocular RGB images with a deep convolutional auto-encoding neural network
Stars: ✭ 13 (-83.75%)
Mutual labels:  expression-recognition
Facial-Expression-Recognition
A Pytorch Implementation of FER( facial expression recognition )
Stars: ✭ 27 (-66.25%)
Mutual labels:  facial-expression-recognition
ferattention
FERAtt: Facial Expression Recognition with Attention Net
Stars: ✭ 69 (-13.75%)
Mutual labels:  facial-expression-recognition
Facial-Expression-Recognition
Facial-Expression-Recognition using tensorflow
Stars: ✭ 19 (-76.25%)
Mutual labels:  facial-expression-recognition
Hemuer
An AI Tool to record expressions of users as they watch a video and then visualize the funniest parts of it!
Stars: ✭ 22 (-72.5%)
Mutual labels:  facial-expression-recognition
FMPN-FER
Official PyTorch Implementation of 'Facial Motion Prior Networks for Facial Expression Recognition', VCIP 2019, Oral
Stars: ✭ 76 (-5%)
Mutual labels:  facial-expression-recognition
AIML-Human-Attributes-Detection-with-Facial-Feature-Extraction
This is a Human Attributes Detection program with facial features extraction. It detects facial coordinates using FaceNet model and uses MXNet facial attribute extraction model for extracting 40 types of facial attributes. This solution also detects Emotion, Age and Gender along with facial attributes.
Stars: ✭ 48 (-40%)
Mutual labels:  facial-expression-recognition
Emotion-Investigator
An Exciting Deep Learning-based Flask web app that predicts the Facial Expressions of users and also does Graphical Visualization of the Expressions.
Stars: ✭ 44 (-45%)
Mutual labels:  facial-expression-recognition
fer
Facial Expression Recognition
Stars: ✭ 32 (-60%)
Mutual labels:  facial-expression-recognition
facial-expression-recognition
The main purpose of the project is the recognition of emotions based on facial expressions. The Cohn-Kanade data set (http://www.pitt.edu/~emotion/ck-spread.htm) is used for exploration and training.
Stars: ✭ 60 (-25%)
Mutual labels:  facial-expression-recognition
MA-Net
“Learning Deep Global Multi-scale and Local Attention Features for Facial Expression Recognition in the Wild”, IEEE T-IP, 2021.
Stars: ✭ 53 (-33.75%)
Mutual labels:  facial-expression-recognition
emotion-recognition-GAN
This project is a semi-supervised approach to detect emotions on faces in-the-wild using GAN
Stars: ✭ 20 (-75%)
Mutual labels:  action-units
ganimation replicate
An Out-of-the-Box Replication of GANimation using PyTorch, pretrained weights are available!
Stars: ✭ 165 (+106.25%)
Mutual labels:  action-units
openFACS
openFACS : an open source FACS-based 3D face animation system
Stars: ✭ 70 (-12.5%)
Mutual labels:  facs

Joint Action Unit localisation and intensity estimation

This repository provides a class for Action Unit intensity estimation with heatmap regression, adapted from the code used for the BMVC paper "Joint Action Unit localisation and intensity estimation through heatmap regression" (see citation below).


The class takes an image and returns the heatmaps and the AU intensity predictions derived from them. To keep the class standalone, the facial points are detected with the dlib facial landmark detector; this will shortly be replaced by the iCCR tracker, whose Python implementation is underway (you can check the Matlab code here).
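
The basic call pattern for a single image is sketched below. It is a minimal sketch only, reusing the AUdetector constructor and detectAU method from the full example later in this README; the input filename is hypothetical:

import dlib
import AUmaps

# Build the detector from the dlib 68-point landmark model (enable_cuda=True uses the GPU if available)
AUdetector = AUmaps.AUdetector('shape_predictor_68_face_landmarks.dat', enable_cuda=False)

# detectAU returns the AU intensity predictions, the heatmaps and the processed image
img = dlib.load_rgb_image('face.png')  # hypothetical input image
pred, heatmaps, face = AUdetector.detectAU(img)
print(pred)  # one intensity value per Action Unit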

A usage example is included in the first release. Full scripts for reading folders and CSV files will follow soon.

The Hourglass model has been kindly adapted from the FAN network; you can check Adrian Bulat's code here.

Requirements

dlib --> pip install dlib Link

OpenCV --> pip install opencv-python Link

PyTorch --> follow the steps in https://pytorch.org/

It also requires scipy and matplotlib, and Python 3.x.

pip install dlib
pip install opencv-python
pip install scipy matplotlib
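
A quick sanity check that the dependencies import correctly is sketched below; it only prints the installed versions and is not specific to this repository:

import cv2
import dlib
import matplotlib
import scipy
import torch

# Print the detected versions and whether PyTorch can see a CUDA device
print('dlib', dlib.__version__)
print('OpenCV', cv2.__version__)
print('scipy', scipy.__version__)
print('matplotlib', matplotlib.__version__)
print('PyTorch', torch.__version__, 'CUDA available:', torch.cuda.is_available())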

Use

To use the code you need to download the dlib facial landmark detector from here and add it to your folder.
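
If you prefer to fetch the landmark model from a script, a minimal sketch using only the Python standard library is shown below; it assumes the model is still hosted at dlib.net under its usual name (shape_predictor_68_face_landmarks.dat.bz2):

import bz2
import urllib.request

# Assumed download location for the dlib 68-point landmark model
url = 'http://dlib.net/files/shape_predictor_68_face_landmarks.dat.bz2'
archive, _ = urllib.request.urlretrieve(url)

# Decompress the .bz2 archive into the working folder expected by the example below
with bz2.open(archive, 'rb') as src, open('shape_predictor_68_face_landmarks.dat', 'wb') as dst:
    dst.write(src.read())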

This is all you need to run the detector (the visualisation in this script is really poor, I will work on improving it):

import AUmaps
import glob
import dlib
import matplotlib.pyplot as plt

# Build the AU detector from the dlib 68-point landmark model (set enable_cuda=True to run on the GPU)
AUdetector = AUmaps.AUdetector('shape_predictor_68_face_landmarks.dat', enable_cuda=False)

# Frames of the example video, processed in order
path_imgs = 'example_video'
files = sorted(glob.glob(path_imgs + '/*.png'))

fig = plt.figure(figsize=plt.figaspect(.5))
for name in files:
    print(name)
    img = dlib.load_rgb_image(name)
    # detectAU returns the AU intensity predictions, the heatmaps and the processed image
    pred, heatmaps, img = AUdetector.detectAU(img)
    for j in range(0, 5):
        # Upsample the j-th heatmap to 256x256 so it can be shown next to the image
        resized_map = dlib.resize_image(heatmaps[j, :, :].cpu().data.numpy(), rows=256, cols=256)
        ax = fig.add_subplot(5, 2, 2 * j + 1)
        ax.imshow(img)
        ax.axis('off')
        ax = fig.add_subplot(5, 2, 2 * j + 2)
        ax.imshow(resized_map)
        ax.axis('off')
    plt.pause(.1)
    plt.draw()
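
Until the full folder/CSV scripts mentioned above are released, a minimal sketch of dumping the per-frame predictions to a CSV file could look like this (the output filename and column names are hypothetical; the loop mirrors the example above and assumes pred holds one intensity per Action Unit):

import csv
import glob

import dlib
import AUmaps

AUdetector = AUmaps.AUdetector('shape_predictor_68_face_landmarks.dat', enable_cuda=False)
files = sorted(glob.glob('example_video/*.png'))

with open('au_predictions.csv', 'w', newline='') as f:  # hypothetical output file
    writer = csv.writer(f)
    writer.writerow(['frame'] + ['AU_%d' % j for j in range(1, 6)])  # hypothetical column names
    for name in files:
        pred, heatmaps, face = AUdetector.detectAU(dlib.load_rgb_image(name))
        writer.writerow([name] + [float(p) for p in pred])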

Contributions

All contributions are welcome.

Citation

@inproceedings{sanchez2018bmvc,
  title = {Joint Action Unit localisation and intensity estimation through heatmap regression},
  author = {Enrique Sánchez-Lozano and Georgios Tzimiropoulos and Michel Valstar},
  booktitle = {BMVC},
  year = 2018
}