
Raschka-research-group / Coral Cnn

License: MIT
Rank Consistent Ordinal Regression for Neural Networks with Application to Age Estimation

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Coral Cnn

Ensemble Pytorch
A unified ensemble framework for Pytorch to improve the performance and robustness of your deep learning model
Stars: ✭ 153 (-17.74%)
Mutual labels:  deeplearning
Motion Sense
MotionSense Dataset for Human Activity and Attribute Recognition ( time-series data generated by smartphone's sensors: accelerometer and gyroscope)
Stars: ✭ 159 (-14.52%)
Mutual labels:  deeplearning
Awesome Deep Learning Music
List of articles related to deep learning applied to music
Stars: ✭ 2,195 (+1080.11%)
Mutual labels:  deeplearning
Java Deep Learning Cookbook
Code for Java Deep Learning Cookbook
Stars: ✭ 156 (-16.13%)
Mutual labels:  deeplearning
Pythonpark
A Python open source project, "The Road to Self-Taught Programming," with beginner-friendly, step-by-step tutorials: AI lab, curated videos, data structures, study guides, hands-on machine learning, hands-on deep learning, web crawling, interview experiences from big tech companies, life as a programmer, and resource sharing.
Stars: ✭ 4,294 (+2208.6%)
Mutual labels:  deeplearning
Fixy
Our aim is to build an open source spelling assistant/checker that solves many different problems in the Turkish NLP literature at once, proposes unique approaches, and addresses the shortcomings of existing work. It corrects spelling mistakes in user-written text with a deep learning approach and, by also performing semantic analysis of the text, can detect and correct the errors that arise in that context.
Stars: ✭ 165 (-11.29%)
Mutual labels:  deeplearning
Mariana
The Cutest Deep Learning Framework which is also a wonderful Declarative Language
Stars: ✭ 151 (-18.82%)
Mutual labels:  deeplearning
Datastories Semeval2017 Task4
Deep-learning model presented in "DataStories at SemEval-2017 Task 4: Deep LSTM with Attention for Message-level and Topic-based Sentiment Analysis".
Stars: ✭ 184 (-1.08%)
Mutual labels:  deeplearning
Cyclegan Vc2
Voice Conversion by CycleGAN (voice cloning / voice conversion): CycleGAN-VC2
Stars: ✭ 158 (-15.05%)
Mutual labels:  deeplearning
Deeplearning4j
Suite of tools for deploying and training deep learning models using the JVM. Highlights include model import for Keras, TensorFlow, and ONNX/PyTorch, a modular and tiny C++ library for running math code, and a Java-based math library on top of the core C++ library. Also includes SameDiff: a PyTorch/TensorFlow-like library for running deep learning…
Stars: ✭ 12,277 (+6500.54%)
Mutual labels:  deeplearning
Best ai paper 2020
A curated list of the latest breakthroughs in AI by release date with a clear video explanation, link to a more in-depth article, and code
Stars: ✭ 2,140 (+1050.54%)
Mutual labels:  deeplearning
Bmw Tensorflow Inference Api Cpu
This is a repository for an object detection inference API using the Tensorflow framework.
Stars: ✭ 158 (-15.05%)
Mutual labels:  deeplearning
Robomaster2018 Seu Opensource
This is the open source project for RoboMaster 2018 contest from Southeast University
Stars: ✭ 169 (-9.14%)
Mutual labels:  deeplearning
Zi2zi
Learning Chinese Character style with conditional GAN
Stars: ✭ 1,988 (+968.82%)
Mutual labels:  deeplearning
Deeplearning4j Examples
Deeplearning4j Examples (DL4J, DL4J Spark, DataVec)
Stars: ✭ 2,215 (+1090.86%)
Mutual labels:  deeplearning
Ai Art
PyTorch (and PyTorch Lightning) implementation of Neural Style Transfer, Pix2Pix, CycleGAN, and Deep Dream!
Stars: ✭ 153 (-17.74%)
Mutual labels:  deeplearning
Libra
Ergonomic machine learning for everyone.
Stars: ✭ 1,925 (+934.95%)
Mutual labels:  deeplearning
Clearml Server
ClearML - Auto-Magical Suite of tools to streamline your ML workflow. Experiment Manager, ML-Ops and Data-Management
Stars: ✭ 186 (+0%)
Mutual labels:  deeplearning
Deep white balance
Reference code for the paper: Deep White-Balance Editing, CVPR 2020 (Oral). Our method is a deep learning multi-task framework for white-balance editing.
Stars: ✭ 184 (-1.08%)
Mutual labels:  deeplearning
Speech Emotion Recognition
Speaker independent emotion recognition
Stars: ✭ 169 (-9.14%)
Mutual labels:  deeplearning

Rank-consistent Ordinal Regression for Neural Networks

This repository contains the PyTorch model code and training logs used in the paper "Rank-consistent Ordinal Regression for Neural Networks with Application to Age Estimation."

If you are primarily interested in using CORAL, a separate PyTorch library with tutorials can be found here:


PyTorch Model Code

Note that the model code is identical across the different datasets; however, the file paths to the datasets are hard-coded at the top of each file, and the dataloaders are specific to the corresponding dataset organization. If you wish to run the code, you will likely need to change the file paths in the scripts depending on where you saved the image datasets and label files.

All code was run with PyTorch 1.5 and Python 3.7; we do not guarantee upward or downward compatibility with other PyTorch and Python versions.

The model code can be found in the [./model-code](./model-code) subdirectory, and the code files are labeled using the scheme

<dataset>-<loss>.py
  • <dataset> refers to either AFAD (afad), MORPH-2 (morph), or CACD (cacd).

  • <loss> refers to either CORAL (coral), ordinal regression as in Niu et al. (ordinal), or cross-entropy (ce).

Example

The following command trains a CORAL model on the AFAD dataset (the flags are described below, followed by a short sketch of how they could be wired up):

python afad-coral.py --seed 1 --cuda 0 --outpath afad-model1
  • --seed <int>: Integer for the random seed; used for training set shuffling and the model weight initialization (note that CUDA convolutions are not fully deterministic).

  • --cuda <int>: The CUDA device number of the GPU to be used for training (--cuda 0 refers to the 1st GPU).

  • --outpath <directory>: Path for saving the training log (training.log) and the parameters of the trained model (model.pt).
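For orientation, here is a minimal sketch of how these flags could be handled in a training script. The argument names match the ones above, but the parsing and setup code is only illustrative and not a copy of the repository's scripts.

```python
# Illustrative sketch only -- not the repository's actual flag handling.
import argparse
import os
import torch

parser = argparse.ArgumentParser()
parser.add_argument('--seed', type=int, default=1,
                    help='Random seed for shuffling and weight initialization.')
parser.add_argument('--cuda', type=int, default=0,
                    help='Index of the CUDA device to train on.')
parser.add_argument('--outpath', type=str, required=True,
                    help='Directory for training.log and model.pt.')
args = parser.parse_args()

torch.manual_seed(args.seed)          # CUDA convolutions remain nondeterministic
device = torch.device(f'cuda:{args.cuda}')
os.makedirs(args.outpath, exist_ok=True)
```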

Here is an overview of the differences between a regular CNN and a CORAL-CNN:

[Figure: comparison of a regular CNN and a CORAL-CNN output layer; a high-resolution version is linked in the repository.]
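As a rough code-level illustration (a simplified reconstruction based on the paper, not the repository's model code), the sketch below contrasts a conventional output layer with K class outputs against a CORAL output layer, which uses a single shared weight vector plus K-1 independent bias units for the binary rank tasks; `num_features` and `num_classes` are placeholders.

```python
import torch

num_classes = 26      # e.g., AFAD has 26 age labels (0-25); placeholder value
num_features = 512    # size of the backbone's feature vector; placeholder value

# Regular CNN head: an independent weight vector and bias for every class.
regular_head = torch.nn.Linear(num_features, num_classes)

# CORAL-CNN head: one shared weight vector (no bias) plus K-1 independent
# bias units, which yields rank-consistent binary outputs.
coral_weight = torch.nn.Linear(num_features, 1, bias=False)
coral_biases = torch.nn.Parameter(torch.zeros(num_classes - 1))

features = torch.randn(8, num_features)                 # stand-in backbone output
regular_logits = regular_head(features)                 # shape: (8, num_classes)
coral_logits = coral_weight(features) + coral_biases    # shape: (8, num_classes - 1)
```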


Training Logs and Trained Models from the Paper

We share all training logs in this GitHub repository under the ./experiment-logs subdirectory. Due to the large file size (85 MB per model), we could not share the trained models on GitHub; however, all trained models can be downloaded from Google Drive via the following link: https://drive.google.com/drive/folders/168ijUQyvGLhHoQUQMlFS2fVt2p5ZV2bD?usp=sharing.

Image files

The image files of the face image datasets are available from the following websites:

Data preprocessing code

We provide the dataset preprocessing code that we used to prepare the CACD and MORPH-2 datasets as described in the paper. The code is located in the [./datasets/image-preprocessing-code](./datasets/image-preprocessing-code) subdirectory. AFAD did not need further preprocessing.

Labels and train/test splits

We provide the age labels (obtained from the original dataset resources) and the train/test splits we used in CSV format in the ./datasets/train-test-csv subdirectory (a small label-to-age conversion sketch follows the list below):

  • CACD: labels 0-48 correspond to ages 14-62
  • AFAD: labels 0-25 correspond to ages 15-40
  • MORPH-2: labels 0-54 correspond to ages 16-70
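In other words, the CSV files store zero-based class labels, and the age in years is recovered by adding a dataset-specific offset. A small sketch of that mapping, using the offsets implied by the list above:

```python
# Offsets implied by the label ranges listed above.
AGE_OFFSET = {'cacd': 14, 'afad': 15, 'morph': 16}

def label_to_age(label, dataset):
    """Convert a zero-based class label from the CSV files into an age in years."""
    return label + AGE_OFFSET[dataset]

assert label_to_age(0, 'cacd') == 14 and label_to_age(48, 'cacd') == 62
assert label_to_age(25, 'afad') == 40
assert label_to_age(54, 'morph') == 70
```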

Using Trained Models

We share the pre-trained models from the paper, which can be used to make predictions on AFAD, MORPH-2, or CACD images. Please see the README in the single-image-prediction__w-pretrained-models subdirectory for details.
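For a sense of what prediction looks like, recall that a CORAL model outputs K-1 binary logits and the predicted class label is the number of tasks whose probability exceeds 0.5, which is then shifted into the age range as described above. The snippet below is a sketch of that rule only; the commented-out usage assumes a `model` object as defined in the training scripts and a `model.pt` checkpoint produced via --outpath, so refer to the subdirectory's README for the actual loading code.

```python
import torch

def coral_logits_to_label(logits):
    """Convert CORAL logits of shape (batch, K-1) into integer class labels."""
    probas = torch.sigmoid(logits)
    return (probas > 0.5).sum(dim=1)

# Hypothetical usage (names are placeholders; see the subdirectory README):
# model.load_state_dict(torch.load('afad-model1/model.pt', map_location='cpu'))
# model.eval()
# with torch.no_grad():
#     logits = model(image_tensor)               # image_tensor: (1, 3, H, W)
#     age = coral_logits_to_label(logits) + 15   # +15 maps AFAD labels 0-25 to ages 15-40
```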


Implementations for Other Deep Learning Frameworks

Porting Guide

Our models were originally implemented in PyTorch 1.5. A recipe for porting the code is provided in coral-implementation-recipe.ipynb. Also see the file diff comparing a CORAL-CNN with a regular CNN.
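To give a sense of what needs to be ported, the snippet below sketches the two framework-specific pieces at the heart of CORAL: extending an integer class label into K-1 binary levels, and the (optionally importance-weighted) binary cross-entropy over the K-1 logits. This is a reconstruction from the paper for illustration; the notebook and file diff above show the exact code used in this repository.

```python
import torch
import torch.nn.functional as F

def label_to_levels(labels, num_classes):
    """Extend integer labels of shape (batch,) into binary levels of shape (batch, K-1)."""
    ranks = torch.arange(num_classes - 1, device=labels.device)
    return (labels.unsqueeze(1) > ranks).float()

def coral_loss(logits, levels, importance_weights=None):
    """(Optionally weighted) binary cross-entropy summed over the K-1 CORAL tasks.

    Uses log(1 - sigmoid(x)) = logsigmoid(x) - x for numerical stability.
    """
    log_p = F.logsigmoid(logits)
    term = log_p * levels + (log_p - logits) * (1.0 - levels)
    if importance_weights is not None:
        term = term * importance_weights
    return -term.sum(dim=1).mean()

# Example: 4 samples, 26 classes (K-1 = 25 binary tasks), random logits.
labels = torch.tensor([0, 3, 12, 25])
levels = label_to_levels(labels, num_classes=26)
loss = coral_loss(torch.randn(4, 25), levels)
```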

Keras

A Keras port of this code was recently developed and made available at https://github.com/ck37/coral-ordinal.
