
oawiles / Fab Net

Licence: MIT
PyTorch code for the BMVC 2018 paper

Projects that are alternatives of or similar to Fab Net

Equalareacartogram
Converts a Shapefile, GeoJSON, or CSV to an equal area cartogram
Stars: ✭ 68 (-1.45%)
Mutual labels:  jupyter-notebook
Vkapi Course
A Python course on working with the VK API
Stars: ✭ 68 (-1.45%)
Mutual labels:  jupyter-notebook
Class2021spring
Stars: ✭ 69 (+0%)
Mutual labels:  jupyter-notebook
Predictive Analytics With Tensorflow
Predictive Analytics with TensorFlow, published by Packt
Stars: ✭ 68 (-1.45%)
Mutual labels:  jupyter-notebook
Timeflow
Tensorflow for Time Series Applications
Stars: ✭ 68 (-1.45%)
Mutual labels:  jupyter-notebook
Nsgaiii
An implementation of NSGA-III in Python.
Stars: ✭ 67 (-2.9%)
Mutual labels:  jupyter-notebook
Predictive Maintenance
Demonstration of MapR for Industrial IoT
Stars: ✭ 68 (-1.45%)
Mutual labels:  jupyter-notebook
Machine Learning With Pyspark
Source Code for 'Machine Learning with PySpark' by Pramod Singh
Stars: ✭ 69 (+0%)
Mutual labels:  jupyter-notebook
Tensorflow
This repository contains TensorFlow tutorials.
Stars: ✭ 68 (-1.45%)
Mutual labels:  jupyter-notebook
Presentations
Presentations for JuliaCon
Stars: ✭ 68 (-1.45%)
Mutual labels:  jupyter-notebook
Etl with python
ETL with Python - Taught at DWH course 2017 (TAU)
Stars: ✭ 68 (-1.45%)
Mutual labels:  jupyter-notebook
P Multimodal Dataset Toolbox
Stars: ✭ 68 (-1.45%)
Mutual labels:  jupyter-notebook
Info490 Sp16
INFO 490: Advanced Data Science, offered in the Spring 2016 Semester at the University of Illinois
Stars: ✭ 68 (-1.45%)
Mutual labels:  jupyter-notebook
Cifar10 mxnet
Code for the Kaggle CIFAR10 competition, written with mxnet
Stars: ✭ 68 (-1.45%)
Mutual labels:  jupyter-notebook
Python
Python Tutorials
Stars: ✭ 69 (+0%)
Mutual labels:  jupyter-notebook
Encode Attend Navigate
Learning Heuristics for the TSP by Policy Gradient
Stars: ✭ 68 (-1.45%)
Mutual labels:  jupyter-notebook
Deeplearning tutorial
Stars: ✭ 68 (-1.45%)
Mutual labels:  jupyter-notebook
Stock Market Analysis
Stock Market Analysis with RNN and Time Series
Stars: ✭ 69 (+0%)
Mutual labels:  jupyter-notebook
Kalman Filters
Kalman filtering, also known as linear quadratic estimation (LQE), is an algorithm that uses a series of measurements observed over time, containing statistical noise and other inaccuracies, and produces estimates of unknown variables that tend to be more accurate than those based on a single measurement alone, by estimating a joint probability distribution over the variables for each timeframe. The filter is named after Rudolf E. Kálmán, one of the primary developers of its theory.
Stars: ✭ 69 (+0%)
Mutual labels:  jupyter-notebook
Econometrics
Code and notebooks for Econometric Theory
Stars: ✭ 67 (-2.9%)
Mutual labels:  jupyter-notebook

This is the code for "Self-supervised learning of a facial attribute embedding from video" (BMVC 2018).

Note that this is a refactored version of the original code, so the numbers it produces may not exactly match those given in the paper. More importantly, the original results were obtained with a version of PyTorch compiled from source, so with a standard PyTorch release:

  • the models may be difficult to load, and
  • results may differ slightly (especially as the implementation of the sampler appears to have changed slightly between versions).

Running demo code

FAb-Net/code/demo.ipynb gives the demo code: it shows how to load a trained model together with the subsequently trained linear layer described in the paper, and use them to predict various properties. It is self-contained. For these regressions, a single file stores the original model parameters plus the linear layers. You can try it on your own images or train your own linear regressor.

To run the demo code:

  • Make sure you satisfy the requirements in requirements.txt.
  • Download the models from the project page.
  • Update the model paths in the notebook accordingly.
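
As a rough sketch of the flow the notebook implements — the notebook itself is the authoritative reference, and every path, import, class name, checkpoint key, and size below is an assumption rather than the repository's actual API:

import torch
from PIL import Image
from torchvision import transforms
from models import FabNetEncoder  # hypothetical import; the real encoder class lives in the repo

# Hypothetical checkpoint path: substitute one of the models downloaded from
# the project page. A single file is assumed to hold both the FAb-Net encoder
# weights and the trained linear regressor head.
ckpt = torch.load("models/fabnet_with_linear.pth", map_location="cpu")

encoder = FabNetEncoder()
encoder.load_state_dict(ckpt["encoder"])   # "encoder" key is an assumption
encoder.eval()

regressor = torch.nn.Linear(256, 1)        # 256-d embedding is an assumption
regressor.load_state_dict(ckpt["linear"])  # "linear" key is an assumption

# Preprocess a face crop the same way as the notebook (input size is an assumption).
to_tensor = transforms.Compose([transforms.Resize((256, 256)),
                                transforms.ToTensor()])
img = to_tensor(Image.open("face.jpg").convert("RGB")).unsqueeze(0)

with torch.no_grad():
    embedding = encoder(img)
    prediction = regressor(embedding)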

Training yourself

The training code is given in FAb-Net/code/train_attention_curriculum.py.

To use this training code, you first need to download a dataset (e.g. VoxCeleb1/2). The data should then be arranged in folders as described below, and the environment variables in Datasets/config.sh updated accordingly (VOX_CELEB_1 points to VoxCeleb1, VOX_CELEB_LOCATION to VoxCeleb2).
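
Once config.sh has been sourced, those variables are visible to the Python code; a minimal sketch of how they might be read (the repository's actual handling may differ, and the paths are yours to set):

import os

# Dataset roots exported by Datasets/config.sh.
voxceleb1_root = os.environ["VOX_CELEB_1"]
voxceleb2_root = os.environ["VOX_CELEB_LOCATION"]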

For our datasets we organised the directories as:

IDENTITY
-- VIDEO
-- -- TRACK
-- -- -- frame0001.jpg
-- -- -- frame0002.jpg
-- -- -- ...
-- -- -- frameXXXX.jpg

If you arrange the folders/files as illustrated above, you can generate the numpy split files with Datasets/generate_large_voxceleb.py and use our dataloader. Otherwise, you may have to write your own.
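
For reference, a rough sketch of what generating such split files could look like; the authoritative script is Datasets/generate_large_voxceleb.py, and the output file names, split ratio, and format below are assumptions:

import os
import random
import numpy as np

# Root of the dataset, as exported by Datasets/config.sh.
root = os.environ["VOX_CELEB_LOCATION"]

# Enumerate TRACK directories in the IDENTITY/VIDEO/TRACK layout shown above.
tracks = []
for identity in sorted(os.listdir(root)):
    for video in sorted(os.listdir(os.path.join(root, identity))):
        for track in sorted(os.listdir(os.path.join(root, identity, video))):
            tracks.append(os.path.join(identity, video, track))

# Simple shuffled train/validation split saved as numpy files
# ("train.npy" / "val.npy" are hypothetical names).
random.Random(0).shuffle(tracks)
cut = int(0.95 * len(tracks))
np.save("train.npy", np.array(tracks[:cut]))
np.save("val.npy", np.array(tracks[cut:]))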

Then set BASE_LOCATION in config.sh to control where models and runs are stored. Once this has all been done, you can train with python train_attention_curriculum.py and point TensorBoard at BASE_LOCATION/code_faces/runs/ (e.g. tensorboard --logdir $BASE_LOCATION/code_faces/runs/) to monitor how training is getting on.
