FBA Matting
[Open In Colab] [PWC] [License: MIT] [arXiv]

Official repository for the paper F, B, Alpha Matting. The paper and project are under heavy revision for peer-reviewed publication, so I am not able to release the training code yet.
Marco Forte¹, François Pitié¹

¹ Trinity College Dublin

Requirements

GPU memory >= 11GB is required for inference on the Adobe Composition-1K testing set and, more generally, for resolutions above 1920x1080.

Packages:

  • torch >= 1.4
  • numpy
  • opencv-python

Additional packages for the Jupyter notebook

  • matplotlib
  • gdown (to download model inside notebook)

Models

These models were trained on the Adobe Image Matting Dataset. They are covered by the Adobe Deep Image Matting Dataset License Agreement, so they may only be used and distributed for non-commercial purposes.
More results of this model are available on the alphamatting.com benchmark, the videomatting.com benchmark, and in the supplementary materials PDF.

| Model Name   | File Size | SAD  | MSE | Grad | Conn |
| :----------- | --------: | :--- | --: | ---: | ---: |
| FBA Table. 4 | 139 MB    | 26.4 | 5.4 | 10.6 | 21.5 |
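As a minimal sketch, the checkpoint can be fetched with gdown and loaded inside a notebook along these lines; the Google Drive file id and the checkpoint filename below are placeholders, not the repository's real values:

```python
# Hedged sketch: download and load the pretrained weights in a notebook.
# The Google Drive file id and the checkpoint filename are placeholders.
import gdown
import torch

gdown.download("https://drive.google.com/uc?id=<GOOGLE_DRIVE_FILE_ID>",
               "FBA.pth", quiet=False)

# Load onto the CPU first; the network itself can be moved to the GPU later.
state_dict = torch.load("FBA.pth", map_location="cpu")
```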

Prediction

We provide a script, demo.py, and a Jupyter notebook, both of which produce the foreground, background, and alpha predictions of our model. The test-time augmentation code will be made available soon.
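For orientation, here is a minimal sketch of the kind of single-image inference call demo.py performs. The two-channel trimap encoding, the `encode_trimap`/`predict_fba` helpers, and the model's `(fg, bg, alpha)` output tuple are assumptions for illustration; the exact pre-processing and entry points live in demo.py and the notebook.

```python
# Hedged sketch of single-image inference; the model's input/output
# interface here is an assumption, not the exact one used in demo.py.
import cv2
import numpy as np
import torch


def encode_trimap(trimap_gray):
    # Assumed two-channel encoding: channel 0 marks definite background
    # (pixel value 0), channel 1 marks definite foreground (pixel value 255).
    two_channel = np.zeros((*trimap_gray.shape, 2), dtype=np.float32)
    two_channel[..., 0] = (trimap_gray == 0)
    two_channel[..., 1] = (trimap_gray == 255)
    return two_channel


def predict_fba(model, image_path, trimap_path, device="cuda"):
    # Read the RGB image in [0, 1] and the trimap as a grayscale map.
    image = cv2.cvtColor(cv2.imread(image_path), cv2.COLOR_BGR2RGB)
    image = image.astype(np.float32) / 255.0
    trimap = encode_trimap(cv2.imread(trimap_path, cv2.IMREAD_GRAYSCALE))

    # HWC numpy arrays -> NCHW tensors.
    image_t = torch.from_numpy(image.transpose(2, 0, 1))[None].to(device)
    trimap_t = torch.from_numpy(trimap.transpose(2, 0, 1))[None].to(device)

    with torch.no_grad():
        fg, bg, alpha = model(image_t, trimap_t)  # hypothetical output tuple

    # Recompositing the prediction should closely match the input image
    # and is a quick sanity check.
    composite = alpha * fg + (1.0 - alpha) * bg
    return fg, bg, alpha, composite
```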

In this video I demonstrate how to create a trimap in Pinta/Paint.NET.
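If you would rather generate a trimap programmatically than paint one, a common approach (not specific to this repository) is to erode and dilate a binary segmentation mask and mark the band in between as unknown:

```python
# Generic trimap generation from a binary segmentation mask (values 0/255).
# The kernel size controls the width of the unknown band; tune it per image.
import cv2
import numpy as np


def mask_to_trimap(mask, kernel_size=10):
    kernel = np.ones((kernel_size, kernel_size), np.uint8)
    eroded = cv2.erode(mask, kernel, iterations=1)    # certain foreground core
    dilated = cv2.dilate(mask, kernel, iterations=1)  # foreground + unknown band
    trimap = np.full_like(mask, 128)                  # unknown = 128
    trimap[dilated == 0] = 0                          # certain background
    trimap[eroded == 255] = 255                       # certain foreground
    return trimap
```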

Training

Training code is not released at this time; it may be released upon acceptance of the paper. Here are the key takeaways from our work with regard to training.

  • Use a batch size of 1, and use Group Normalisation and Weight Standardisation in your network (see the sketch after this list).
  • Train with clipping of the alpha instead of a sigmoid.
  • The L1 alpha loss, compositional loss, and Laplacian loss are beneficial. A gradient loss is not needed.
  • For foreground prediction, we extend the foreground to the entire image and define the loss on the entire image, or at least on the unknown region. We found this works better than defining the loss solely where alpha > 0. Code for foreground extension
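A compact sketch of the first three points, following the standard formulations of Weight Standardisation, Group Normalisation, alpha clipping, and the common matting losses rather than the unreleased training code (the Laplacian pyramid below is a simplified average-pool approximation):

```python
# Hedged sketch of the training ingredients listed above; this follows the
# standard formulations of these techniques, not the authors' training code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class WSConv2d(nn.Conv2d):
    # Weight Standardisation: normalise each output filter to zero mean and
    # unit variance before convolving (pairs well with GroupNorm at batch size 1).
    def forward(self, x):
        w = self.weight
        mean = w.mean(dim=(1, 2, 3), keepdim=True)
        std = w.view(w.size(0), -1).std(dim=1).view(-1, 1, 1, 1) + 1e-5
        return F.conv2d(x, (w - mean) / std, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)


def conv_gn_block(in_ch, out_ch, groups=32):
    # groups must divide out_ch.
    return nn.Sequential(
        WSConv2d(in_ch, out_ch, kernel_size=3, padding=1, bias=False),
        nn.GroupNorm(groups, out_ch),
        nn.ReLU(inplace=True),
    )


def laplacian_pyramid(x, levels=5):
    # Simplified pyramid using average pooling instead of Gaussian blurring.
    pyr, cur = [], x
    for _ in range(levels):
        down = F.avg_pool2d(cur, 2)
        up = F.interpolate(down, size=cur.shape[-2:], mode="bilinear",
                           align_corners=False)
        pyr.append(cur - up)
        cur = down
    pyr.append(cur)
    return pyr


def matting_losses(raw_alpha, fg, bg, image, gt_alpha):
    # Clip the raw alpha prediction to [0, 1] instead of using a sigmoid.
    alpha = torch.clamp(raw_alpha, 0.0, 1.0)

    l1_alpha = F.l1_loss(alpha, gt_alpha)
    # Compositional loss: predicted alpha/F/B should re-compose the input image.
    comp = alpha * fg + (1.0 - alpha) * bg
    l1_comp = F.l1_loss(comp, image)
    # Laplacian loss on the alpha matte.
    lap = sum(F.l1_loss(p, g) for p, g in
              zip(laplacian_pyramid(alpha), laplacian_pyramid(gt_alpha)))
    return l1_alpha + l1_comp + lap
```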

Citation

@article{forte2020fbamatting,
  title   = {F, B, Alpha Matting},
  author  = {Marco Forte and François Pitié},
  journal = {CoRR},
  volume  = {abs/2003.07711},
  year    = {2020},
}

Related works of ours

  • 99% accurate interactive object selection with just a few clicks: PDF, Code