
meng-tang / Rloss

License: MIT
Regularized Losses (rloss) for Weakly-supervised CNN Segmentation

Projects that are alternatives of or similar to Rloss

Pymc3 models
Stars: ✭ 144 (-0.69%)
Mutual labels:  jupyter-notebook
Ipython Notebooks
Informal IPython experiments and tutorials. TensorFlow, machine learning/deep learning/RL, NLP applications.
Stars: ✭ 144 (-0.69%)
Mutual labels:  jupyter-notebook
Mlmodels
mlmodels : Machine Learning and Deep Learning Model ZOO for Pytorch, Tensorflow, Keras, Gluon models...
Stars: ✭ 145 (+0%)
Mutual labels:  jupyter-notebook
Graphwave
Stars: ✭ 144 (-0.69%)
Mutual labels:  jupyter-notebook
Nbashots
NBA shot charts using matplotlib, seaborn, and bokeh.
Stars: ✭ 144 (-0.69%)
Mutual labels:  jupyter-notebook
Introduccion a python Curso online
A repository containing various materials, code, videos, and exercises for learning the Python language.
Stars: ✭ 145 (+0%)
Mutual labels:  jupyter-notebook
Unet
U-Net Biomedical Image Segmentation
Stars: ✭ 144 (-0.69%)
Mutual labels:  jupyter-notebook
Machinelearning Az
Repository for the Machine Learning A-to-Z course with R and Python.
Stars: ✭ 144 (-0.69%)
Mutual labels:  jupyter-notebook
Tf2 course
Notebooks for my "Deep Learning with TensorFlow 2 and Keras" course
Stars: ✭ 1,826 (+1159.31%)
Mutual labels:  jupyter-notebook
Cheat Sheets
Developer Cheatsheets
Stars: ✭ 145 (+0%)
Mutual labels:  jupyter-notebook
Cs231n
homework for CS231n 2017
Stars: ✭ 144 (-0.69%)
Mutual labels:  jupyter-notebook
Pycroscopy
Scientific analysis of nanoscale materials imaging data
Stars: ✭ 144 (-0.69%)
Mutual labels:  jupyter-notebook
Google2csv
Google2Csv a simple google scraper that saves the results on a csv/xlsx/jsonl file
Stars: ✭ 145 (+0%)
Mutual labels:  jupyter-notebook
Python camp
Python code for practice
Stars: ✭ 144 (-0.69%)
Mutual labels:  jupyter-notebook
Jupyter
Stars: ✭ 145 (+0%)
Mutual labels:  jupyter-notebook
Multihead Siamese Nets
Implementation of Siamese Neural Networks built upon multihead attention mechanism for text semantic similarity task.
Stars: ✭ 144 (-0.69%)
Mutual labels:  jupyter-notebook
Pyeng
Python for engineers
Stars: ✭ 144 (-0.69%)
Mutual labels:  jupyter-notebook
Data Driven Prediction Of Battery Cycle Life Before Capacity Degradation
Code for Nature energy manuscript
Stars: ✭ 145 (+0%)
Mutual labels:  jupyter-notebook
Scipy con 2019
Tutorial Sessions for SciPy Con 2019
Stars: ✭ 142 (-2.07%)
Mutual labels:  jupyter-notebook
Course Python Data Science
Stars: ✭ 145 (+0%)
Mutual labels:  jupyter-notebook

Regularized Losses (rloss) for Weakly-supervised CNN Segmentation

(Caffe and Pytorch)

To train a CNN for semantic segmentation using weak supervision (e.g. scribbles), we propose a regularized loss framework. The loss has two parts: a partial cross-entropy (pCE) loss over the scribbles and a regularization loss, e.g. a DenseCRF loss.
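To illustrate the first term, here is a minimal NumPy sketch of a partial cross-entropy that averages the loss only over labeled (scribble) pixels and skips everything else. The function name, the ignore_index convention, and the (H, W, C) array layout are assumptions made for this sketch, not the repository's API.

```python
import numpy as np

def partial_cross_entropy(probs, labels, ignore_index=255):
    """Cross-entropy averaged over labeled (scribble) pixels only.

    probs:  (H, W, C) softmax outputs of the network
    labels: (H, W) integer labels; unlabeled pixels carry ignore_index
    """
    mask = labels != ignore_index          # which pixels are scribbled
    if not mask.any():
        return 0.0                         # nothing labeled, no loss
    p = probs[mask]                        # (N, C) probabilities at labeled pixels
    picked = p[np.arange(p.shape[0]), labels[mask]]
    return float(-np.log(np.clip(picked, 1e-12, None)).mean())
```

Unlabeled pixels contribute nothing, which is exactly what lets scribble supervision (~3% of pixels) drive training while the regularization term handles the rest.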

If you use the code here, please cite the following paper.

"On Regularized Losses for Weakly-supervised CNN Segmentation" PDF
Meng Tang, Federico Perazzi, Abdelaziz Djelouah, Ismail Ben Ayed, Christopher Schroers, Yuri Boykov
In European Conference on Computer Vision (ECCV), Munich, Germany, September 2018.

DenseCRF loss

To include the DenseCRF loss in a network, add the following loss layer. It takes two bottom blobs: the first is the RGB image and the second is the soft segmentation distribution (e.g. the softmax output). We need to specify the bandwidths of the Gaussian kernel for the XY (bi_xy_std) and RGB (bi_rgb_std) dimensions.

layer {
  bottom: "image"          # first bottom: the RGB image
  bottom: "segmentation"   # second bottom: soft segmentation distributions
  propagate_down: false    # no gradient w.r.t. the image
  propagate_down: true     # gradient flows into the segmentation
  top: "densecrf_loss"
  name: "densecrf_loss"
  type: "DenseCRFLoss"
  loss_weight: ${DENSECRF_LOSS_WEIGHT}
  densecrf_loss_param {
    bi_xy_std: 100         # Gaussian kernel bandwidth in XY (pixels)
    bi_rgb_std: 15         # Gaussian kernel bandwidth in RGB
  }
}

The implementation of this loss layer is included in this repository. It runs on the CPU and supports multi-core parallelization; to enable it, build with -fopenmp (see deeplab/Makefile). Some examples visualizing the gradients of the DenseCRF loss are in exper/visualization. To generate such visualizations yourself, run the script exper/visualize_densecrf_gradient.py.
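For reference, the quantity this layer evaluates can be sketched by brute force in NumPy: a dense Gaussian affinity over XY position and RGB color, multiplied by the soft label disagreement between every pixel pair. The released layer uses efficient filtering rather than this O(N²) loop, so this sketch (function name included) is only an illustration of the definition, usable on tiny images.

```python
import numpy as np

def densecrf_loss(image, seg, bi_xy_std=100.0, bi_rgb_std=15.0):
    """Naive O(N^2) DenseCRF relaxation: sum_{i!=j} k(i, j) * S_i^T (1 - S_j).

    image: (H, W, 3) RGB image; seg: (H, W, C) soft segmentation.
    """
    H, W, _ = image.shape
    C = seg.shape[-1]
    # Flatten pixel coordinates, colors, and soft labels to (N, ·) arrays.
    yx = np.stack(np.meshgrid(np.arange(H), np.arange(W), indexing="ij"), axis=-1)
    xy = yx.reshape(-1, 2).astype(float)
    rgb = image.reshape(-1, 3).astype(float)
    S = seg.reshape(-1, C).astype(float)
    # Pairwise squared distances in position and color.
    d_xy = ((xy[:, None, :] - xy[None, :, :]) ** 2).sum(-1)
    d_rgb = ((rgb[:, None, :] - rgb[None, :, :]) ** 2).sum(-1)
    # Bilateral Gaussian kernel, excluding self-pairs.
    K = np.exp(-d_xy / (2 * bi_xy_std ** 2) - d_rgb / (2 * bi_rgb_std ** 2))
    np.fill_diagonal(K, 0.0)
    # Penalize similar-looking pixel pairs that receive different soft labels.
    return float((K * (S @ (1.0 - S).T)).sum())
```

Pixels that are close in space and color but assigned to different classes dominate the loss; a segmentation that is constant over similar regions costs nothing.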

How to train

An example training script is given in exper/run_pascal_scribble.sh. Training proceeds in two phases. First, we train with the partial cross-entropy loss alone, which gives a mIOU of ~55.8% on the VOC12 val set.

Then we fine-tune the network with an extra regularization loss, e.g. the DenseCRF loss. This boosts the mIOU to ~62.3% on the val set.

Our losses can be used with any network. For example, training the stronger deeplab_msc_largeFOV network gives ~63.2% mIOU on the val set. Note that this is almost as good as full supervision (64.1%).

                     | weak supervision (~3% pixels labeled)      |
network              | (partial) Cross Entropy | w/ DenseCRF Loss | full supervision
Deeplab_largeFOV     | 55.8%                   | 62.3%            | 63.0%
Deeplab_Msc_largeFOV | n/a                     | 63.2%            | 64.1%
Deeplab_VGG16        | 60.7%                   | 64.7%            | 68.8%
Deeplab_ResNet101    | 69.5%                   | 73.0%            | 75.6%

Table 1: mIOU on PASCAL VOC2012 val set

Trained models

The trained models for various networks with unregularized or regularized losses are released here.

Other Regularized Losses

In principle, our framework allows any differentiable regularization loss for segmentation, e.g. the normalized cut clustering criterion or size constraints.

"Normalized Cut Loss for Weakly-supervised CNN Segmentation" PDF
Meng Tang, Abdelaziz Djelouah, Federico Perazzi, Yuri Boykov, Christopher Schroers
In IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Salt Lake City, USA, June 2018.

"Size-constraint loss for weakly supervised CNN segmentation" PDF Code
Hoel Kervadec, Jose Dolz, Meng Tang, Eric Granger, Yuri Boykov, Ismail Ben Ayed
In International Conference on Medical Imaging with Deep Learning (MIDL), Amsterdam, Netherlands, July 2018.
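As one example of such an alternative regularizer, the (soft) normalized cut criterion mentioned above can be sketched for a generic affinity matrix as follows. This is a brute-force NumPy illustration of the relaxed criterion, assuming a precomputed dense affinity W and soft assignments S; it is not the paper's implementation.

```python
import numpy as np

def normalized_cut_loss(W, S):
    """Soft normalized cut: sum over clusters of cut(k) / assoc(k).

    W: (N, N) symmetric pixel affinity matrix
    S: (N, C) soft cluster assignments (rows sum to 1)
    """
    d = W.sum(axis=1)                      # pixel degrees
    loss = 0.0
    for k in range(S.shape[1]):
        s = S[:, k]
        assoc = s @ d                      # degree mass captured by cluster k
        cut = s @ W @ (1.0 - s)            # soft cut between cluster k and the rest
        if assoc > 0:
            loss += cut / assoc
    return float(loss)
```

A hard assignment that splits the affinity graph along weak edges gives a small loss; cutting through strongly connected regions is penalized, which is the same grouping prior that classic spectral normalized cuts optimizes.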

Pytorch and Tensorflow

The original implementation used for the published articles is in Caffe. We also release a PyTorch implementation; see pytorch. A TensorFlow version is under development. We will also try other state-of-the-art network backbones with regularized losses and include them in this repository.
