bermanmaxim / Lovaszsoftmax

License: MIT
Code for the Lovász-Softmax loss (CVPR 2018)

Projects that are alternatives of or similar to Lovaszsoftmax

Pixel level land classification
Tutorial demonstrating how to create a semantic segmentation (pixel-level classification) model to predict land cover from aerial imagery. This model can be used to identify newly developed or flooded land. Uses ground-truth labels and processed NAIP imagery provided by the Chesapeake Conservancy.
Stars: ✭ 217 (-81.1%)
Mutual labels:  jupyter-notebook, neural-networks, image-segmentation
Sentence Aspect Category Detection
Aspect-Based Sentiment Analysis
Stars: ✭ 24 (-97.91%)
Mutual labels:  jupyter-notebook, neural-networks
Py Style Transfer
🎨 Artistic neural style transfer with tweaks (pytorch).
Stars: ✭ 23 (-98%)
Mutual labels:  jupyter-notebook, neural-networks
Deep learning projects
Stars: ✭ 28 (-97.56%)
Mutual labels:  jupyter-notebook, neural-networks
Daily Neural Network Practice 2
Daily Dose of Neural Network that Everyone Needs
Stars: ✭ 18 (-98.43%)
Mutual labels:  jupyter-notebook, neural-networks
Concise Ipython Notebooks For Deep Learning
IPython notebooks for solving problems such as classification, segmentation, and generation, using the latest deep learning algorithms on publicly available text and image datasets.
Stars: ✭ 23 (-98%)
Mutual labels:  jupyter-notebook, image-segmentation
Udacity Deep Learning Nanodegree
A collection of projects made during my Deep Learning Nanodegree at Udacity.
Stars: ✭ 15 (-98.69%)
Mutual labels:  jupyter-notebook, neural-networks
Sciblog support
Support content for my blog
Stars: ✭ 694 (-39.55%)
Mutual labels:  jupyter-notebook, neural-networks
Bipropagation
Stars: ✭ 41 (-96.43%)
Mutual labels:  jupyter-notebook, neural-networks
Yann
This toolbox is support material for the book on CNN (http://www.convolution.network).
Stars: ✭ 41 (-96.43%)
Mutual labels:  jupyter-notebook, neural-networks
Machine Learning From Scratch
Succinct Machine Learning algorithm implementations from scratch in Python, solving real-world problems (Notebooks and Book). Examples of Logistic Regression, Linear Regression, Decision Trees, K-means clustering, Sentiment Analysis, Recommender Systems, Neural Networks and Reinforcement Learning.
Stars: ✭ 42 (-96.34%)
Mutual labels:  jupyter-notebook, neural-networks
Awesome Ai Ml Dl
Awesome Artificial Intelligence, Machine Learning and Deep Learning as we learn it. Study notes and a curated list of awesome resources on these topics.
Stars: ✭ 831 (-27.61%)
Mutual labels:  jupyter-notebook, neural-networks
Basic reinforcement learning
An introductory series to Reinforcement Learning (RL) with comprehensive step-by-step tutorials.
Stars: ✭ 826 (-28.05%)
Mutual labels:  jupyter-notebook, neural-networks
Multiclass Semantic Segmentation Camvid
TensorFlow 2 implementation of a complete pipeline for multiclass image semantic segmentation using UNet, SegNet and FCN32 architectures on the Cambridge-driving Labeled Video Database (CamVid) dataset.
Stars: ✭ 67 (-94.16%)
Mutual labels:  jupyter-notebook, image-segmentation
Machine Learning
A repository created to help machine learning beginners and those preparing for study groups.
Stars: ✭ 705 (-38.59%)
Mutual labels:  jupyter-notebook, neural-networks
Traffic Sign Classifier
Udacity Self-Driving Car Engineer Nanodegree. Project: Build a Traffic Sign Recognition Classifier
Stars: ✭ 12 (-98.95%)
Mutual labels:  jupyter-notebook, neural-networks
Mish
Official Repository for "Mish: A Self Regularized Non-Monotonic Neural Activation Function" [BMVC 2020]
Stars: ✭ 1,072 (-6.62%)
Mutual labels:  jupyter-notebook, neural-networks
Deep Learning For Hackers
Machine Learning tutorials with TensorFlow 2 and Keras in Python (Jupyter notebooks included) - (LSTMs, Hyperparameter tuning, Data preprocessing, Bias-variance tradeoff, Anomaly Detection, Autoencoders, Time Series Forecasting, Object Detection, Sentiment Analysis, Intent Recognition with BERT)
Stars: ✭ 586 (-48.95%)
Mutual labels:  jupyter-notebook, neural-networks
Tensorflow 101
TensorFlow 101: Introduction to Deep Learning for Python Within TensorFlow
Stars: ✭ 642 (-44.08%)
Mutual labels:  jupyter-notebook, neural-networks
Teacher Student Training
This repository stores the files used for my summer internship's work on "teacher-student learning", an experimental method for training deep neural networks using a trained teacher model.
Stars: ✭ 34 (-97.04%)
Mutual labels:  jupyter-notebook, neural-networks

The Lovász-Softmax loss: A tractable surrogate for the optimization of the intersection-over-union measure in neural networks

Maxim Berman, Amal Rannen Triki, Matthew B. Blaschko

ESAT-PSI, KU Leuven, Belgium.

Published in CVPR 2018. See the project page, the arXiv paper, and the paper on CVF Open Access.

PyTorch implementation of the loss layer (pytorch folder)

Files included:

  • lovasz_losses.py: Standalone PyTorch implementation of the Lovász hinge and Lovász-Softmax for the Jaccard index
  • demo_binary.ipynb: Jupyter notebook showcasing binary training of a linear model, with the Lovász Hinge and with the Lovász-Sigmoid.
  • demo_multiclass.ipynb: Jupyter notebook showcasing multiclass training of a linear model with the Lovász-Softmax

The binary lovasz_hinge expects real-valued scores (positive scores correspond to foreground pixels).
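
A minimal sketch of binary usage (a hedged example, assuming lovasz_losses.py is importable and that the per_image keyword matches the signature in that file):

import torch
from lovasz_losses import lovasz_hinge

B, H, W = 4, 32, 32
logits = torch.randn(B, H, W, requires_grad=True)    # real-valued scores; positive = foreground
labels = torch.randint(0, 2, (B, H, W)).float()      # binary ground truth in {0, 1}

loss = lovasz_hinge(logits, labels, per_image=True)  # average the hinge loss over each image
loss.backward()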

The multiclass lovasz_softmax expects class probabilities (the maximum-scoring category is predicted). Apply a softmax layer to the unnormalized scores first.
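
A minimal multiclass sketch: softmax-normalize the network outputs before calling lovasz_softmax (the [B, C, H, W] layout mirrors the demo notebooks; treat it as an assumption):

import torch
import torch.nn.functional as F
from lovasz_losses import lovasz_softmax

B, C, H, W = 4, 21, 32, 32
logits = torch.randn(B, C, H, W, requires_grad=True)  # unnormalized class scores
labels = torch.randint(0, C, (B, H, W))               # integer class labels per pixel

probas = F.softmax(logits, dim=1)                     # class probabilities, as the loss expects
loss = lovasz_softmax(probas, labels)
loss.backward()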

TensorFlow implementation of the loss layer (tensorflow folder)

Files included:

  • lovasz_losses_tf.py: Standalone TensorFlow implementation of the Lovász hinge and Lovász-Softmax for the Jaccard index
  • demo_binary_tf.ipynb: Jupyter notebook showcasing binary training of a linear model, with the Lovász Hinge and with the Lovász-Sigmoid.
  • demo_multiclass_tf.ipynb: Jupyter notebook showcasing the application of the multiclass loss with the Lovász-Softmax

Warning: the loss values and gradients have been verified to match the PyTorch implementation (see the notebooks); however, we have not used the TensorFlow implementation in a training setting.
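
For completeness, a hedged TF1-style sketch of calling the TensorFlow version (the function name follows lovasz_losses_tf.py; the BHWC channel ordering is an assumption):

import tensorflow as tf
from lovasz_losses_tf import lovasz_softmax

logits = tf.placeholder(tf.float32, [None, 32, 32, 21])  # BHWC unnormalized scores
labels = tf.placeholder(tf.int32, [None, 32, 32])        # integer class labels per pixel

probas = tf.nn.softmax(logits)         # softmax over the last (class) axis
loss = lovasz_softmax(probas, labels)  # scalar loss tensor, evaluated inside a tf.Session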

Usage

See the demos for simple proofs of principle.

FAQ

  • How should I use the Lovász-Softmax loss?

The loss can be optimized on its own, but the optimal hyperparameters (learning rate, momentum) may differ from the best ones for cross-entropy. As discussed in the paper, optimizing the dataset-level mIoU (the Pascal VOC measure) depends on the batch size and the number of classes. You may therefore get the best results by training with cross-entropy first and fine-tuning with our loss, or by combining the two losses.
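
One hedged way to combine the two objectives (the weighting factor alpha is illustrative, not taken from the paper):

import torch.nn.functional as F
from lovasz_losses import lovasz_softmax

def combined_loss(logits, labels, alpha=0.5):
    # Weighted sum of cross-entropy on logits and Lovász-Softmax on probabilities.
    ce = F.cross_entropy(logits, labels)
    lov = lovasz_softmax(F.softmax(logits, dim=1), labels)
    return alpha * ce + (1 - alpha) * lov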

See, for example, how Land Cover Classification From Satellite Imagery With U-Net and Lovasz-Softmax Loss by Alexander Rakhlin et al. used our loss in the CVPR 2018 DeepGlobe challenge.

  • Inference in Tensorflow is very slow...

Compiling TensorFlow from master (or using a later release that includes the corresponding upstream fix commit) should solve this problem; see issue #6.

Citation

Please cite:

@inproceedings{berman2018lovasz,
  title={The Lov{\'a}sz-Softmax loss: A tractable surrogate for the optimization of the intersection-over-union measure in neural networks},
  author={Berman, Maxim and Rannen Triki, Amal and Blaschko, Matthew B},
  booktitle={Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition},
  pages={4413--4421},
  year={2018}
}