
qubvel / Segmentation_models

License: MIT
Segmentation models with pretrained backbones. Keras and TensorFlow Keras.

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to Segmentation models

Segmentation models.pytorch
Segmentation models with pretrained backbones. PyTorch.
Stars: ✭ 4,584 (+28.22%)
Mutual labels:  segmentation, image-segmentation, unet, pspnet, fpn, linknet, segmentation-models
image-segmentation
Mask R-CNN, FPN, LinkNet, PSPNet, and UNet with support for multiple backbone architectures, readily available
Stars: ✭ 62 (-98.27%)
Mutual labels:  unet, pre-trained, pspnet, pretrained, fpn, linknet
TensorFlow-Advanced-Segmentation-Models
A Python Library for High-Level Semantic Segmentation Models based on TensorFlow and Keras with pretrained backbones.
Stars: ✭ 64 (-98.21%)
Mutual labels:  segmentation, image-segmentation, unet, pspnet, fpn, segmentation-models
Paddlex
PaddlePaddle End-to-End Development Toolkit (PaddlePaddle's full-workflow deep learning development tool)
Stars: ✭ 3,399 (-4.92%)
Mutual labels:  segmentation, resnet, mobilenet, unet
Pytorch2keras
PyTorch to Keras model converter
Stars: ✭ 676 (-81.09%)
Mutual labels:  resnet, keras-tensorflow, densenet, keras-models
pytorch2keras
PyTorch to Keras model converter
Stars: ✭ 788 (-77.96%)
Mutual labels:  densenet, resnet, keras-models, keras-tensorflow
Keras Idiomatic Programmer
Books, Presentations, Workshops, Notebook Labs, and Model Zoo for Software Engineers and Data Scientists wanting to learn the TF.Keras Machine Learning framework
Stars: ✭ 720 (-79.86%)
Mutual labels:  resnet, mobilenet, densenet, resnext
Medicalzoopytorch
A pytorch-based deep learning framework for multi-modal 2D/3D medical image segmentation
Stars: ✭ 546 (-84.73%)
Mutual labels:  segmentation, resnet, unet, densenet
Classification models
Classification models trained on ImageNet. Keras.
Stars: ✭ 938 (-73.76%)
Mutual labels:  resnet, mobilenet, densenet, resnext
Keras Unet
Helper package with multiple U-Net implementations in Keras, as well as utility tools useful when working on image semantic segmentation tasks. The library and underlying tools come from multiple semantic segmentation projects the author worked on.
Stars: ✭ 196 (-94.52%)
Mutual labels:  segmentation, image-segmentation, keras-tensorflow, unet
Brain-MRI-Segmentation
Smart India Hackathon 2019 project given by the Department of Atomic Energy
Stars: ✭ 29 (-99.19%)
Mutual labels:  segmentation, image-segmentation, unet, keras-tensorflow
Pytorch Cifar100
Practice on CIFAR-100 (ResNet, DenseNet, VGG, GoogLeNet, InceptionV3, InceptionV4, Inception-ResNetV2, Xception, ResNet in ResNet, ResNeXt, ShuffleNet, ShuffleNetV2, MobileNet, MobileNetV2, SqueezeNet, NASNet, Residual Attention Network, SENet, WideResNet)
Stars: ✭ 2,423 (-32.22%)
Mutual labels:  resnet, mobilenet, densenet, resnext
TensorMONK
A collection of deep learning models (PyTorch implementation)
Stars: ✭ 21 (-99.41%)
Mutual labels:  densenet, resnet, unet, efficientnet
Segmentationcpp
A trainable C++ semantic segmentation library based on LibTorch (PyTorch C++). Backbones: ResNet, ResNeXt. Architectures so far: FPN, U-Net, PAN, LinkNet, PSPNet, DeepLab-V3, DeepLab-V3+.
Stars: ✭ 49 (-98.63%)
Mutual labels:  resnet, image-segmentation, unet, resnext
Tianchi Medical Lungtumordetect
Tianchi Medical AI Competition [Season 1]: intelligent diagnosis of pulmonary nodules. UNet/VGG/Inception/ResNet/DenseNet
Stars: ✭ 314 (-91.22%)
Mutual labels:  segmentation, resnet, unet, densenet
Caffe Model
Caffe models (including classification, detection and segmentation) and deploy files for famous networks
Stars: ✭ 1,258 (-64.81%)
Mutual labels:  segmentation, resnet, resnext
Multiclass Semantic Segmentation Camvid
TensorFlow 2 implementation of a complete pipeline for multiclass image semantic segmentation using UNet, SegNet and FCN32 architectures on the Cambridge-driving Labeled Video Database (CamVid) dataset.
Stars: ✭ 67 (-98.13%)
Mutual labels:  segmentation, image-segmentation, unet
Hyperdensenet
This repository contains the code of HyperDenseNet, a hyper-densely connected CNN to segment medical images in multi-modal image scenarios.
Stars: ✭ 124 (-96.53%)
Mutual labels:  segmentation, image-segmentation, densenet
Jacinto Ai Devkit
Training & quantization of embedded-friendly deep learning / machine learning / computer vision models
Stars: ✭ 49 (-98.63%)
Mutual labels:  segmentation, resnet, mobilenet
cifar-tensorflow
No description or website provided.
Stars: ✭ 18 (-99.5%)
Mutual labels:  densenet, resnet, mobilenet

Python library with Neural Networks for Image Segmentation based on Keras and TensorFlow.

The main features of this library are:

  • High-level API (just two lines of code to create a segmentation model, as shown right after this list)
  • 4 model architectures for binary and multi-class image segmentation (including the legendary Unet)
  • 25 available backbones for each architecture
  • All backbones have pre-trained weights for faster and better convergence
  • Helpful segmentation losses (Jaccard, Dice, Focal) and metrics (IoU, F-score)
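
For instance, the "two lines of code" mentioned above look like this in practice (using the same resnet34 backbone as the quick start below):

import segmentation_models as sm
model = sm.Unet('resnet34', encoder_weights='imagenet')  # Unet with a pretrained ResNet-34 encoder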

Important note

Some models in version 1.* are not compatible with previously trained models. If you have such models and want to load them, roll back with:

$ pip install -U segmentation-models==0.2.1

Quick start

The library is built to work with both the Keras and TensorFlow Keras frameworks.

import segmentation_models as sm
# Segmentation Models: using `keras` framework.

By default it tries to import keras; if that is not installed, it falls back to the tensorflow.keras framework. There are several ways to choose the framework:

  • Set the environment variable SM_FRAMEWORK=keras / SM_FRAMEWORK=tf.keras before importing segmentation_models
  • Change the framework at runtime with sm.set_framework('keras') / sm.set_framework('tf.keras') (see the sketch after this list)
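
For example, a minimal sketch combining both options; note that the environment variable has to be set before segmentation_models is imported for the first time:

import os
os.environ['SM_FRAMEWORK'] = 'tf.keras'  # option 1: choose the framework before the first import

import segmentation_models as sm

sm.set_framework('tf.keras')  # option 2: switch the framework at runtime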

You can also specify which image_data_format to use; segmentation-models works with both channels_last and channels_first. This can be useful for further model conversion to the NVIDIA TensorRT format or for optimizing the model for CPU/GPU computation.

import keras
# or from tensorflow import keras

keras.backend.set_image_data_format('channels_last')
# or keras.backend.set_image_data_format('channels_first')

The created segmentation model is just an instance of a Keras Model, which can be built as easily as:

model = sm.Unet()

Depending on the task, you can change the network architecture by choosing a backbone with fewer or more parameters, and use pretrained weights to initialize it:

model = sm.Unet('resnet34', encoder_weights='imagenet')

Change number of output classes in the model (choose your case):

# binary segmentation (these parameters are the defaults when you call Unet('resnet34'))
model = sm.Unet('resnet34', classes=1, activation='sigmoid')
# multiclass segmentation with non-overlapping class masks (your classes + background)
model = sm.Unet('resnet34', classes=3, activation='softmax')
# multiclass segmentation with independent overlapping/non-overlapping class masks
model = sm.Unet('resnet34', classes=3, activation='sigmoid')

Change input shape of the model:

# if the number of input channels is not equal to 3, you have to set encoder_weights=None
# how to handle such a case while keeping encoder_weights='imagenet' is described in the docs (see the sketch below)
model = sm.Unet('resnet34', input_shape=(None, None, 6), encoder_weights=None)
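
A common way to handle N-channel input while still using encoder_weights='imagenet' (a sketch of the approach outlined in the docs; the 1x1 projection and N = 6 here are illustrative, not prescribed) is to map the N input channels down to 3 and feed the result into a standard pretrained model:

import segmentation_models as sm
import keras  # or: from tensorflow import keras, matching the framework sm is using

N = 6  # number of channels in your data

base_model = sm.Unet('resnet34', encoder_weights='imagenet')  # pretrained model expects 3-channel input

inp = keras.layers.Input(shape=(None, None, N))
x = keras.layers.Conv2D(3, (1, 1))(inp)  # learnable projection from N channels down to 3
out = base_model(x)

model = keras.models.Model(inp, out)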

Simple training pipeline

import segmentation_models as sm

BACKBONE = 'resnet34'
preprocess_input = sm.get_preprocessing(BACKBONE)

# load your data
x_train, y_train, x_val, y_val = load_data(...)

# preprocess input
x_train = preprocess_input(x_train)
x_val = preprocess_input(x_val)

# define model
model = sm.Unet(BACKBONE, encoder_weights='imagenet')
model.compile(
    'Adam',
    loss=sm.losses.bce_jaccard_loss,
    metrics=[sm.metrics.iou_score],
)

# fit model
# if you use a data generator, use model.fit_generator(...) instead of model.fit(...)
# more about `fit_generator` here: https://keras.io/models/sequential/#fit_generator
model.fit(
   x=x_train,
   y=y_train,
   batch_size=16,
   epochs=100,
   validation_data=(x_val, y_val),
)
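
After training, inference is the standard Keras predict call. A minimal sketch (the 0.5 threshold is an illustrative choice for sigmoid outputs, not something the library prescribes):

# predict on already-preprocessed images
pred_masks = model.predict(x_val, batch_size=16)   # shape: (N, H, W, classes)
hard_masks = (pred_masks > 0.5).astype('uint8')    # threshold sigmoid outputs to get binary masks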

The same manipulations can be done with Linknet, PSPNet, and FPN (see the sketch below). For more detailed information about the models API and use cases, see Read the Docs.
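
A sketch of the same constructor pattern for the other architectures (the arguments mirror the Unet examples above; architecture-specific options are covered in the docs):

model = sm.Linknet(BACKBONE, encoder_weights='imagenet')
model = sm.PSPNet(BACKBONE, encoder_weights='imagenet')
model = sm.FPN(BACKBONE, encoder_weights='imagenet')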

Examples

Models training examples:
  • [Jupyter Notebook] Binary segmentation (cars) on the CamVid dataset here.
  • [Jupyter Notebook] Multi-class segmentation (cars, pedestrians) on the CamVid dataset here.

Models and Backbones

Models

Unet, Linknet, PSPNet, FPN (architecture diagrams)

Backbones

Type Names
VGG 'vgg16' 'vgg19'
ResNet 'resnet18' 'resnet34' 'resnet50' 'resnet101' 'resnet152'
SE-ResNet 'seresnet18' 'seresnet34' 'seresnet50' 'seresnet101' 'seresnet152'
ResNeXt 'resnext50' 'resnext101'
SE-ResNeXt 'seresnext50' 'seresnext101'
SENet154 'senet154'
DenseNet 'densenet121' 'densenet169' 'densenet201'
Inception 'inceptionv3' 'inceptionresnetv2'
MobileNet 'mobilenet' 'mobilenetv2'
EfficientNet 'efficientnetb0' 'efficientnetb1' 'efficientnetb2' 'efficientnetb3' 'efficientnetb4' 'efficientnetb5' 'efficientnetb6' 'efficientnetb7'
All backbones have weights trained on the 2012 ILSVRC ImageNet dataset (encoder_weights='imagenet').

Installation

Requirements

  1. python 3
  2. keras >= 2.2.0 or tensorflow >= 1.13
  3. keras-applications >= 1.0.7, <=1.0.8
  4. image-classifiers == 1.0.*
  5. efficientnet == 1.0.*

PyPI stable package

$ pip install -U segmentation-models

PyPI latest package

$ pip install -U --pre segmentation-models

Source latest version

$ pip install git+https://github.com/qubvel/segmentation_models

Documentation

The latest documentation is available on Read the Docs.

Change Log

To see important changes between versions, look at CHANGELOG.md.

Citing

@misc{Yakubovskiy:2019,
  Author = {Pavel Yakubovskiy},
  Title = {Segmentation Models},
  Year = {2019},
  Publisher = {GitHub},
  Journal = {GitHub repository},
  Howpublished = {\url{https://github.com/qubvel/segmentation_models}}
}

License

The project is distributed under the MIT License.
