
constantinpape / cluster_tools

Licence: MIT License
Distributed segmentation for bio-image-analysis

Programming Languages

python
139335 projects - #7 most used programming language
shell
77523 projects

Projects that are alternatives to, or similar to, cluster_tools

mutex-watershed
The mutex watershed for image segmentation.
Stars: ✭ 54 (+107.69%)
Mutual labels:  segmentation, watershed, mutex-watershed
UNI-EM
A unified environment for DNN-based automated segmentation of neuronal EM images
Stars: ✭ 33 (+26.92%)
Mutual labels:  segmentation, connectomics
Brain-Tumor-Segmentation-using-Topological-Loss
A Tensorflow Implementation of Brain Tumor Segmentation using Topological Loss
Stars: ✭ 28 (+7.69%)
Mutual labels:  segmentation, 3d-segmentation
shellnet
ShellNet: Efficient Point Cloud Convolutional Neural Networks using Concentric Shells Statistics
Stars: ✭ 80 (+207.69%)
Mutual labels:  segmentation
instant-segment
Fast English word segmentation in Rust
Stars: ✭ 49 (+88.46%)
Mutual labels:  segmentation
pyconvsegnet
Semantic Segmentation PyTorch code for our paper: Pyramidal Convolution: Rethinking Convolutional Neural Networks for Visual Recognition (https://arxiv.org/pdf/2006.11538.pdf)
Stars: ✭ 32 (+23.08%)
Mutual labels:  segmentation
face video segment
Face Video Segmentation - Face segmentation ground truth from videos
Stars: ✭ 84 (+223.08%)
Mutual labels:  segmentation
HoughRectangle
Rectangle detection using the Hough transform
Stars: ✭ 76 (+192.31%)
Mutual labels:  segmentation
Visual-Transformer-Paper-Summary
Summary of Transformer applications for computer vision tasks.
Stars: ✭ 51 (+96.15%)
Mutual labels:  segmentation
LineSegm
Line Segmentation of Handwritten Documents using the A* Path Planning Algorithm
Stars: ✭ 19 (-26.92%)
Mutual labels:  segmentation
3DSemanticMapping JINT 2020
Repository for the paper "Extending Maps with Semantic and Contextual Object Information for Robot Navigation: a Learning-Based Framework using Visual and Depth Cues"
Stars: ✭ 38 (+46.15%)
Mutual labels:  3d-segmentation
coursera-ai-for-medicine-specialization
Programming assignments, labs and quizzes from all courses in the Coursera AI for Medicine Specialization offered by deeplearning.ai
Stars: ✭ 80 (+207.69%)
Mutual labels:  segmentation
mSRGAN-A-GAN-for-single-image-super-resolution-on-high-content-screening-microscopy-images.
Generative Adversarial Network for single image super-resolution in high content screening microscopy images
Stars: ✭ 52 (+100%)
Mutual labels:  microscopy-images
mri-deep-learning-tools
Resources for MRI image processing and deep learning in 3D
Stars: ✭ 56 (+115.38%)
Mutual labels:  segmentation
pointnet2-pytorch
A clean PointNet++ segmentation model implementation. Supports batches of samples with different numbers of points.
Stars: ✭ 45 (+73.08%)
Mutual labels:  segmentation
lite.ai.toolkit
🛠 A lite C++ toolkit of awesome AI models with ONNXRuntime, NCNN, MNN and TNN: YOLOX, YOLOP, MODNet, YOLOR, NanoDet, SCRFD and more, on CPU/GPU.
Stars: ✭ 1,354 (+5107.69%)
Mutual labels:  segmentation
Semantic-Aware-Attention-Based-Deep-Object-Co-segmentation
Semantic Aware Attention Based Deep Object Co-segmentation
Stars: ✭ 61 (+134.62%)
Mutual labels:  segmentation
brainreg-segment
Segmentation of 3D shapes in a common anatomical space
Stars: ✭ 13 (-50%)
Mutual labels:  segmentation
retinal-exudates-detection
Exudate detection using a hybrid approach (image morphology & machine learning)
Stars: ✭ 53 (+103.85%)
Mutual labels:  segmentation
x-force
Winning solution of the Digital Manufacturing Algorithm Competition II of JinNan, Tianjin
Stars: ✭ 56 (+115.38%)
Mutual labels:  segmentation


Cluster Tools

Workflows for distributed bio-image analysis and segmentation. Supports Slurm, LSF and local execution; easy to extend to other scheduling systems.

Workflows

Installation

You can install the package via conda:

conda install -c conda-forge cluster_tools

To set up a development environment with all necessary dependencies, you can use the environment.yml file:

conda env create -f environment.yml

and then install the package in development mode via

pip install -e . --no-deps
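
Either way, you can verify that the package was installed correctly with a quick import check:

python -c "import cluster_tools"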

Citation

If you use this software in a publication, please cite

Pape, Constantin, et al. "Solving large multicut problems for connectomics via domain decomposition." Proceedings of the IEEE International Conference on Computer Vision. 2017.

For the lifted multicut workflows, please cite

Pape, Constantin, et al. "Leveraging Domain Knowledge to improve EM image segmentation with Lifted Multicuts." arXiv preprint. 2019.

You can find code for the experiments in publications/lifted_domain_knowledge.

If you are using another algorithm that is not part of these two publications, please also cite the appropriate publication (see the links here).

Getting Started

This repository uses luigi for workflow management. So far, the following cluster schedulers are supported:

  • slurm
  • lsf
  • local (local execution based on ProcessPool)

The scheduler is selected via the keyword target. Inter-process communication is achieved through files, which are stored in a temporary folder, and most workflows use n5 storage. You can use z5 to convert files to n5 from python, as sketched below.
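
A minimal sketch of such a conversion, using z5py (the python bindings of z5); the container path, dataset name, shape and chunk size are placeholders:

import numpy as np
import z5py

# create an n5 container (inferred from the .n5 extension) and
# write an array into a chunked, compressed dataset
data = np.random.rand(64, 64, 64).astype('float32')
f = z5py.File('/path/to/converted.n5')
ds = f.create_dataset('data', shape=data.shape, chunks=(32, 32, 32),
                      dtype='float32', compression='gzip')
ds[:] = data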

In simplified form, running a workflow from this repository looks like this:

import json
import os

import luigi
from cluster_tools import SimpleWorkflow  # this is just a mock class, not actually part of this repository

# folder for temporary scripts and files
tmp_folder = 'tmp_wf'

# directory for the configurations of the workflow's sub-tasks, stored as json
config_dir = 'configs'
os.makedirs(config_dir, exist_ok=True)

# get the default configurations for all sub-tasks
default_configs = SimpleWorkflow.get_config()

# global configuration: set the shebang to a python interpreter with all
# dependencies; other global options like the group name and block shape
# can be set here as well
global_config = default_configs['global']
shebang = '#! /path/to/bin/python'
global_config.update({'shebang': shebang, 'groupname': 'mygroup'})
with open(os.path.join(config_dir, 'global.config'), 'w') as f:
    json.dump(global_config, f)

# run the example workflow with `max_jobs` number of jobs;
# local_scheduler=True uses luigi's local scheduler, so no central
# scheduler needs to be running
max_jobs = 100
task = SimpleWorkflow(tmp_folder=tmp_folder, config_dir=config_dir,
                      target='slurm', max_jobs=max_jobs,
                      input_path='/path/to/input.n5', input_key='data',
                      output_path='/path/to/output.n5', output_key='data')
luigi.build([task], local_scheduler=True)
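
Sub-task configurations are written to config_dir in the same way. A hedged sketch, assuming a sub-task named my_task; the task name is a placeholder, and the exact keys depend on the task's default config (threads_per_job and mem_limit are typical resource keys):

# adjust the resource requests for one sub-task ('my_task' is a placeholder)
task_config = default_configs['my_task']
# assumed keys: threads and memory limit (in GB) per job
task_config.update({'threads_per_job': 4, 'mem_limit': 8})
with open(os.path.join(config_dir, 'my_task.config'), 'w') as f:
    json.dump(task_config, f)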

For a list of the available segmentation workflows, have a look at this. Unfortunately, there is no proper documentation yet. For more details, have a look at the examples, in particular this example. You can download the example data (also used for the tests) here.

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].