Mehrdad-Noori / Brain-Tumor-Segmentation

Licence: other
Attention-Guided Version of 2D UNet for Automatic Brain Tumor Segmentation

The source code for our paper "Attention-Guided Version of 2D UNet for Automatic Brain Tumor Segmentation"

Our paper can be found at this link.

Overview

Dataset

The BraTS dataset is used for training and evaluating the model. This dataset contains four modalities for each individual brain, namely T1, T1c (post-contrast T1), T2, and FLAIR, which were skull-stripped, resampled, and co-registered. For more information, please refer to the main site.

Pre-processing

For pre-processing, the N4ITK algorithm is first applied to each MRI modality to correct the intensity inhomogeneity of the images. Then the top and bottom 1% of intensities are clipped, and each modality is normalized to zero mean and unit variance.
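A minimal sketch of the clipping and normalization steps (the N4ITK correction itself is done with external tools such as ANTs); the function name and the brain-mask heuristic are illustrative, not the repository's actual code:

import numpy as np

def normalize_modality(volume):
    # Illustrative sketch: clip the top/bottom 1% of intensities and
    # z-score normalize a single 3D MRI modality.
    brain = volume[volume > 0]                 # consider only non-zero (brain) voxels
    low, high = np.percentile(brain, [1, 99])  # drop the extreme 1% at each end
    clipped = np.clip(volume, low, high)
    mean = clipped[volume > 0].mean()
    std = clipped[volume > 0].std()
    return (clipped - mean) / (std + 1e-8)     # zero mean, unit variance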

Architecture


[Figure: network architecture]


The network is based on the U-Net architecture with the following modifications:

  • Minor modifications: adding residual units, strided convolutions, PReLU activations, and batch normalization layers to the original U-Net
  • Attention mechanism: employing a Squeeze-and-Excitation (SE) block on concatenated multi-level features. This technique weights each channel adaptively, preventing the model from being confused by less informative channels (please refer to our paper for more information); a minimal sketch of an SE block follows this list.
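A minimal sketch of a Squeeze-and-Excitation block in tf.keras, assuming TensorFlow 2; the reduction ratio and layer choices here are illustrative and may differ from the repository's implementation:

from tensorflow.keras import layers

def se_block(x, ratio=8):
    # Squeeze-and-Excitation: adaptively re-weight feature channels (sketch).
    channels = x.shape[-1]
    s = layers.GlobalAveragePooling2D()(x)                      # squeeze: global channel descriptor
    s = layers.Dense(channels // ratio, activation='relu')(s)   # excitation: bottleneck FC
    s = layers.Dense(channels, activation='sigmoid')(s)         # per-channel weights in [0, 1]
    s = layers.Reshape((1, 1, channels))(s)
    return layers.Multiply()([x, s])                            # scale the input feature maps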


Training Process

Since the proposed network is a 2D architecture, we need to extract 2D slices from the 3D MRI volumes. To benefit from the 3D contextual information of the input images, we extract 2D slices from both the axial and coronal views and train a separate network for each view. At test time, we build the 3D output volume for each model by concatenating its 2D predicted maps. Finally, we fuse the two views by pixel-wise averaging, as sketched below.
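A self-contained sketch of this fusion step on dummy data; array names, shapes, and axis conventions are illustrative only:

import numpy as np

# Dummy shapes: D x H x W volume with C classes.
D, H, W, C = 155, 240, 240, 4

# Suppose these are per-slice class-probability maps from the two trained models.
axial_pred = [np.random.rand(H, W, C) for _ in range(D)]    # one map per axial slice
coronal_pred = [np.random.rand(D, W, C) for _ in range(H)]  # one map per coronal slice

# Rebuild a 3D probability volume per view by stacking along the slicing axis.
axial_volume = np.stack(axial_pred, axis=0)      # (D, H, W, C)
coronal_volume = np.stack(coronal_pred, axis=1)  # (D, H, W, C)

# Fuse the two views by pixel-wise averaging, then pick the most likely class.
fused = (axial_volume + coronal_volume) / 2.0
label_map = np.argmax(fused, axis=-1)            # (D, H, W) predicted label map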



Results

The results are obtained from the BraTS online evaluation platform using the BraTS 2018 validation set.



[Figure: segmentation results on the BraTS 2018 validation set]


Dependencies

Usage

1- Download the BraTS 2019, 2018 or 2017 data by following the steps described on the BraTS website

2- Perform N4ITK bias correction using ANTs by following the steps in this repo (this step is optional)

3- Set the path to all brain volumes in config.py (e.g. cfg['data_dir'] = './BRATS19/MICCAI_BraTS_2019_Data_Training/*/*/')

4- To read, preprocess and save all brain volumes into a single table file:

python prepare_data.py

5- To run the training:

python train.py

The model can be trained on axial, sagittal or coronal views (set cfg['view'] in config.py). Moreover, K-fold cross-validation can be used (set cfg['k_fold'] in config.py); a hedged example of these entries is shown below.
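For illustration, the relevant config.py entries might look like the following; only cfg['data_dir'], cfg['view'] and cfg['k_fold'] are mentioned in this README, and the accepted values shown are assumptions, so check config.py itself for the exact keys and options:

# config.py (illustrative sketch, not the repository's actual file)
cfg = {}
cfg['data_dir'] = './BRATS19/MICCAI_BraTS_2019_Data_Training/*/*/'  # path to all brain volumes
cfg['view'] = 'axial'   # assumed options: 'axial', 'sagittal', 'coronal'
cfg['k_fold'] = 5       # assumed: number of folds for K-fold cross-validation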

6- To predict and save label maps:

python predict.py

The predictions will be written in .nii.gz format and can be uploaded to the BraTS online evaluation platform.
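To sanity-check a saved prediction locally before uploading, one option is nibabel (not a stated dependency of this project; the file path below is illustrative):

import numpy as np
import nibabel as nib

# Load one predicted label map (path is hypothetical)
pred = nib.load('predictions/BraTS19_example.nii.gz')
labels = pred.get_fdata().astype(np.uint8)
print(labels.shape, np.unique(labels))  # volume shape and the predicted label values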

Citation

@inproceedings{noori2019attention,
  title={Attention-Guided Version of 2D UNet for Automatic Brain Tumor Segmentation},
  author={Noori, Mehrdad and Bahri, Ali and Mohammadi, Karim},
  booktitle={2019 9th International Conference on Computer and Knowledge Engineering (ICCKE)},
  pages={269--275},
  year={2019},
  organization={IEEE}
}