MrGiovanni / Modelsgenesis

Licence: other
Official Keras & PyTorch Implementation and Pre-trained Models for Models Genesis - MICCAI 2019

Projects that are alternatives to or similar to Modelsgenesis

Deeppicar
Deep Learning Autonomous Car based on Raspberry Pi, SunFounder PiCar-V Kit, TensorFlow, and Google's EdgeTPU Co-Processor
Stars: ✭ 242 (-41.83%)
Mutual labels:  jupyter-notebook, transfer-learning
awesome-contrastive-self-supervised-learning
A comprehensive list of awesome contrastive self-supervised learning papers.
Stars: ✭ 748 (+79.81%)
Mutual labels:  transfer-learning, representation-learning
Link Prediction
Representation learning for link prediction within social networks
Stars: ✭ 245 (-41.11%)
Mutual labels:  jupyter-notebook, representation-learning
Bert Sklearn
A scikit-learn wrapper for Google's BERT model
Stars: ✭ 182 (-56.25%)
Mutual labels:  jupyter-notebook, transfer-learning
Fast Pytorch
PyTorch Tutorial, PyTorch with Google Colab, PyTorch Implementations: CNN, RNN, DCGAN, Transfer Learning, Chatbot, PyTorch Sample Codes
Stars: ✭ 346 (-16.83%)
Mutual labels:  jupyter-notebook, transfer-learning
Pytorch Byol
PyTorch implementation of Bootstrap Your Own Latent: A New Approach to Self-Supervised Learning
Stars: ✭ 213 (-48.8%)
Mutual labels:  jupyter-notebook, representation-learning
awesome-graph-self-supervised-learning
Awesome Graph Self-Supervised Learning
Stars: ✭ 805 (+93.51%)
Mutual labels:  transfer-learning, representation-learning
Image keras
Building an image classifier using Keras
Stars: ✭ 162 (-61.06%)
Mutual labels:  jupyter-notebook, transfer-learning
Ner Bert
BERT-NER (nert-bert) with Google BERT (https://github.com/google-research).
Stars: ✭ 339 (-18.51%)
Mutual labels:  jupyter-notebook, transfer-learning
Pytorch Nlp Notebooks
Learn how to use PyTorch to solve some common NLP problems with deep learning.
Stars: ✭ 293 (-29.57%)
Mutual labels:  jupyter-notebook, transfer-learning
Simclr
SimCLRv2 - Big Self-Supervised Models are Strong Semi-Supervised Learners
Stars: ✭ 2,720 (+553.85%)
Mutual labels:  jupyter-notebook, representation-learning
Trainyourownyolo
Train a state-of-the-art YOLOv3 object detector from scratch!
Stars: ✭ 399 (-4.09%)
Mutual labels:  jupyter-notebook, transfer-learning
Pytorch Retraining
Transfer Learning Shootout for PyTorch's model zoo (torchvision)
Stars: ✭ 167 (-59.86%)
Mutual labels:  jupyter-notebook, transfer-learning
Paddlehelix
Bio-Computing Platform featuring Large-Scale Representation Learning and Multi-Task Deep Learning (the “PaddleHelix” bio-computing toolkit)
Stars: ✭ 213 (-48.8%)
Mutual labels:  jupyter-notebook, representation-learning
Cvpr18 Inaturalist Transfer
Large Scale Fine-Grained Categorization and Domain-Specific Transfer Learning. CVPR 2018
Stars: ✭ 164 (-60.58%)
Mutual labels:  jupyter-notebook, transfer-learning
Revisiting-Contrastive-SSL
Revisiting Contrastive Methods for Unsupervised Learning of Visual Representations. [NeurIPS 2021]
Stars: ✭ 81 (-80.53%)
Mutual labels:  transfer-learning, representation-learning
Keras transfer cifar10
Object classification with CIFAR-10 using transfer learning
Stars: ✭ 120 (-71.15%)
Mutual labels:  jupyter-notebook, transfer-learning
Image classifier
CNN image classifier implemented in Keras Notebook 🖼️.
Stars: ✭ 139 (-66.59%)
Mutual labels:  jupyter-notebook, transfer-learning
Decagon
Graph convolutional neural network for multirelational link prediction
Stars: ✭ 268 (-35.58%)
Mutual labels:  jupyter-notebook, representation-learning
Amazon Forest Computer Vision
Amazon Forest Computer Vision: Satellite Image tagging code using PyTorch / Keras with lots of PyTorch tricks
Stars: ✭ 346 (-16.83%)
Mutual labels:  jupyter-notebook, transfer-learning


We have built a set of pre-trained models called Generic Autodidactic Models, nicknamed Models Genesis, because they are created ex nihilo (with no manual labeling), self-taught (learned by self-supervision), and generic (serving as source models for generating application-specific target models). We envision that Models Genesis may serve as a primary source of transfer learning for 3D medical imaging applications, particularly those with limited annotated data.
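
In the papers, Models Genesis are pre-trained by learning to restore distorted sub-volumes: each sub-volume is corrupted by a set of transformations (non-linear intensity transformation, local pixel shuffling, out-painting, and in-painting), and an encoder-decoder network is trained to recover the original. Below is a minimal NumPy sketch of three of these distortions, for illustration only; the function names and parameters are ours, not the repository's API.

import numpy as np

def nonlinear_intensity(x, rng):
    # Remap intensities with a random monotonic piecewise-linear curve;
    # x is assumed to be normalized to [0, 1].
    ys = np.sort(rng.uniform(0, 1, size=4))
    xs = np.linspace(0, 1, 4)
    return np.interp(x, xs, ys)

def local_pixel_shuffle(x, rng, n_blocks=100, block=4):
    # Shuffle voxels inside small random cubes, destroying local texture
    # while preserving global anatomy.
    y = x.copy()
    d, h, w = x.shape
    for _ in range(n_blocks):
        i, j, k = (rng.integers(0, s - block) for s in (d, h, w))
        patch = y[i:i+block, j:j+block, k:k+block].ravel()
        rng.shuffle(patch)
        y[i:i+block, j:j+block, k:k+block] = patch.reshape(block, block, block)
    return y

def in_painting(x, rng, n_holes=3, size=12):
    # Replace random inner cubes with noise; the model must fill them in.
    y = x.copy()
    d, h, w = x.shape
    for _ in range(n_holes):
        i, j, k = (rng.integers(0, s - size) for s in (d, h, w))
        y[i:i+size, j:j+size, k:k+size] = rng.uniform(0, 1, (size,) * 3)
    return y

rng = np.random.default_rng(0)
volume = rng.uniform(0, 1, (64, 64, 64)).astype(np.float32)  # stand-in CT sub-volume
distorted = in_painting(local_pixel_shuffle(nonlinear_intensity(volume, rng), rng), rng)
# Training pairs are (distorted, volume); the pre-training loss is a
# voxel-wise MSE between the network's reconstruction and the original.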

Paper

This repository provides the official implementation of training Models Genesis, as well as the usage of the pre-trained Models Genesis, in the following papers:

Models Genesis: Generic Autodidactic Models for 3D Medical Image Analysis
Zongwei Zhou^1, Vatsal Sodha^1, Md Mahfuzur Rahman Siddiquee^1,
Ruibin Feng^1, Nima Tajbakhsh^1, Michael B. Gotway^2, and Jianming Liang^1
^1 Arizona State University, ^2 Mayo Clinic
International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI), 2019
Young Scientist Award
paper | code | slides | poster | talk (YouTube, YouKu) | blog

Models Genesis
Zongwei Zhou^1, Vatsal Sodha^1, Jiaxuan Pang^1, Michael B. Gotway^2, and Jianming Liang^1
^1 Arizona State University, ^2 Mayo Clinic
Medical Image Analysis (MedIA)
MedIA Best Paper Award
paper | code | slides

Available implementations

  • keras/
  • pytorch/
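
As a usage sketch, loading the pre-trained weights and fine-tuning in PyTorch might look like the following. The module name unet3d, the class UNet3D, and the checkpoint file Genesis_Chest_CT.pt are assumptions inferred from the pytorch/ directory; consult its README for the exact entry points.

import torch
import unet3d  # assumed to ship with the pytorch/ implementation

model = unet3d.UNet3D()

# Checkpoints saved under nn.DataParallel prefix parameter names with
# "module."; strip the prefix before loading into a plain model.
checkpoint = torch.load("pretrained_weights/Genesis_Chest_CT.pt", map_location="cpu")
state_dict = {k.replace("module.", ""): v
              for k, v in checkpoint["state_dict"].items()}
model.load_state_dict(state_dict)

# Fine-tune all layers on the target task with a small learning rate.
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)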

★ News: Models Genesis, incorporated with nnU-Net, rank #1 in segmenting liver/tumor and hippocampus.

  • competition/

Major results from our work

  1. Models Genesis outperform 3D models trained from scratch
  2. Models Genesis surpass all 2D approaches, including ImageNet models and the degraded 2D Models Genesis
  3. Models Genesis (2D) offer performance equivalent to supervised pre-trained models

The bar plots presented below are produced by the MATLAB code in figures/plotsuperbar.m and the helper functions in figures/superbar. Credit to superbar by Scott Lowe.

Note that simply training models from scratch in 3D does not necessarily yield better performance than ImageNet-based transfer learning in 2D.

Citation

If you use this code or use our pre-trained weights for your research, please cite our papers:

@InProceedings{zhou2019models,
  author="Zhou, Zongwei and Sodha, Vatsal and Rahman Siddiquee, Md Mahfuzur and Feng, Ruibin and Tajbakhsh, Nima and Gotway, Michael B. and Liang, Jianming",
  title="Models Genesis: Generic Autodidactic Models for 3D Medical Image Analysis",
  booktitle="Medical Image Computing and Computer Assisted Intervention -- MICCAI 2019",
  year="2019",
  publisher="Springer International Publishing",
  address="Cham",
  pages="384--393",
  isbn="978-3-030-32251-9",
  url="https://link.springer.com/chapter/10.1007/978-3-030-32251-9_42"
}

@article{zhou2021models,
  title="Models Genesis",
  author="Zhou, Zongwei and Sodha, Vatsal and Pang, Jiaxuan and Gotway, Michael B and Liang, Jianming",
  journal="Medical Image Analysis",
  volume="67",
  pages="101840",
  year="2021",
  issn="1361-8415",
  doi="10.1016/j.media.2020.101840",
  url="http://www.sciencedirect.com/science/article/pii/S1361841520302048"
}

Acknowledgement

This research has been supported partially by ASU and Mayo Clinic through a Seed Grant and an Innovation Grant, and partially by the National Institutes of Health (NIH) under Award Number R01HL128785. The content is solely the responsibility of the authors and does not necessarily represent the official views of the NIH. This work utilized GPUs provided partially by ASU Research Computing and partially by the Extreme Science and Engineering Discovery Environment (XSEDE), which is funded by the National Science Foundation (NSF) under grant number ACI-1548562. This is a patent-pending technology.