
SUYEgit / Surgery Robot Detection Segmentation

Licence: other
Object detection and segmentation for a surgery robot using Mask-RCNN on Python 3, Keras, and TensorFlow.

Projects that are alternatives of or similar to Surgery Robot Detection Segmentation

Automatic Generation Of Text Summaries
Automatic text summarization using two methods: extractive TextRank and abstractive seq2seq
Stars: ✭ 155 (+0%)
Mutual labels:  jupyter-notebook
Jupyter Server Proxy
Jupyter notebook server extension to proxy web services.
Stars: ✭ 153 (-1.29%)
Mutual labels:  jupyter-notebook
Davsod
Shifting More Attention to Video Salient Object Detection, CVPR 2019 (Best paper finalist & Oral)
Stars: ✭ 155 (+0%)
Mutual labels:  jupyter-notebook
Example Seldon
Example for end-to-end machine learning on Kubernetes using Kubeflow and Seldon Core
Stars: ✭ 154 (-0.65%)
Mutual labels:  jupyter-notebook
Matplotlib Label Lines
Label line using matplotlib.
Stars: ✭ 154 (-0.65%)
Mutual labels:  jupyter-notebook
Neural Style Transfer
Keras Implementation of Neural Style Transfer from the paper "A Neural Algorithm of Artistic Style" (http://arxiv.org/abs/1508.06576) in Keras 2.0+
Stars: ✭ 2,000 (+1190.32%)
Mutual labels:  jupyter-notebook
Daguan Classify 2018
Solution (rank 18) to the 2018 DaGuan Cup long text classification intelligent processing challenge
Stars: ✭ 154 (-0.65%)
Mutual labels:  jupyter-notebook
Pytorchmedicalai
This is the hands-on deep learning tutorial series for the 2018/2019 Medical AI course by DeepOncology AI.
Stars: ✭ 155 (+0%)
Mutual labels:  jupyter-notebook
Stock Market Prediction Challenge
Following repo is the solution to Stock Market Prediction using Neural Networks and Sentiment Analysis
Stars: ✭ 154 (-0.65%)
Mutual labels:  jupyter-notebook
Jupyter Vim Binding
Jupyter meets Vim. Vimmer will fall in love.
Stars: ✭ 1,965 (+1167.74%)
Mutual labels:  jupyter-notebook
Deep Viz Keras
Implementations of some popular Saliency Maps in Keras
Stars: ✭ 154 (-0.65%)
Mutual labels:  jupyter-notebook
Cnnvis Pytorch
visualization of CNN in PyTorch
Stars: ✭ 154 (-0.65%)
Mutual labels:  jupyter-notebook
Deepreinforcementlearning
A replica of the AlphaZero methodology for deep reinforcement learning in Python
Stars: ✭ 1,898 (+1124.52%)
Mutual labels:  jupyter-notebook
Data Science Stack Cookiecutter
🐳📊🤓Cookiecutter template to launch an awesome dockerized Data Science toolstack (incl. Jupyter, Superset, Postgres, Minio, AirFlow & API Star)
Stars: ✭ 153 (-1.29%)
Mutual labels:  jupyter-notebook
Your First Kaggle Submission
How to perform an exploratory data analysis on the Kaggle Titanic dataset and make a submission to the leaderboard.
Stars: ✭ 155 (+0%)
Mutual labels:  jupyter-notebook
Nlp Interview Notes
📚 Study notes and materials prepared specifically for natural language processing (NLP) interviews
Stars: ✭ 154 (-0.65%)
Mutual labels:  jupyter-notebook
Binderhub
Run your code in the cloud, with technology so advanced, it feels like magic!
Stars: ✭ 2,050 (+1222.58%)
Mutual labels:  jupyter-notebook
Spiking Neural Network Snn With Pytorch Where Backpropagation Engenders Stdp
What about coding a Spiking Neural Network using an automatic differentiation framework? In SNNs, there is a time axis and the neural network sees data throughout time, and activation functions are instead spikes that are raised past a certain pre-activation threshold. Pre-activation values constantly fade if neurons aren't excited enough.
Stars: ✭ 155 (+0%)
Mutual labels:  jupyter-notebook
Stocks
Programs for stock prediction and evaluation
Stars: ✭ 155 (+0%)
Mutual labels:  jupyter-notebook
Pyportfolioopt
Financial portfolio optimisation in python, including classical efficient frontier, Black-Litterman, Hierarchical Risk Parity
Stars: ✭ 2,502 (+1514.19%)
Mutual labels:  jupyter-notebook

Mask R-CNN for Surgery Robot

This is a project of the NUS Control & Mechatronics Lab for surgical robot target detection and segmentation, under the guidance of Prof. Chui Chee Kong. Information on the research group can be found at http://blog.nus.edu.sg/mpecck/.

The code is based on the Matterport implementation of Mask R-CNN (https://github.com/matterport/Mask_RCNN) on Python 3, Keras, and TensorFlow. The model generates bounding boxes and segmentation masks for each instance of an object in the image. It is built on a Feature Pyramid Network (FPN) with a ResNet101 backbone.

The repository includes:

  • Source code of Mask R-CNN built on FPN and ResNet101.
  • Instruction and training code for the surgery robot dataset.
  • Pre-trained weights on MS COCO and ImageNet.
  • Example of training on your own dataset, with emphasis on how to build and adapt the code to a dataset with multiple classes.
  • Jupyter notebooks to visualize the detection result.

[Mask RCNN on 4K Video]

Training on Your Own Dataset

Pre-trained weights from MS COCO and ImageNet are provided so you can fine-tune on a new dataset. Start by reading this blog post about the balloon color splash sample. It covers the process from annotating images, to training, to using the results in a sample application.

In summary, to train the model you need to modify two classes in surgery.py:

  1. SurgeryConfig This class contains the default configurations. Modify the attributes for your training; the most important one is NUM_CLASSES (a hedged configuration sketch follows the data directory example below).
  2. SurgeryDataset This class inherits from utils.Dataset, which provides the capability to train on a new dataset without modifying the model. This project demonstrates it with a dataset labeled using the VGG Image Annotator (VIA). If you are also trying to label a dataset of your own images, start by reading this blog post about the balloon color splash sample. First of all, for training you need to add your classes in the function load_VIA:
self.add_class("SourceName", ClassID, "ClassName")
# For example:
self.add_class("surgery", 1, "arm")  # adds a class named "arm" with class_id 1 from source "surgery"
......

Then extend the function load_mask to read the different class names from the annotations. For example, if you assigned the name "a" to the class "arm" when labelling, map it to the class_id defined in load_VIA (a fuller sketch follows the data directory example below):

class_ids = np.zeros([len(info["polygons"])])
for i, p in enumerate(class_names):
    if p['name'] == 'a':
        class_ids[i] = 1
        ......
  3. The data directories for this project are as follows. Make sure you include the corresponding annotations (.json) in the correct directory.

[Example of data directory]
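
Below is a minimal, hedged sketch of how load_mask might rasterize the VIA polygons and map label strings to class IDs. It assumes the polygon points and the label string are stored together in info["polygons"] (as in Matterport's balloon sample); the labels "a"/"r" and the class IDs are illustrative and may differ from the actual surgery.py.

import numpy as np
import skimage.draw

def load_mask(self, image_id):
    """Return per-instance masks and class IDs for one image (sketch)."""
    info = self.image_info[image_id]
    polygons = info["polygons"]  # VIA polygon annotations added in load_VIA
    mask = np.zeros([info["height"], info["width"], len(polygons)], dtype=np.uint8)
    class_ids = np.zeros([len(polygons)], dtype=np.int32)

    for i, p in enumerate(polygons):
        # Rasterize the polygon into instance channel i
        rr, cc = skimage.draw.polygon(p["all_points_y"], p["all_points_x"])
        mask[rr, cc, i] = 1
        # Map the VIA label string to the class_id registered in load_VIA
        if p["name"] == "a":      # label "a" -> class "arm" (class_id 1)
            class_ids[i] = 1
        elif p["name"] == "r":    # hypothetical label "r" -> class "ring" (class_id 2)
            class_ids[i] = 2

    return mask.astype(bool), class_ids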

Now you should be able to start training on your own dataset! Training parameters are mainly set in the function train in surgery.py.
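
For orientation, here is a hedged sketch of what a SurgeryConfig subclass and the training call typically look like in Matterport-style samples; the attribute values, epoch count, and layers argument are illustrative assumptions, not the project's actual settings.

from mrcnn.config import Config

class SurgeryConfig(Config):
    """Illustrative configuration for the surgery robot dataset."""
    NAME = "surgery"
    IMAGES_PER_GPU = 1            # lower this if you run out of GPU memory
    NUM_CLASSES = 1 + 2           # background + arm + ring
    STEPS_PER_EPOCH = 100
    DETECTION_MIN_CONFIDENCE = 0.9

# Inside train() in surgery.py, the call is typically along these lines:
# model.train(dataset_train, dataset_val,
#             learning_rate=config.LEARNING_RATE,
#             epochs=30, layers='heads')   # fine-tune the heads first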

# Train a new model starting from pre-trained COCO weights
python surgery.py train --dataset=/home/.../mask_rcnn/data/surgery/ --weights=coco  

# Train a new model starting from pre-trained ImageNet weights
python surgery.py train --dataset=/home/.../mask_rcnn/data/surgery/ --weights=imagenet

# Continue training the last model you trained. This will find
# the last trained weights in the model directory.
python surgery.py train --dataset=/home/.../mask_rcnn/data/surgery/ --weights=last

Prediction, Visualization, Evaluation

The function detect_and_color_splash in surgery.py is provided in this project. To use detect_and_color_splash, you need to set class_names according to your dataset:

class_names = ['BG', 'arm', 'ring']

You can make predictions on a specific image, on images in a specific directory, or even on a video:

# Detect and color splash on an image with the last model you trained.
# This will find the last trained weights in the model directory.
python surgery.py splash --weights=last --image=/home/...../*.jpg

# Detect and color splash on a video with specific pre-trained weights of yours.
python surgery.py splash --weights=/home/.../logs/mask_rcnn_surgery_0030.h5  --video=/home/simon/Videos/Center.wmv
  • prediction.ipynb provides step-by-step prediction and visualization on your own dataset. You can also roughly evaluate the model with overall accuracy and precision metrics.
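
As a rough, hedged sketch of the programmatic route that prediction.ipynb takes (the weight path, image path, and logs directory below are placeholders):

import skimage.io
import mrcnn.model as modellib
from surgery import SurgeryConfig  # config class from surgery.py

class InferenceConfig(SurgeryConfig):
    # Run detection on one image at a time
    GPU_COUNT = 1
    IMAGES_PER_GPU = 1

model = modellib.MaskRCNN(mode="inference", config=InferenceConfig(), model_dir="logs")
model.load_weights("logs/mask_rcnn_surgery_0030.h5", by_name=True)  # illustrative path

image = skimage.io.imread("path/to/image.jpg")
r = model.detect([image], verbose=1)[0]
# Per-instance results: r['rois'], r['masks'], r['class_ids'], r['scores']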

Instance Segmentation Samples on Robot Dataset

The model is trained based on pre-trained weights for MS COCO.

[Instance segmentation samples]

Configurations

Anaconda + Python 3.6.4, TensorFlow 1.7.0, Keras 2.1.5, CUDA 9.0, cuDNN 7, and other common packages listed in requirements.txt.
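
A quick, hedged sanity check that your environment matches the tested versions:

import tensorflow as tf
import keras

print(tf.__version__)     # tested with 1.7.0
print(keras.__version__)  # tested with 2.1.5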

Installation

  1. Install dependencies
    pip install -r requirements.txt
    
  2. Clone this repository
  3. Run setup from the repository root directory
    python setup.py install
    
  4. The code will automatically download pretrained COCO weights when you select training with COCO weights. But in case it somehow doesn't work, download pre-trained COCO weights (mask_rcnn_coco.h5) from the releases page.
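
As a hedged fallback, Matterport's helper can also fetch the COCO weights from Python (the save path below is an assumption):

from mrcnn import utils

COCO_WEIGHTS_PATH = "mask_rcnn_coco.h5"  # illustrative save location
utils.download_trained_weights(COCO_WEIGHTS_PATH)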