Mask R-CNN

This is an implementation of the Mask R-CNN network using OCaml's numerical library Owl. The network can be used to perform object detection, segmentation and classification. The implementation is based on the original Mask R-CNN paper (He et al., 2017) and is ported from an existing Keras implementation.

Prerequisites

  • OCaml >= 4.06.0
  • CamlImages (opam install camlimages). Note that you need to install the system packages libpng12-dev libjpeg-dev libtiff-dev libxpm-dev libfreetype6-dev libgif-dev before CamlImages itself, so that it is built with support for the image formats you are interested in.
  • Owl's master branch (make sure it is up to date).
  • Pre-trained weights, which are needed to run the inference mode of the network. Download the Owl weights and place them at the root of the repository (they are converted from the original Keras weights).
  • You can then make and make run! (A consolidated setup transcript follows this list.)
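Putting these steps together, a minimal setup transcript could look as follows. This is only a sketch assuming a Debian/Ubuntu system with apt-get; it restates the commands and package names listed above.

    # System libraries so that CamlImages supports the usual image formats
    sudo apt-get install libpng12-dev libjpeg-dev libtiff-dev libxpm-dev \
        libfreetype6-dev libgif-dev
    # Image I/O bindings for OCaml
    opam install camlimages
    # Build the project, then run the default example
    make
    make run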

Images

The code evalImage.ml from the examples can be used to classify all the pictures in a given folder. It can be compiled with make and run with make run. For each input picture, a new image with the detected objects highlighted is written to the results/ folder. You can change the location of the source directory or file in examples/evalImage.ml, as well as the size of the image: a larger size yields more accurate detections but needs more time and memory (the default is 768; you can also try 512, 1024, 1536, 2048, ...).
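As an illustration of the two settings mentioned above, a stripped-down driver could look like this in OCaml. The names src_dir, image_size and detect_and_save are placeholders for this sketch, not the actual definitions in examples/evalImage.ml.

    (* Sketch only: a stand-in for the repository's real detection code;
       it merely reports what would be processed. *)
    let detect_and_save ~image_size file =
      Printf.printf "Mask R-CNN at size %d on %s -> results/\n" image_size file

    let () =
      (* Folder containing the input pictures; point this at your own images. *)
      let src_dir = "pics" in
      (* Size the image is scaled to before detection: larger values give
         more accurate detections but need more time and memory. *)
      let image_size = 768 in (* try 512, 1024, 1536 or 2048 *)
      Sys.readdir src_dir
      |> Array.iter (fun f ->
             detect_and_save ~image_size (Filename.concat src_dir f))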

Videos

If you are patient enough, you can convert a video frame by frame by running make video (FFmpeg is required). You can change the location of the source video in examples/evalVideo.ml. Note that this writes every frame of the video to the hard drive.
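Conceptually, the frame-by-frame pipeline amounts to the following FFmpeg round trip. This is a sketch only: the file names, folders and frame rate are placeholders, and the actual commands are driven by the Makefile's video target.

    # Split the source video into numbered frames on disk
    ffmpeg -i input.mp4 frames/frame_%05d.png
    # ... run the detector on each extracted frame, as in the Images section ...
    # Reassemble the annotated frames into a video
    ffmpeg -framerate 25 -i results/frame_%05d.png output.mp4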
