
Cambricon / Cnstream

Licence: apache-2.0
CNStream is a streaming framework for building Cambricon machine learning pipelines http://forum.cambricon.com https://gitee.com/SolutionSDK/CNStream

Projects that are alternatives of or similar to Cnstream

Ts Pattern
🎨 A complete Pattern Matching library for TypeScript, with smart type inference.
Stars: ✭ 854 (+721.15%)
Mutual labels:  inference
Ncnn Benchmark
The benchmark of ncnn that is a high-performance neural network inference framework optimized for the mobile platform
Stars: ✭ 70 (-32.69%)
Mutual labels:  inference
Kglab
Graph-Based Data Science: an abstraction layer in Python for building knowledge graphs, integrated with popular graph libraries – atop Pandas, RDFlib, pySHACL, RAPIDS, NetworkX, iGraph, PyVis, pslpython, pyarrow, etc.
Stars: ✭ 98 (-5.77%)
Mutual labels:  inference
Server
Serve your Rubix ML models in production with scalable stand-alone model inference servers.
Stars: ✭ 30 (-71.15%)
Mutual labels:  inference
People Counter Python
Create a smart video application using the Intel Distribution of OpenVINO toolkit. The toolkit uses models and inference to run single-class object detection.
Stars: ✭ 62 (-40.38%)
Mutual labels:  inference
Lomrf
LoMRF is an open-source implementation of Markov Logic Networks
Stars: ✭ 73 (-29.81%)
Mutual labels:  inference
Variational gradient matching for dynamical systems
Sample code for the NIPS paper "Scalable Variational Inference for Dynamical Systems"
Stars: ✭ 22 (-78.85%)
Mutual labels:  inference
Lightner
Inference with state-of-the-art models (pre-trained by LD-Net / AutoNER / VanillaNER / ...)
Stars: ✭ 102 (-1.92%)
Mutual labels:  inference
Openllet
Openllet is an OWL 2 reasoner in Java, built on top of Pellet.
Stars: ✭ 66 (-36.54%)
Mutual labels:  inference
Owl Rl
A simple implementation of the OWL2 RL Profile on top of RDFLib: it expands the graph with all possible triples that OWL RL defines. It can be used together with RDFLib to expand an RDFLib Graph object, or as a stand alone service with its own serialization.
Stars: ✭ 95 (-8.65%)
Mutual labels:  inference
Bevel
Ordinal regression in Python
Stars: ✭ 41 (-60.58%)
Mutual labels:  inference
Opencv Mtcnn
An implementation of MTCNN Face detector using OpenCV's DNN module
Stars: ✭ 59 (-43.27%)
Mutual labels:  inference
Awesome System For Machine Learning
A curated list of research in machine learning systems. I also summarize some papers if I think they are really interesting.
Stars: ✭ 1,185 (+1039.42%)
Mutual labels:  inference
Bmw Classification Inference Gpu Cpu
This is a repository for an image classification inference API using the Gluoncv framework. The inference REST API works on CPU/GPU. It's supported on Windows and Linux Operating systems. Models trained using our Gluoncv Classification training repository can be deployed in this API. Several models can be loaded and used at the same time.
Stars: ✭ 27 (-74.04%)
Mutual labels:  inference
Mivisionx
MIVisionX toolkit is a set of comprehensive computer vision and machine intelligence libraries, utilities, and applications bundled into a single toolkit. AMD MIVisionX also delivers a highly optimized open-source implementation of the Khronos OpenVX™ and OpenVX™ Extensions.
Stars: ✭ 100 (-3.85%)
Mutual labels:  inference
Neuropod
A uniform interface to run deep learning models from multiple frameworks
Stars: ✭ 858 (+725%)
Mutual labels:  inference
Budgetml
Deploy a ML inference service on a budget in less than 10 lines of code.
Stars: ✭ 1,179 (+1033.65%)
Mutual labels:  inference
Delta
DELTA is a deep learning based natural language and speech processing platform.
Stars: ✭ 1,479 (+1322.12%)
Mutual labels:  inference
Sagemaker Inference Toolkit
Serve machine learning models within a 🐳 Docker container using 🧠 Amazon SageMaker.
Stars: ✭ 101 (-2.88%)
Mutual labels:  inference
Nostril
Nostril: Nonsense String Evaluator
Stars: ✭ 86 (-17.31%)
Mutual labels:  inference

Cambricon CNStream

CNStream is a streaming framework with plug-ins. It is used to connect modules into pipelines and provides the basic functionality, libraries, and essential elements needed to do so.

CNStream provides the following built-in modules:

  • source: Supports RTSP streams, local video files, images, and elementary streams in memory (H.264, H.265, and JPEG decoding).
  • inference: MLU-based inference accelerator for detection and classification.
  • inference2: Inference module built on the infer server; runs inference together with preprocessing and postprocessing.
  • osd (on-screen display): Highlights detected objects and overlays text on frames.
  • encode: Encodes videos or images.
  • display: Displays the video on screen.
  • tracker: Multi-object tracking.
  • rtsp_sink: Pushes the result as an RTSP stream over the network.
  • ipc: Splits a pipeline across processes.

Cambricon Dependencies

CNStream depends on the CNCodec and CNRT libraries, which are packaged in the Cambricon CNToolkit. Therefore, the latest Cambricon CNToolkit package is required. If you do not have one, please feel free to contact us at [email protected].

Install Cambricon CNToolkit package

Ubuntu

  dpkg -i cntoolkit-x.x.x_Ubuntuxx.xx_amd64.deb
  cd /var/cntoolkit-x.x.x
  dpkg -i cncodec-xxx.deb cnrt_xxx.deb

CentOS

  yum -y install cntoolkit-x.x.x.el7.x86_64.rpm
  cd /var/cntoolkit-xxx-x.x.x
  yum -y install cncodec-xxx.rpm cnrt-xxx.rpm

After that, the Cambricon dependencies that CNStream needs are installed under '/usr/local/neuware'.
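
As an optional sanity check (an illustrative command, not part of the official install steps), you can list that directory to confirm the libraries are in place:

  ls /usr/local/neuware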

Please make sure you do not install cnstream_xxx.deb or cnstream-xxx.rpm.

Quick Start

This section introduces how to quickly build CNStream and how to develop your own applications based on it. We strongly recommend executing pre_required_helper.sh to prepare the environment. Otherwise, please follow the commands below.
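
If you use the helper script, invoking it looks roughly like this (the script's exact location inside the repository is an assumption; adjust the path to your checkout):

  bash pre_required_helper.sh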

Required environments

Before building, you need to install the following software:

  • OpenCV 2.4.9+
  • GFlags 2.1.2
  • GLog 0.3.4
  • CMake 2.8.7+
  • SDL2 2.0.4+    // If build_display=ON
  • FFmpeg 2.8 / 3.4 / 4.2

Ubuntu

If you are using Ubuntu, run the following commands:

  OpenCV 2.4.9+  >>>>>>>>>   sudo apt-get install libopencv-dev
  GFlags 2.1.2   >>>>>>>>>   sudo apt-get install libgflags-dev
  GLog 0.3.4     >>>>>>>>>   sudo apt-get install libgoogle-glog-dev
  CMake 2.8.7+   >>>>>>>>>   sudo apt-get install cmake
  SDL2 2.0.4+    >>>>>>>>>   sudo apt-get install libsdl2-dev
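
The FFmpeg development libraries listed above are not covered by these commands. On Ubuntu they can typically be installed with the following packages (typical Ubuntu package names; they may vary by release):

  FFmpeg 2.8+    >>>>>>>>>   sudo apt-get install libavcodec-dev libavformat-dev libavutil-dev libswscale-dev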

CentOS

If you are using CentOS, run the following commands:

  OpenCV 2.4.9+  >>>>>>>>>   sudo yum install opencv-devel.x86_64
  GFlags 2.1.2   >>>>>>>>>   sudo yum install gflags.x86_64
  GLog 0.3.4     >>>>>>>>>   sudo yum install glog.x86_64
  CMake 2.8.7+   >>>>>>>>>   sudo yum install cmake3.x86_64
  SDL2 2.0.4+    >>>>>>>>>   sudo yum install SDL2_gfx-devel.x86_64

Build Instructions Using CMake

After finishing the prerequisites, you can build the project with the following steps:

  1. Clone the easydk submodule with the command below:

    git submodule update --init
    
  2. Run the following command to create a directory for saving the output.

    mkdir build       # Create a directory to save the output.
    

    A Makefile will be generated in this folder when you run CMake in the next step.

  3. Run the following commands to generate the native build scripts.

    cd build
    cmake ${CNSTREAM_DIR}  # Generate native build scripts.
    

    Cambricon CNStream provides a CMake script (CMakeLists.txt) to build the project. You can download CMake for free from http://www.cmake.org/.

    ${CNSTREAM_DIR} specifies the directory where the CNStream source code is located.

    cmake option            range       default   description
    build_display           ON / OFF    ON        build display module
    build_ipc               ON / OFF    ON        build ipc module
    build_encode            ON / OFF    ON        build encode module
    build_inference         ON / OFF    ON        build inference module
    build_osd               ON / OFF    ON        build osd module
    build_rtsp_sink         ON / OFF    ON        build rtsp_sink module
    build_source            ON / OFF    ON        build source module
    build_track             ON / OFF    ON        build track module
    build_modules_contrib   ON / OFF    ON        build contributed modules
    build_tests             ON / OFF    ON        build tests
    build_samples           ON / OFF    ON        build samples
    RELEASE                 ON / OFF    ON        release / debug
    WITH_FFMPEG             ON / OFF    ON        build with FFmpeg
    WITH_OPENCV             ON / OFF    ON        build with OpenCV
    WITH_FREETYPE           ON / OFF    OFF       build with FreeType
    WITH_RTSP               ON / OFF    ON        build with RTSP
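
    As an illustration of how these options are passed (the particular values below are arbitrary examples, not recommendations), any of them can be set on the cmake command line with -D:

    cmake ${CNSTREAM_DIR} -Dbuild_tests=OFF -Dbuild_display=OFF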
  4. If you want to build CNStream samples:

    a. Run the following command:

    cmake -Dbuild_samples=ON ${CNSTREAM_DIR}
    

    b. If you want to cross compile, run the following command:

    cmake -DCMAKE_TOOLCHAIN_FILE=${CNSTREAM_DIR}/cmake/cross-compile.cmake ${CNSTREAM_DIR}
    

    Note: you need to configure the toolchain yourself in cross-compile.cmake, cross-compile gflags, glog, opencv, and ffmpeg, and install them into ${CNSTREAM_DIR}.

    Take MLU220EDGE as an example:

    cmake ${CNSTREAM_DIR} -DCMAKE_TOOLCHAIN_FILE=${CNSTREAM_DIR}/cmake/cross-compile.cmake  -DCNIS_WITH_CURL=OFF -Dbuild_display=OFF -DMLU=MLU220EDGE
    
  5. Run the following command to build the project:

    make
    
  6. If you want to install CNStream's header files and libraries to a custom location, add CMAKE_INSTALL_PREFIX to the cmake command as below:

    cmake ${CNSTREAM_DIR} -DCMAKE_INSTALL_PREFIX=/path/to/install
    make
    make install
    
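Putting the steps above together, a typical fresh build (an illustrative sequence assembled from the commands shown in this section; adjust the options and install prefix to your needs) looks like:

  git submodule update --init
  mkdir build && cd build
  cmake ${CNSTREAM_DIR} -Dbuild_samples=ON -DCMAKE_INSTALL_PREFIX=/path/to/install
  make
  make install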

Samples

[Demo screenshots: Classification, Object Detection, Object Tracking, Secondary Classification]

Demo Overview

This demo shows how to detect objects using CNStream. It includes the following plug-in modules:

  • source: Decodes video streams, such as local video files, RTMP streams, and RTSP streams, on the MLU.
  • detector: Runs neural network inference on the MLU.
  • osd: Draws inference results on images.
  • displayer: Displays inference results on the screen.

In the run.sh script, detection_config.json is set as the configuration file. In this configuration file, resnet34_ssd.cambricon is the offline model used for inference, which means the decoded data will be fed to an SSD model, and the results will be shown on the screen.

In addition, see the comments in cnstream/samples/demo/run.sh for details.

There are also several other demos located under directories such as classification, detection, track, secondary, and rtsp.

Run samples

To run the CNStream sample:

  1. Follow the steps above to build the project.

  2. Run the demo with the commands below:

    cd ${CNSTREAM_DIR}/samples/demo
    
    ./run.sh
    
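    The other demos mentioned above (classification, detection, track, secondary, rtsp, etc.) can be launched in the same way, assuming each subdirectory provides its own run.sh (the directory layout here is an assumption based on the list above):

    cd ${CNSTREAM_DIR}/samples/demo/detection
    ./run.sh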

Best Practices

How to create an application based on CNStream?

There is a sample at samples/example/example.cpp that helps developers understand how to develop an application based on a CNStream pipeline.

How to change the input video file?

Modify the files.list_video file under the cnstream/samples/demo directory to replace the video path. Each line represents one stream. It is recommended to use an absolute path, or a path relative to the executable's working directory.
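
For example, a files.list_video describing two streams might look like this (the paths below are placeholders, not files shipped with CNStream):

  /absolute/path/to/your_video_1.mp4
  /absolute/path/to/your_video_2.h264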

Documentation

Cambricon Forum Docs or CNStream Read-the-Docs

Check out the Examples page for tutorials on how to use CNStream, and the Concepts page for basic definitions.

Community forum

Discuss - General community discussion around CNStream

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].