
rust-cv / ndarray-vision

Licenses found: Apache-2.0 (LICENSE-APACHE), MIT (LICENSE-MIT)
Computer vision library built on top of ndarray

Programming Languages

Rust (11,053 projects)

Projects that are alternatives of or similar to ndarray-vision

NDScala
N-dimensional arrays in Scala 3. Think NumPy ndarray, but type-safe over shapes, array/axis labels & numeric data types
Stars: ✭ 37 (-11.9%)
Mutual labels:  ndarray
xtensor-r
R bindings for xtensor
Stars: ✭ 83 (+97.62%)
Mutual labels:  ndarray
numphp
PHP tools for matrix computation
Stars: ✭ 25 (-40.48%)
Mutual labels:  ndarray
idx2numpy
A Python package which provides tools to convert files to and from IDX format (described at http://yann.lecun.com/exdb/mnist/) into numpy.ndarray.
Stars: ✭ 22 (-47.62%)
Mutual labels:  ndarray
h3ron
Rust crates for the H3 geospatial indexing system
Stars: ✭ 52 (+23.81%)
Mutual labels:  ndarray
Numjs
Like NumPy, in JavaScript
Stars: ✭ 1,912 (+4452.38%)
Mutual labels:  ndarray

ndarray-vision


This project is a computer vision library built on top of ndarray. It is a work in progress: basic image encoding/decoding and processing are currently implemented.

See the examples and tests for basic usage.
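One of the formats the library handles, plaintext PPM, is simple enough to sketch by hand. The snippet below is a minimal standalone encoder in plain Rust (it does not use ndarray-vision's own API, and the function name `encode_ppm` is illustrative, not from the crate):

```rust
// Encode a tiny RGB image as plaintext PPM ("P3"): a header with
// magic number, dimensions, and max channel value, then one
// whitespace-separated RGB triple per pixel in row-major order.
fn encode_ppm(pixels: &[(u8, u8, u8)], w: usize, h: usize) -> String {
    let mut out = format!("P3\n{} {}\n255\n", w, h);
    for &(r, g, b) in pixels {
        out.push_str(&format!("{} {} {}\n", r, g, b));
    }
    out
}

fn main() {
    // A 2x2 image: red, green, blue, white.
    let img = vec![(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 255)];
    print!("{}", encode_ppm(&img, 2, 2));
}
```

The binary variant ("P6") uses the same header followed by raw bytes instead of ASCII numbers.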

Features

  • Conversions between Grayscale, RGB, HSV and CIEXYZ
  • Image convolutions and common kernels (box linear, Gaussian, Laplacian)
  • Median filtering
  • Sobel operator
  • Canny Edge Detection
  • Histogram Equalisation
  • Thresholding (basic, mean, Otsu)
  • Encoding and decoding PPM (binary or plaintext)
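To make the convolution feature concrete, here is a minimal 3×3 box-blur over a grayscale image in plain Rust. This is a sketch of the underlying technique, not ndarray-vision's API (which operates on ndarray arrays rather than flat slices):

```rust
// 3x3 box-blur convolution over a grayscale image stored row-major
// in a slice of f32. Border pixels are left unchanged for brevity;
// real implementations pick a border strategy (clamp, reflect, ...).
fn box_blur(img: &[f32], w: usize, h: usize) -> Vec<f32> {
    let mut out = img.to_vec();
    for y in 1..h - 1 {
        for x in 1..w - 1 {
            let mut sum = 0.0;
            for dy in 0..3 {
                for dx in 0..3 {
                    sum += img[(y + dy - 1) * w + (x + dx - 1)];
                }
            }
            // Box kernel: every tap has weight 1/9.
            out[y * w + x] = sum / 9.0;
        }
    }
    out
}

fn main() {
    // 3x3 image with a single bright centre pixel; after the blur
    // the centre becomes the mean of its 3x3 neighbourhood.
    let img = vec![0.0, 0.0, 0.0, 0.0, 9.0, 0.0, 0.0, 0.0, 0.0];
    let blurred = box_blur(&img, 3, 3);
    println!("{}", blurred[4]);
}
```

A Gaussian kernel follows the same loop structure with non-uniform weights; separable kernels can be applied as two 1-D passes for speed.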

Performance

Not a lot of work has been put towards performance yet, but a rudimentary benchmarking project exists here for comparative benchmarks against other image-processing libraries in Rust.
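For a quick local number before reaching for a proper benchmark harness (such as criterion), a rough wall-clock measurement with only the standard library looks like this. The `invert` workload here is an arbitrary stand-in, not a function from the crate:

```rust
use std::time::Instant;

// Trivial per-pixel workload used purely as a timing subject.
fn invert(img: &mut [f32]) {
    for p in img.iter_mut() {
        *p = 1.0 - *p;
    }
}

fn main() {
    // A 1080p single-channel buffer.
    let mut img = vec![0.25f32; 1920 * 1080];
    let start = Instant::now();
    for _ in 0..100 {
        invert(&mut img);
    }
    let elapsed = start.elapsed();
    println!("100 inversions: {:?} ({:?}/iter)", elapsed, elapsed / 100);
}
```

Wall-clock timings like this are noisy; a statistical harness that warms up, repeats, and reports confidence intervals is what comparative benchmarks should use.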

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].