avinashshenoy97 / Rusticsom

Licence: MIT
Rust library for Self Organising Maps (SOM).

Programming Languages

rust

Projects that are alternatives of or similar to Rusticsom

version-compare
↔️ Rust library to easily compare version strings. Mirror from https://gitlab.com/timvisee/version-compare
Stars: ✭ 32 (+77.78%)
Mutual labels:  crates, rust-library
colorful
Make your terminal output colorful.
Stars: ✭ 43 (+138.89%)
Mutual labels:  crates, rust-library
Mnn
MNN is a blazing fast, lightweight deep learning framework, battle-tested by business-critical use cases in Alibaba
Stars: ✭ 6,284 (+34811.11%)
Mutual labels:  ml
Awesome Ai Ml Dl
Awesome Artificial Intelligence, Machine Learning and Deep Learning as we learn it. Study notes and a curated list of awesome resources of such topics.
Stars: ✭ 831 (+4516.67%)
Mutual labels:  ml
Flutter Ai Rubik Cube Solver
Flutter-Python rubiks cube solver.
Stars: ✭ 744 (+4033.33%)
Mutual labels:  ml
Gocaml
🐫 Practical statically typed functional programming language implementation with Go and LLVM
Stars: ✭ 653 (+3527.78%)
Mutual labels:  ml
Not Yet Awesome Rust
A curated list of Rust code and resources that do NOT exist yet, but would be beneficial to the Rust community.
Stars: ✭ 789 (+4283.33%)
Mutual labels:  rust-library
Bracket Lib
The Roguelike Toolkit (RLTK), implemented for Rust.
Stars: ✭ 631 (+3405.56%)
Mutual labels:  crates
Superslice Rs
Extensions for ordered Rust slices.
Stars: ✭ 17 (-5.56%)
Mutual labels:  rust-library
Lopdf
A Rust library for PDF document manipulation.
Stars: ✭ 720 (+3900%)
Mutual labels:  rust-library
Rage
A simple, secure and modern encryption tool (and Rust library) with small explicit keys, no config options, and UNIX-style composability.
Stars: ✭ 826 (+4488.89%)
Mutual labels:  rust-library
Colored
(Rust) Coloring terminal so simple you already know how to do it !
Stars: ✭ 715 (+3872.22%)
Mutual labels:  crates
Windows Machine Learning
Samples and Tools for Windows ML.
Stars: ✭ 663 (+3583.33%)
Mutual labels:  ml
Ritual
Use C++ libraries from Rust
Stars: ✭ 792 (+4300%)
Mutual labels:  crates
Hyperparameter hunter
Easy hyperparameter optimization and automatic result saving across machine learning algorithms and libraries
Stars: ✭ 648 (+3500%)
Mutual labels:  ml
Wheels
Performance-optimized wheels for TensorFlow (SSE, AVX, FMA, XLA, MPI)
Stars: ✭ 891 (+4850%)
Mutual labels:  ml
Ffdl
Fabric for Deep Learning (FfDL, pronounced fiddle) is a Deep Learning Platform offering TensorFlow, Caffe, PyTorch etc. as a Service on Kubernetes
Stars: ✭ 640 (+3455.56%)
Mutual labels:  ml
Quicksilver
A simple framework for 2D games on desktop and web
Stars: ✭ 710 (+3844.44%)
Mutual labels:  rust-library
Aoe
AoE (AI on Edge, on-device intelligence and edge computing) is an on-device AI integrated runtime environment (IRE) that helps developers work more efficiently.
Stars: ✭ 759 (+4116.67%)
Mutual labels:  ml
Libimagequant Rust
libimagequant (pngquant) bindings for the Rust language
Stars: ✭ 17 (-5.56%)
Mutual labels:  rust-library

RusticSOM

Rust library for Self Organising Maps (SOM).


Using this Crate

Add rusticsom as a dependency in Cargo.toml

[dependencies]
rusticsom = "1.1.0"
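The API below passes data as ndarray arrays (Array1, Array2), so your own crate will usually also depend on ndarray. The version below is only an illustration; match it to the ndarray version rusticsom 1.1.0 itself uses:

[dependencies]
rusticsom = "1.1.0"
ndarray = "0.13"   # illustrative version only; check rusticsom's own dependency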

Include the crate

use rusticsom::SOM;

API

Use SOM::create to create an SOM object with the API call below. It builds an SOM with length x breadth cells, where each cell holds a weight vector of length inputs (the dimensionality of the input samples).

pub fn create(length: usize, breadth: usize, inputs: usize, randomize: bool, learning_rate: Option<f32>, sigma: Option<f32>, decay_function: Option<fn(f32, u32, u32) -> f64>, neighbourhood_function: Option<fn((usize, usize), (usize, usize), f32) -> Array2<f64>>) -> SOM { ... }

randomize is a flag which, if true, initializes the weights of each cell to small random floating-point values.

learning_rate, optional, is the learning rate of the SOM; it defaults to 0.5.

sigma, optional, is the spread of the neighbourhood function; it defaults to 1.0.

decay_function, optional, is a function pointer to a function that takes three parameters of types f32, u32 and u32 and returns an f64. It is used to "decay" both learning_rate and sigma. By default it computes

new_value = old_value / (1 + current_iteration/total_iterations)

neighbourhood_function, optional, is also a function pointer, to a function that takes three parameters: a tuple of type (usize, usize) giving the size of the SOM, another (usize, usize) tuple giving the position of the winner neuron, and an f32 giving sigma. It returns a 2D Array containing the weights of the neighbours of the winning neuron, i.e., centered at the winner. By default a Gaussian centered at the winner neuron is used.
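As a quick illustration, here is a minimal sketch of creating an SOM with all defaults; the 10 x 10 size and the 4-dimensional input (e.g. the four iris features) are arbitrary values chosen for this example:

use rusticsom::SOM;

fn main() {
    // 10 x 10 map, 4-dimensional inputs, random initial weights,
    // default learning_rate (0.5), sigma (1.0), decay and neighbourhood functions.
    let map = SOM::create(10, 10, 4, true, None, None, None, None);
    println!("map size: {:?}", map.get_size());
}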


pub fn from_json(serialized: &str, decay_function: Option<fn(f32, u32, u32) -> f64>, neighbourhood_function: Option<fn((usize, usize), (usize, usize), f32) -> Array2<f64>>) -> serde_json::Result<SOM> { ... }

This function creates an SOM from JSON data previously exported with SOM::to_json().
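A hedged sketch of a save/restore round trip, assuming a map built as in the create example above and that the default decay and neighbourhood functions (None) are acceptable when restoring:

let serialized = map.to_json().expect("could not serialize SOM");
let restored = SOM::from_json(&serialized, None, None).expect("could not deserialize SOM");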


Use SOM_Object.train_random() to train the SOM with the input dataset, where samples from the input dataset are picked in a random order.

pub fn train_random(&mut self, data: Array2<f64>, iterations: u32) { ... }

Samples (rows) from the 2D Array data are picked randomly and the SOM is trained for iterations iterations!
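A short sketch, assuming a mutable map created as above with inputs = 4 and a tiny hand-written dataset; each row is one sample and must have inputs columns:

use ndarray::array;

// Two toy 4-feature samples; rows are samples, columns are features.
let data = array![
    [5.1, 3.5, 1.4, 0.2],
    [6.7, 3.1, 4.7, 1.5],
];
map.train_random(data, 100);   // 100 iterations, samples drawn at random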


Use SOM_Object.train_batch() to train the SOM with the input dataset, where samples from the input dataset are picked in a sequential order.

pub fn train_batch(&mut self, data: Array2<f64>, iterations: u32) { ... }

Samples (rows) from the 2D Array data are picked sequentially and the SOM is trained for iterations iterations!
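The call shape is identical to train_random(); only the sampling order differs. Assuming data is an Array2<f64> built as in the previous sketch:

map.train_batch(data, 100);    // 100 iterations, samples picked sequentially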


Use SOM_Object.winner() to find the winning neuron for a given sample.

pub fn winner(&mut self, elem: Array1<f64>) -> (usize, usize) { ... }

This function must be called with an SOM object.

Requires one parameter, a 1D Array of f64s representing the input sample.

Returns a tuple (usize, usize) representing the x and y coordinates of the winning neuron in the SOM.
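A brief sketch, assuming a trained, mutable map with inputs = 4; the sample values are arbitrary:

use ndarray::array;

let sample = array![5.1, 3.5, 1.4, 0.2];   // one 4-feature input as Array1<f64>
let (x, y) = map.winner(sample);           // coordinates of the best-matching cell
println!("winner at ({}, {})", x, y);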


Use SOM_Object.winner_dist() to find the winning neuron for a given sample, along with the sample's distance from that winning neuron.

pub fn winner_dist(&mut self, elem: Array1<f64>) -> ((usize, usize), f64) { ... }

This function must be called with an SOM object.

Requires one parameter, a 1D Array of f64s representing the input sample.

Returns a tuple ((usize, usize), f64): the x and y coordinates of the winning neuron in the SOM, and the distance of the input sample from that winning neuron.
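The same pattern as winner(), with the distance added to the return value; again the sample values are arbitrary:

use ndarray::array;

let ((x, y), dist) = map.winner_dist(array![6.7, 3.1, 4.7, 1.5]);
println!("winner at ({}, {}), distance {}", x, y, dist);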


pub fn activation_response(&self) -> ArrayView2<usize> { ... }

This function returns the activation map of the SOM. The activation map is a 2D Array where each cell at (i, j) represents the number of times the (i, j) cell of the SOM was picked to be the winner neuron.
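A minimal sketch of inspecting the activation map, assuming the map from the earlier sketches has already been trained:

let hits = map.activation_response();   // ArrayView2<usize>, same shape as the SOM
println!("{:?}", hits);                 // cell (i, j) holds the number of times it won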


pub fn get_size(&self) -> (usize, usize)

This function returns a tuple representing the size of the SOM. Format is (length, breadth).


pub fn distance_map(self) -> Array2<f64> { ... }

Returns the distance map of the SOM, i.e., the normalized distance of every neuron from every other neuron.
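Note that the signature shown takes self by value, so calling it consumes the SOM object; a brief sketch:

let dmap = map.distance_map();   // Array2<f64>, same shape as the SOM; map is moved here
println!("{:?}", dmap);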


pub fn to_json(&self) -> serde_json::Result<String> { ... }

Returns the internal SOM data as pretty-printed JSON (using serde_json).
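To persist the map you would typically write this string to disk; std::fs::write below is plain standard-library usage and the file name is arbitrary:

let json = map.to_json().expect("could not serialize SOM");
std::fs::write("som.json", json).expect("could not write som.json");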


Primary Contributors

Aditi Srinivas
Avinash Shenoy


Example

We've tested this crate on the famous iris dataset (present in CSV format in the extras folder).

The t_full_test function in /tests/test.rs was used to produce the required output. The following plots were obtained using matplotlib for Python.

Using a 5 x 5 SOM, trained for 250 iterations:

(Figure: SOM1)


Using a 10 x 10 SOM, trained for 1000 iterations:

(Figure: SOM2)

Symbol  | Represents
--------|-----------
Circle  | setosa
Square  | versicolor
Diamond | virginica