
Command Line Neural Network

Neuralcli provides a simple command line interface to a Python implementation of a simple classification neural network. It gives you a quick and easy way to get instant feedback on a hypothesis, or to play around with one of the most popular concepts in machine learning today.

Installation

Installation of neuralcli is provided through pip, just run:

pip install neuralcli

If you don't have some of the libraries used, such as numpy or scikit-learn, the install may take some time while pip installs all the dependencies. After pip finishes the install, run `command -v neuralcli` to check that the executable has been successfully added.

Troubleshooting

When you run neuralcli for the first time you may get an output similar to below

/usr/local/lib/python2.7/site-packages/matplotlib/font_manager.py:273: UserWarning: Matplotlib is building the font cache using fc-list. This may take a moment.
warnings.warn('Matplotlib is building the font cache using fc-list. This may take a moment.')

This is just a warning from matplotlib, and it will not appear the next time you run the command.

Additionally, matplotlib may throw another error that produces output similar to:

RuntimeError: Python is not installed as a framework

To fix this issue follow the steps outlined here

Use

Neuralcli comes bundled with three main commands.

Train

The train command takes a set of input features along with their expected outputs and performs backpropagation to learn the weights for a neural network. These weights can be saved to an output file for use in classification predictions later. The command takes the following:

parameters:

| name | type | description | example |
|------|------|-------------|---------|
| X | file | a file path to a CSV which holds your training data | ./train.csv |
| Y | file | a file path to a CSV which holds the expected outputs for the training examples | ./expected.csv |

flags:

| name | type | description | default | example |
|------|------|-------------|---------|---------|
| --lam | float | The regularization amount | 1 | 0.07 |
| --maxiter | int | The maximum number of iterations used to minimise the cost function | 250 | 30 |
| --output | string | A file path to save the minimised parameters to | nil | ./output.csv |
| --normalize | bool | Perform normalization on the training set | true | false |
| --verbose | bool | Output the training progress | true | false |

example:

$ neuralcli train ./X.csv ./Y.csv --output=./weights.csv --normalize=true

Once you run the train command the neural network will initialize and begin to learn the weights. You should see output similar to the below if the --verbose flag is set to true.
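The exact CSV layout neuralcli expects is not documented above, so the following is a sketch under the assumption that X.csv holds one comma-separated feature row per training example and Y.csv holds one integer class label per line. The feature values below are hypothetical iris-style measurements:

```python
import csv

# Hypothetical training data: 4 feature columns per example, and one
# integer class label per example. The exact layout expected by
# neuralcli is an assumption here.
X = [
    [5.1, 3.5, 1.4, 0.2],
    [7.0, 3.2, 4.7, 1.4],
    [6.3, 3.3, 6.0, 2.5],
]
Y = [0, 1, 2]

with open("X.csv", "w", newline="") as f:
    csv.writer(f).writerows(X)

with open("Y.csv", "w", newline="") as f:
    csv.writer(f).writerows([[y] for y in Y])
```

Files written this way could then be passed to the command as `neuralcli train ./X.csv ./Y.csv`.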

Predict

The predict command takes a set of learned weights and a given input to predict an output. The learned weights are loaded into the neural network from a file which holds them as a rolled 1 * n vector. In order for the predict command to work correctly these parameters need to be unrolled, so you must provide the sizes of the input layer, hidden layer, and output labels that you wish to unroll the parameters into.

parameters:

| name | type | description | example |
|------|------|-------------|---------|
| x | file | the file that holds the 1 * n row example that should be predicted | ./input.csv |
| params | file | the file that holds a 1 * n rolled parameter vector (saved from the train command) | ./output.csv |
| labels | int | the size of the output layer that the parameters were trained on | 3 |

flags:

| name | type | description | default | example |
|------|------|-------------|---------|---------|
| --normalize | bool | Perform normalization on the training set | true | false |
| --sizeh | int | The size of the hidden layer if it differs from the input layer | nil | 8 |

example:

$ neuralcli predict ./x.csv 3 ./params.csv 

Neuralcli will then print a prediction as an integer corresponding to the index of your output labels, e.g. 0 will correspond to your first classification label.
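The "rolled" parameter vector can be pictured as the network's two weight matrices flattened and concatenated, which is why the layer sizes must be supplied to reverse the process. A minimal sketch of that unrolling step, assuming bias-inclusive layers (an assumption about neuralcli's internals, not its documented behaviour):

```python
import numpy as np

def unroll(params, input_size, hidden_size, labels):
    """Split a flat 1 x n parameter vector back into the two weight
    matrices of a single-hidden-layer network. Each layer is assumed
    to carry a bias column, hence the +1 in each shape."""
    split = hidden_size * (input_size + 1)
    theta1 = params[:split].reshape(hidden_size, input_size + 1)
    theta2 = params[split:].reshape(labels, hidden_size + 1)
    return theta1, theta2

# e.g. 4 input features, a hidden layer of 4 units, 3 output labels
n = 4 * (4 + 1) + 3 * (4 + 1)
theta1, theta2 = unroll(np.arange(n, dtype=float), 4, 4, 3)
```

Without the correct sizes the reshape fails (or silently produces wrong matrices), which is why predict needs the labels argument and, when it differs from the input layer, the --sizeh flag.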

Test

The test command gives some primitive feedback about the correctness of your hypothesis by running a diagnostic check on the given data set and expected output. This method plots the margin of prediction error against the increasing size of the training set. This can be useful for determining what is going wrong with your hypothesis, i.e. whether it is underfitting or overfitting the training set.

parameters:

| name | type | description | example |
|------|------|-------------|---------|
| X | file | a file path to a CSV which holds your training data | ./train.csv |
| Y | file | a file path to a CSV which holds the expected outputs for the training examples | ./expected.csv |

flags:

| name | type | description | default | example |
|------|------|-------------|---------|---------|
| --lam | float | The regularization amount | 1 | 0.07 |
| --maxiter | int | The maximum number of iterations used to minimise the cost function | 250 | 30 |
| --normalize | bool | Perform normalization on the training set | true | false |
| --verbose | bool | Output the training progress | true | false |
| --step | int | The increments that the training will increase the set by | 10 | 100 |

example:

$ neuralcli test ./X.csv ./Y.csv --step=50 --normalize=true

Neural cli will then run the test sequence printing its progress as it increases the size of the training set.

After this runs it will print a plot of the hypothesis error against the size of the training set the weights were learned on. Below is an example graph plotted from the iris dataset.
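The diagnostic above is a standard learning curve: train on progressively larger slices of the data and record the prediction error at each size. A hedged sketch of that loop, where the trainer below is a toy stand-in for neuralcli's actual network, not its real implementation:

```python
import numpy as np

def learning_curve(X, Y, train, step=10):
    """Train on slices of increasing size and record the error at each
    size, mirroring what the test command plots. `train` returns a
    predict function and is a placeholder for the real network."""
    sizes, errors = [], []
    for m in range(step, len(X) + 1, step):
        predict = train(X[:m], Y[:m])
        errors.append(float(np.mean(predict(X) != Y)))
        sizes.append(m)
    return sizes, errors

def majority_trainer(Xs, Ys):
    # Toy stand-in model: always predict the most common label seen so far.
    label = int(np.bincount(Ys).argmax())
    return lambda X: np.full(len(X), label)

X = np.zeros((40, 2))
Y = np.array([0] * 30 + [1] * 10)
sizes, errors = learning_curve(X, Y, majority_trainer, step=10)
```

Reading the resulting curve follows the usual rule of thumb: error that stays high as the set grows suggests underfitting, while error that is low on small sets but climbs on the full set suggests overfitting.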
