License: MIT
🤖 A portable, header-only, artificial neural network library written in C99

Programming Languages

c
50402 projects - #5 most used programming language
c99
33 projects

Projects that are alternatives of or similar to Cranium

Yannl
Yet another neural network library
Stars: ✭ 37 (-92.61%)
Mutual labels:  matrix, classification, artificial-neural-networks, blas, portable, regression
wymlp
tiny fast portable real-time deep neural network for regression and classification within 50 LOC.
Stars: ✭ 36 (-92.81%)
Mutual labels:  embedded, portable, regression, classification
100daysofmlcode
My journey to learn and grow in the domain of Machine Learning and Artificial Intelligence by performing the #100DaysofMLCode Challenge.
Stars: ✭ 146 (-70.86%)
Mutual labels:  classification, artificial-neural-networks, regression
The Deep Learning With Keras Workshop
An Interactive Approach to Understanding Deep Learning with Keras
Stars: ✭ 34 (-93.21%)
Mutual labels:  classification, artificial-neural-networks, regression
Java Deep Learning Cookbook
Code for Java Deep Learning Cookbook
Stars: ✭ 156 (-68.86%)
Mutual labels:  classification, artificial-neural-networks, regression
Tensorflow Resources
Curated Tensorflow code resources to help you get started with Deep Learning.
Stars: ✭ 330 (-34.13%)
Mutual labels:  classification, artificial-neural-networks, regression
Neuroflow
Artificial Neural Networks for Scala
Stars: ✭ 105 (-79.04%)
Mutual labels:  classification, artificial-neural-networks, regression
projection-pursuit
An implementation of multivariate projection pursuit regression and univariate classification
Stars: ✭ 24 (-95.21%)
Mutual labels:  travis-ci, regression, classification
EmbeddedML
EmbeddedML was created to be an alternative to the limited options available for Artificial Neural Networks in C. It is designed to be efficient without sacrificing ease of use. It is meant to support students as well as industry experts as it is built to be expandable and straightforward to manipulate.
Stars: ✭ 24 (-95.21%)
Mutual labels:  embedded, artificial-neural-networks
Synthetic-data-gen
Various methods for generating synthetic data for data science and ML
Stars: ✭ 57 (-88.62%)
Mutual labels:  regression, classification
Tensorflow Book
Accompanying source code for Machine Learning with TensorFlow. Refer to the book for step-by-step explanations.
Stars: ✭ 4,448 (+787.82%)
Mutual labels:  classification, regression
pywedge
Makes Interactive Chart Widget, Cleans raw data, Runs baseline models, Interactive hyperparameter tuning & tracking
Stars: ✭ 49 (-90.22%)
Mutual labels:  regression, classification
drupal9ci
One-line installers for implementing Continuous Integration in Drupal 9
Stars: ✭ 137 (-72.65%)
Mutual labels:  travis-ci, continuous-integration
Minimalistic-Multiple-Layer-Neural-Network-from-Scratch-in-Python
Minimalistic Multiple Layer Neural Network from Scratch in Python.
Stars: ✭ 24 (-95.21%)
Mutual labels:  regression, classification
monolish
monolish: MONOlithic LInear equation Solvers for Highly-parallel architecture
Stars: ✭ 166 (-66.87%)
Mutual labels:  matrix, blas
R
All Algorithms implemented in R
Stars: ✭ 294 (-41.32%)
Mutual labels:  classification, regression
Mlr3
mlr3: Machine Learning in R - next generation
Stars: ✭ 463 (-7.58%)
Mutual labels:  classification, regression
sabotage
a radical and experimental distribution based on musl libc and busybox
Stars: ✭ 502 (+0.2%)
Mutual labels:  embedded, efficient
Fuku Ml
Simple machine learning library / a simple, easy-to-use machine learning toolkit
Stars: ✭ 280 (-44.11%)
Mutual labels:  classification, regression
Pycaret
An open-source, low-code machine learning library in Python
Stars: ✭ 4,594 (+816.97%)
Mutual labels:  regression, classification


Cranium is a portable, header-only, feedforward artificial neural network library written in vanilla C99.

It supports fully-connected networks of arbitrary depth and structure, and should be reasonably fast as it uses a matrix-based approach to calculations. It is particularly suitable for low-resource machines or environments in which additional dependencies cannot be installed.

Cranium supports CBLAS integration. Simply uncomment line 7 in matrix.h to enable the BLAS sgemm function for fast matrix multiplication.

Check out the detailed documentation here for information on individual structures and functions.


Features

  • Activation functions
    • sigmoid
    • ReLU
    • tanh
    • softmax (classification)
    • linear (regression)
  • Loss functions
    • Cross-entropy loss (classification)
    • Mean squared error (regression)
  • Optimization algorithms
    • Batch Gradient Descent
    • Stochastic Gradient Descent
    • Mini-Batch Stochastic Gradient Descent
  • L2 Regularization
  • Learning rate annealing
  • Simple momentum
  • Fan-in weight initialization
  • CBLAS support for fast matrix multiplication
  • Serializable networks

Usage

Since Cranium is header-only, simply copy the src directory into your project, and #include "src/cranium.h" to begin using it.

Its only external dependency is the standard math library (<math.h>), so link with -lm.

If you are using CBLAS, you will also need to compile with -lcblas and include, via -I, the path to wherever your particular machine's BLAS implementation is. Common ones include OpenBLAS and ATLAS.

Cranium has been tested to work correctly at every level of gcc optimization, so feel free to enable one (e.g. -O2 or -O3).


Example

#include <stdio.h>   // printf
#include <stdlib.h>  // srand, free
#include <time.h>    // time
#include "cranium.h"

/*
This basic example program is the skeleton of a classification problem.
The training data should be in matrix form, where each row is a data point, and
    each column is a feature. 
The training classes should be in matrix form, where the ith row corresponds to
    the ith training example, and each column is a 1 if it is of that class, and
    0 otherwise. Each example may only be of 1 class.
*/

// create training data and target values (data collection not shown)
int rows, features, numClasses;
float** training;
float** classes;

// create datasets to hold the data
DataSet* trainingData = createDataSet(rows, features, training);
DataSet* trainingClasses = createDataSet(rows, numClasses, classes);

// create network with 2 input neurons, 1 hidden layer with sigmoid
// activation function and 5 neurons, and 2 output neurons with softmax 
// activation function
srand(time(NULL));
size_t hiddenSize[] = {5};
Activation hiddenActivation[] = {sigmoid};
Network* net = createNetwork(2, 1, hiddenSize, hiddenActivation, 2, softmax);

// train network with cross-entropy loss using Mini-Batch SGD
ParameterSet params;
params.network = net;
params.data = trainingData;
params.classes = trainingClasses;
params.lossFunction = CROSS_ENTROPY_LOSS;
params.batchSize = 20;
params.learningRate = .01;
params.searchTime = 5000;
params.regularizationStrength = .001;
params.momentumFactor = .9;
params.maxIters = 10000;
params.shuffle = 1;
params.verbose = 1;
optimize(params);

// test accuracy of network after training
printf("Accuracy is %f\n", accuracy(net, trainingData, trainingClasses));

// get network's predictions on input data after training
forwardPass(net, trainingData);
int* predictions = predict(net);
free(predictions);

// save network to a file
saveNetwork(net, "network");

// free network and data
destroyNetwork(net);
destroyDataSet(trainingData);
destroyDataSet(trainingClasses);

// load previous network from file
Network* previousNet = readNetwork("network");
destroyNetwork(previousNet);

Building and Testing

To run the tests, look in the tests folder.

The Makefile has targets to run each batch of unit tests individually, or all of them at once.


Contributing

Feel free to send a pull request if you want to add any features or if you find a bug.

Check the issues tab for some potential things to do.

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].