RootHarold / Lycoris

License: LGPL-3.0
A lightweight and easy-to-use deep learning framework with neural architecture search.

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Lycoris

Deep architect
A general, modular, and programmable architecture search framework
Stars: ✭ 110 (-38.89%)
Mutual labels:  neural-architecture-search
Nas Segm Pytorch
Code for Fast Neural Architecture Search of Compact Semantic Segmentation Models via Auxiliary Cells, CVPR '19
Stars: ✭ 126 (-30%)
Mutual labels:  neural-architecture-search
Deep architect legacy
DeepArchitect: Automatically Designing and Training Deep Architectures
Stars: ✭ 144 (-20%)
Mutual labels:  neural-architecture-search
River
🌊 Online machine learning in Python
Stars: ✭ 2,980 (+1555.56%)
Mutual labels:  online-learning
Phpid Learning
🙋 Learning online together with PHPID
Stars: ✭ 125 (-30.56%)
Mutual labels:  online-learning
Continuum
A clean and simple data loading library for Continual Learning
Stars: ✭ 136 (-24.44%)
Mutual labels:  online-learning
Pnasnet.tf
TensorFlow implementation of PNASNet-5 on ImageNet
Stars: ✭ 102 (-43.33%)
Mutual labels:  neural-architecture-search
Nmflibrary
MATLAB library for non-negative matrix factorization (NMF): Version 1.8.1
Stars: ✭ 153 (-15%)
Mutual labels:  online-learning
Nas Benchmark
"NAS evaluation is frustratingly hard", ICLR2020
Stars: ✭ 126 (-30%)
Mutual labels:  neural-architecture-search
Scarlet Nas
Bridging the gap Between Stability and Scalability in Neural Architecture Search
Stars: ✭ 140 (-22.22%)
Mutual labels:  neural-architecture-search
Deephyper
DeepHyper: Scalable Asynchronous Neural Architecture and Hyperparameter Search for Deep Neural Networks
Stars: ✭ 117 (-35%)
Mutual labels:  neural-architecture-search
Nasbot
Neural Architecture Search with Bayesian Optimisation and Optimal Transport
Stars: ✭ 120 (-33.33%)
Mutual labels:  neural-architecture-search
Single Path One Shot Nas Mxnet
Single Path One-Shot NAS MXNet implementation with full training and searching pipeline. Support both Block and Channel Selection. Searched models better than the original paper are provided.
Stars: ✭ 136 (-24.44%)
Mutual labels:  neural-architecture-search
Petridishnn
Code for the neural architecture search methods contained in the paper Efficient Forward Neural Architecture Search
Stars: ✭ 112 (-37.78%)
Mutual labels:  neural-architecture-search
Dna
Block-wisely Supervised Neural Architecture Search with Knowledge Distillation (CVPR 2020)
Stars: ✭ 147 (-18.33%)
Mutual labels:  neural-architecture-search
Graphnas
This directory contains code necessary to run the GraphNAS algorithm.
Stars: ✭ 104 (-42.22%)
Mutual labels:  neural-architecture-search
Awesome Autodl
A curated list of automated deep learning (including neural architecture search and hyper-parameter optimization) resources.
Stars: ✭ 1,819 (+910.56%)
Mutual labels:  neural-architecture-search
Nsga Net
NSGA-Net, a Neural Architecture Search Algorithm
Stars: ✭ 171 (-5%)
Mutual labels:  neural-architecture-search
Aw nas
aw_nas: A Modularized and Extensible NAS Framework
Stars: ✭ 152 (-15.56%)
Mutual labels:  neural-architecture-search
Sgas
SGAS: Sequential Greedy Architecture Search (CVPR'2020) https://www.deepgcns.org/auto/sgas
Stars: ✭ 137 (-23.89%)
Mutual labels:  neural-architecture-search

Lycoris is a lightweight and easy-to-use deep learning framework with neural architecture search.

Lycoris aims to provide developers with an automated service for end-to-end neural network architecture search, enabling them to obtain better models with fewer hyperparameter configurations. For more detailed usage of Lycoris, please refer to the related documents.

At this stage, Lycoris lets you deploy computation to one or more CPUs on most operating systems. In the future, GPU support will be added to this minimalist architecture.

Features

  • Lightweight and portable to smart devices.
  • Support online learning.
  • Automatic neural network architecture search.
  • Support for Python. In the near future, other programming languages such as Java, C#, Go, and Rust will be supported.
  • Cloud-friendly. Based on C++11, it supports most operating systems and compilers.

Installation

  • C++ version:

    cd Lycoris
    cmake .
    sudo make install

  • Python bindings:

The Python bindings of the code are based on Pybind11.

    git clone "https://github.com/pybind/pybind11.git"
    cd pybind11
    mkdir build
    cd build
    cmake ..
    make install

(If pybind11 and its header files are already installed, you can ignore the above steps.)

    pip install LycorisNet

It can also be obtained via manual compilation:

    cd Lycoris/python
    cmake .
    make

Documents

The following is the documentation for the C++ version; the documentation for the Python version can be viewed here.

The APIs provided by Lycoris (namespace LycorisNet):

  • Lycoris(uint32_t capacity, uint32_t inputDim, uint32_t outputDim, const std::string &mode);
    Constructor. The class Lycoris is the highest-level abstraction of LycorisNet.
    Inputs: capacity: capacity of Lycoris. inputDim: input dimension. outputDim: output dimension. mode: mode of Lycoris ("classify" or "predict").
    Returns: an object of the class Lycoris.
  • ~Lycoris();
    Destructor.
  • void preheat(uint32_t num_of_nodes, uint32_t num_of_connections, uint32_t depth);
    Preheating process of the neural network cluster.
    Inputs: num_of_nodes: the number of hidden nodes added to each neural network. num_of_connections: the number of connections added to each neural network. depth: total layers of each neural network.
  • void evolve(std::vector<std::vector<float> > &input, std::vector<std::vector<float> > &desire);
    Evolve the neural network cluster.
    Inputs: input: input data. desire: expected output data.
  • void fit(std::vector<std::vector<float> > &input, std::vector<std::vector<float> > &desire);
    Fit all neural networks in the neural network cluster.
    Inputs: input: input data. desire: expected output data.
  • void enrich();
    Keep only the best individual in the neural network cluster.
  • std::vector<float> compute(std::vector<float> &input);
    Forward computation of the best individual.
    Inputs: input: input data.
    Returns: the output data.
  • std::vector<std::vector<float> > computeBatch(std::vector<std::vector<float> > &input);
    Parallel forward computation of the best individual.
    Inputs: input: input data (two-dimensional).
    Returns: the output data (two-dimensional).
  • void resize(uint32_t capacity);
    Resize the capacity of the neural network cluster.
    Inputs: capacity: the new capacity.
  • void openMemLimit(uint32_t size);
    Turn on the memory limit.
    Inputs: size: the memory limit.
  • void closeMemLimit();
    Turn off the memory limit.
  • void saveModel(const std::string &path);
    Export the current trained model.
    Inputs: path: file path of the current trained model.
  • void setMutateArgs(std::vector<float> &p);
    Set p1 to p4 in the class Args; the parameters are passed in as a std::vector.
    Inputs: p1: probability of adding a new node on an existing connection. p2: probability of deleting a node. p3: probability of adding a new connection between two nodes. p4: probability of deleting a connection.
  • void setMutateOdds(float odds);
    Set the odds of mutating: one individual mutates odds times to form odds + 1 individuals.
  • void setCpuCores(uint32_t num);
    Set the number of worker threads used to train the model.
    Inputs: num: the number of worker threads.
  • void setLR(float lr);
    Set the learning rate.
    Inputs: lr: the learning rate.
  • uint32_t getSize();
    Returns the size of the best individual.
  • uint32_t getInputDim();
    Returns the input dimension.
  • uint32_t getOutputDim();
    Returns the output dimension.
  • uint32_t getCapacity();
    Returns the capacity of Lycoris.
  • float getLoss();
    Returns the loss.
  • std::string getMode();
    Returns the mode of Lycoris ("classify" or "predict").
  • std::vector<uint32_t> getLayers();
    Returns the number of nodes in each layer of the neural network.
  • std::vector<float> getHiddenLayer(uint32_t pos);
    Returns a vector of the nodes in a specific layer of the best individual.
    Inputs: pos: the index of the layer needed (starting at 0).
  • static std::string version();
    Returns version and copyright information.
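
To tie these functions together, here is a minimal end-to-end sketch of a plausible workflow (construct, preheat, evolve, enrich, fit, compute, saveModel) built only from the API documented above. The header path, the hyperparameter values, and the toy XOR data are illustrative assumptions, not recommendations from the Lycoris authors.

    // Sketch of a typical training run; the header path and all numeric
    // settings are assumptions to be adapted to your installation and task.
    #include <iostream>
    #include <vector>
    #include "lycoris/lycoris.h"  // assumed header location after `make install`

    int main() {
        // A cluster of 16 candidate networks, 2 inputs, 1 output, "predict" mode.
        LycorisNet::Lycoris lie(16, 2, 1, "predict");
        lie.setCpuCores(4);  // worker threads used for training
        lie.setLR(0.01f);    // learning rate

        // Toy training data: XOR.
        std::vector<std::vector<float> > input = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
        std::vector<std::vector<float> > desire = {{0}, {1}, {1}, {0}};

        lie.preheat(20, 20, 4);         // add 20 hidden nodes and 20 connections per network, depth 4
        for (int i = 0; i < 64; ++i) {
            lie.evolve(input, desire);  // architecture search over the cluster
        }
        lie.enrich();                   // keep only the best individual
        for (int i = 0; i < 128; ++i) {
            lie.fit(input, desire);     // weight training on the remaining networks
        }
        std::cout << "loss: " << lie.getLoss() << std::endl;

        std::vector<float> sample = {1, 0};
        std::vector<float> out = lie.compute(sample);  // forward pass of the best individual
        std::cout << "f(1, 0) = " << out[0] << std::endl;

        lie.saveModel("model.txt");     // export for later use with loadModel()
        return 0;
    }

The iteration counts are placeholders; under this sketch, calling fit() again later with freshly collected batches is also how the online-learning feature listed above would be exercised.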

The functions used to import a pre-trained model (namespace LycorisNet):

  • Lycoris *loadModel(const std::string &path, uint32_t capacity);
    Import a pre-trained model from a file.
    Inputs: path: file path of the pre-trained model. capacity: capacity of the neural network cluster.
    Returns: a pointer to an object of the class Lycoris.
  • Lycoris *loadViaString(const std::string &model, uint32_t capacity);
    Import a pre-trained model from a string.
    Inputs: model: the pre-trained model in the form of a string. capacity: capacity of the neural network cluster.
    Returns: a pointer to an object of the class Lycoris.
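
As a complementary sketch, the snippet below reloads a model exported by saveModel() and runs a forward pass with it. The header path, the file name, and the assumption that the caller owns (and must delete) the returned pointer are mine, not statements from the Lycoris documentation.

    // Sketch: import a saved model and run inference; header path, file name,
    // and pointer-ownership handling are assumptions.
    #include <iostream>
    #include <vector>
    #include "lycoris/lycoris.h"  // assumed header location after `make install`

    int main() {
        // The capacity of the neural network cluster is supplied again when importing.
        LycorisNet::Lycoris *lie = LycorisNet::loadModel("model.txt", 16);
        std::vector<float> sample = {1, 0};
        std::vector<float> out = lie->compute(sample);
        std::cout << "f(1, 0) = " << out[0] << std::endl;
        delete lie;  // assumption: the caller releases the returned object
        return 0;
    }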

More detailed documentation will be released in the form of sample code.

Examples

  • LycorisAD: an elegant outlier detection algorithm framework based on AutoEncoder.
  • LycorisR: a lightweight recommendation algorithm framework based on LycorisNet.
  • LycorisQ: a neat reinforcement learning framework based on LycorisNet.
  • More examples will be released in the future.

License

Lycoris is released under the LGPL-3.0 license. By using, distributing, or contributing to this project, you agree to the terms and conditions of this license.
