fillassuncao / denser-models

License: LGPL-3.0
cdv.dei.uc.pt/denser/

Programming Languages

Python
139,335 projects - #7 most used programming language

Projects that are alternatives to or similar to denser-models

DeepHyperNEAT
A public Python implementation of the DeepHyperNEAT system for evolving neural networks, developed by Felix Sosa and Kenneth Stanley. See the paper here: https://eplex.cs.ucf.edu/papers/sosa_ugrad_report18.pdf
Stars: ✭ 42 (-14.29%)
Mutual labels:  neuroevolution, evolutionary-computation
Neatron
Yet another NEAT implementation
Stars: ✭ 14 (-71.43%)
Mutual labels:  neuroevolution
Neural Network P5
Deprecated! See:
Stars: ✭ 218 (+344.9%)
Mutual labels:  neuroevolution
Tensorflow-Neuroevolution
Neuroevolution framework for TensorFlow 2.x focusing on modularity and high performance. Pre-implements NEAT, DeepNEAT, CoDeepNEAT, etc.
Stars: ✭ 109 (+122.45%)
Mutual labels:  neuroevolution
datafsm
Machine Learning Finite State Machine Models from Data with Genetic Algorithms
Stars: ✭ 14 (-71.43%)
Mutual labels:  evolutionary-computation
exact
EXONA: The Evolutionary eXploration of Neural Networks Framework -- EXACT, EXALT and EXAMM
Stars: ✭ 43 (-12.24%)
Mutual labels:  neuroevolution
Sparse Evolutionary Artificial Neural Networks
Always sparse. Never dense. But never say never. A repository for the Adaptive Sparse Connectivity concept and its algorithmic instantiation, i.e. Sparse Evolutionary Training, to boost Deep Learning scalability in various respects (e.g. memory and computational time efficiency, representation and generalization power).
Stars: ✭ 182 (+271.43%)
Mutual labels:  neuroevolution
neat-python
Python implementation of the NEAT neuroevolution algorithm
Stars: ✭ 32 (-34.69%)
Mutual labels:  neuroevolution
Harris-Hawks-Optimization-Algorithm-and-Applications
Source code for the HHO paper "Harris hawks optimization: Algorithm and applications": https://www.sciencedirect.com/science/article/pii/S0167739X18313530. The paper proposes a novel population-based, nature-inspired optimization paradigm called the Harris Hawks Optimizer (HHO).
Stars: ✭ 31 (-36.73%)
Mutual labels:  evolutionary-computation
neat-openai-gym
NEAT for Reinforcement Learning on the OpenAI Gym
Stars: ✭ 19 (-61.22%)
Mutual labels:  neuroevolution
evolvable
An evolutionary computation framework
Stars: ✭ 43 (-12.24%)
Mutual labels:  evolutionary-computation
pacman-ai
A.I. plays the original 1980 Pacman using Neuroevolution of Augmenting Topologies and Deep Q Learning
Stars: ✭ 26 (-46.94%)
Mutual labels:  neuroevolution
evo-NEAT
A Java implementation of NEAT (NeuroEvolution of Augmenting Topologies) from scratch for the generation of evolving artificial neural networks. For educational purposes only.
Stars: ✭ 34 (-30.61%)
Mutual labels:  neuroevolution
rustneat
Rust Neat - NeuroEvolution of Augmenting Topologies
Stars: ✭ 63 (+28.57%)
Mutual labels:  neuroevolution
es pytorch
High-performance implementation of deep neuroevolution in PyTorch using mpi4py. Intended for use on HPC clusters.
Stars: ✭ 20 (-59.18%)
Mutual labels:  neuroevolution
Aimandshoot
A neuroevolution game experiment.
Stars: ✭ 201 (+310.2%)
Mutual labels:  neuroevolution
neuro-evolution
A project on improving neural network performance by using genetic algorithms.
Stars: ✭ 25 (-48.98%)
Mutual labels:  neuroevolution
NeuroEvolution-Flappy-Bird
A comparison between humans, neuroevolution, and multilayer perceptrons playing Flappy Bird, implemented in Python.
Stars: ✭ 17 (-65.31%)
Mutual labels:  neuroevolution
evoplex
Evoplex is a fast, robust and extensible platform for developing agent-based models and multi-agent systems on networks. It's available for Windows, Linux and macOS.
Stars: ✭ 98 (+100%)
Mutual labels:  evolutionary-computation
NeuralFish
Neuroevolution in F#
Stars: ✭ 28 (-42.86%)
Mutual labels:  neuroevolution

DENSER: Deep Evolutionary Network Structured Representation

Deep Evolutionary Network Structured Representation (DENSER) is a novel approach to automatically design Artificial Neural Networks (ANNs) using Evolutionary Computation. The algorithm not only searches for the best network topology, but also tunes hyper-parameters (e.g., learning or data augmentation parameters). The automatic design is achieved using a representation with two distinct levels, where the outer level encodes the general structure of the network, and the inner level encodes the parameters associated with each layer. The allowed layers and hyper-parameter value ranges are defined by means of a human-readable Context-Free Grammar. If you use this code, a reference to one of the following works would be greatly appreciated:

@inproceedings{assuncao2018evolving,
	title={Evolving the Topology of Large Scale Deep Neural Networks},
	author={Assun{\c{c}}ao, Filipe and Louren{\c{c}}o, Nuno and Machado, Penousal and Ribeiro, Bernardete},
	booktitle={European Conference on Genetic Programming (EuroGP)},
	year={2018},
	organization={Springer}
}

@article{assuncao2018denser,
	title={DENSER: Deep Evolutionary Network Structured Representation},
	author={Assun{\c{c}}ao, Filipe and Louren{\c{c}}o, Nuno and Machado, Penousal and Ribeiro, Bernardete},
	journal={arXiv preprint arXiv:1801.01563},
	year={2018}
}
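
For illustration, here is a minimal Python sketch of the two-level representation described above. The layer types, parameter names, and value ranges are hypothetical stand-ins chosen for this example; in DENSER itself they are defined by the Context-Free Grammar, and this is not the project's actual code.

# Minimal sketch of DENSER's two-level genotype (hypothetical layer
# types, parameter names, and ranges; the real system derives these
# from the human-readable Context-Free Grammar).
import random

# Inner-level parameter ranges per layer type.
PARAM_RANGES = {
    'conv':    {'num-filters': (32, 256), 'filter-size': (2, 5)},
    'pooling': {'kernel-size': (2, 5)},
    'fc':      {'num-units': (64, 2048)},
}

def random_individual(min_layers=3, max_layers=10):
    """Sample one candidate network: topology (outer level) plus
    per-layer hyper-parameters (inner level)."""
    # Outer level: the general structure -- an ordered sequence of layers.
    outer = [random.choice(list(PARAM_RANGES))
             for _ in range(random.randint(min_layers, max_layers))]
    # Inner level: the parameters associated with each layer, sampled
    # from the allowed ranges.
    inner = [{name: random.randint(low, high)
              for name, (low, high) in PARAM_RANGES[layer].items()}
             for layer in outer]
    return outer, inner

Evolution then operates on both levels: mutating the outer level changes the topology, while mutating the inner level tunes the hyper-parameters of individual layers.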

Requirements

Currently, this codebase only works with Python 2. The following libraries are required: keras, numpy, and sklearn.
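
Assuming pip is available in your Python 2 environment, the dependencies can typically be installed as follows (note that sklearn is distributed on PyPI as scikit-learn):

pip install keras numpy scikit-learn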

Installation

Unzip the datasets that are provided in compressed form.

Usage

python denser_models -d [dataset] -m

-d selects the dataset and can take one of the following values: cifar-10, cifar-100, mnist, mnist-rotated, mnist-background, mnist-rotated-background, fashion-mnist, rectangles, or rectangles-background

-m is an optional flag that builds the classifier as an ensemble of the two best models
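
For example, the following invocation evaluates the ensemble of the two best models on Fashion-MNIST:

python denser_models -d fashion-mnist -m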

Support

Any questions, comments, or suggestions should be directed to Filipe Assunção ([email protected])
