
flxsosa / DeepHyperNEAT

License: Apache-2.0
A public Python implementation of the DeepHyperNEAT system for evolving neural networks. Developed by Felix Sosa and Kenneth Stanley. See the paper here: https://eplex.cs.ucf.edu/papers/sosa_ugrad_report18.pdf

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to DeepHyperNEAT

NeuroEvolution-Flappy-Bird
A comparison between humans, neuroevolution and multilayer perceptrons playing Flappy Bird, implemented in Python
Stars: ✭ 17 (-59.52%)
Mutual labels:  neat, genetic-algorithm, neuroevolution
tiny gp
Tiny Genetic Programming in Python
Stars: ✭ 58 (+38.1%)
Mutual labels:  evolution, genetic-algorithm, evolutionary-computation
neuro-evolution
A project on improving Neural Networks performance by using Genetic Algorithms.
Stars: ✭ 25 (-40.48%)
Mutual labels:  neat, genetic-algorithm, neuroevolution
NEATEST
NEATEST: Evolving Neural Networks Through Augmenting Topologies with Evolution Strategy Training
Stars: ✭ 13 (-69.05%)
Mutual labels:  neat, genetic-algorithm, neuroevolution
Invaderz
Space invaders, but the invaders evolve with genetic algorithm
Stars: ✭ 686 (+1533.33%)
Mutual labels:  evolution, genetic-algorithm
Ecosim
An interactive ecosystem and evolution simulator written in C and OpenGL, for GNU/Linux.
Stars: ✭ 382 (+809.52%)
Mutual labels:  evolution, genetic-algorithm
Bot Evolution
An interesting display of evolution through neural networks and a genetic algorithm
Stars: ✭ 135 (+221.43%)
Mutual labels:  evolution, genetic-algorithm
pacman-ai
A.I. plays the original 1980 Pacman using Neuroevolution of Augmenting Topologies and Deep Q Learning
Stars: ✭ 26 (-38.1%)
Mutual labels:  neat, neuroevolution
shorelark
Simulation of life & evolution
Stars: ✭ 109 (+159.52%)
Mutual labels:  evolution, genetic-algorithm
datafsm
Machine Learning Finite State Machine Models from Data with Genetic Algorithms
Stars: ✭ 14 (-66.67%)
Mutual labels:  genetic-algorithm, evolutionary-computation
Super-Meta-MarIO
Mario AI Ensemble
Stars: ✭ 15 (-64.29%)
Mutual labels:  neat, genetic-algorithm
Sharpneat
SharpNEAT - Evolution of Neural Networks. A C# .NET Framework.
Stars: ✭ 273 (+550%)
Mutual labels:  evolution, neuroevolution
dishtiny
DISHTINY: A Platform for Studying Open-Ended Evolutionary Transitions in Individuality
Stars: ✭ 25 (-40.48%)
Mutual labels:  evolution, evolutionary-computation
Darwin
Evolutionary Algorithms Framework
Stars: ✭ 72 (+71.43%)
Mutual labels:  evolution, neuroevolution
exact
EXONA: The Evolutionary eXploration of Neural Networks Framework -- EXACT, EXALT and EXAMM
Stars: ✭ 43 (+2.38%)
Mutual labels:  evolution, neuroevolution
evo-NEAT
A Java implementation of NEAT (NeuroEvolution of Augmenting Topologies) from scratch for the generation of evolving artificial neural networks. Only for educational purposes.
Stars: ✭ 34 (-19.05%)
Mutual labels:  neat, neuroevolution
evolvable
An evolutionary computation framework
Stars: ✭ 43 (+2.38%)
Mutual labels:  genetic-algorithm, evolutionary-computation
Hippocrates
No longer maintained, actually usable implementation of NEAT
Stars: ✭ 59 (+40.48%)
Mutual labels:  neat, genetic-algorithm
neat-openai-gym
NEAT for Reinforcement Learning on the OpenAI Gym
Stars: ✭ 19 (-54.76%)
Mutual labels:  neat, neuroevolution
Tensorflow-Neuroevolution
Neuroevolution Framework for Tensorflow 2.x focusing on modularity and high-performance. Preimplements NEAT, DeepNEAT, CoDeepNEAT, etc.
Stars: ✭ 109 (+159.52%)
Mutual labels:  neat, neuroevolution

Deep HyperNEAT: Extending HyperNEAT to Evolve the Architecture and Depth of Deep Networks

Badges: Maintenance, made-with-python

NOTE: This implementation is under development. Updates will be pushed over time, bringing new functionality, tests, and various other elements. The purpose of this repo is to give others a codebase with which to understand, use, or improve upon DeepHyperNEAT.

Using DeepHyperNEAT

To run DHN in its current form, you need to create a task file. For reference, see xor_study.py.

This task file must contain:

  • Necessary imports:
     from genome import Genome # Genome class
     from population import Population # Population class
     from phenomes import FeedForwardCPPN as CPPN # CPPN class (referred to as CPPN below)
     from decode import decode # Decoder for CPPN -> Substrate
     from visualize import draw_net # optional, for visualizing networks
  • Substrate parameters
    • Input dimensions
    • Output dimensions
    • Sheet dimensions (optional)
     sub_in_dims = [1,2] # Input dimensions (list)
     sub_sh_dims = [1,3] # Sheet dimensions (list, optional)
     sub_o_dims = 1 # Output dimensions (int)
  • Evolutionary parameters
    • Population size
    • Population elitism
    • Max number of generations
     pop_key = 0 # Key for population
     pop_size = 150
     pop_elitism = 2 # Number of members of pop to keep each generation
     num_generations = 500 # Max number of generations (example value; used in pop.run below)
  • The task (defined as a function in python)
    • Task parameters:
      • Task inputs
      • Expected outputs (optional)
     def task(genomes):
         task_inputs = [(1.0,), (2.0,), (3.0,)] # One input tuple per sample
         expected_outputs = [2.0, 4.0, 6.0]
         for key, genome in genomes:
             cppn = CPPN.create(genome) # Create CPPN from genome
             substrate = decode(cppn, sub_in_dims, sub_o_dims, sub_sh_dims) # Decode CPPN into substrate
             error = 0.0 # Initialize error for current genome
             for inputs, expected in zip(task_inputs, expected_outputs):
                 inputs = inputs + (1.0,) # Append bias value to inputs
                 actual_output = substrate.activate(inputs)[0] # Query substrate
                 error += error_func(actual_output, expected) # Evaluate error with a user-defined error metric
             genome.fitness = 1.0 - error # Assign fitness
  • A call to DHN to attempt to solve the task (a sketch of querying the returned solution follows this list)
     pop = Population(pop_key, pop_size, pop_elitism)
     solution = pop.run(task, num_generations) # Returns the solution to the task
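
Once pop.run returns, the winning genome can be decoded and queried the same way the genomes inside the task function are. A minimal sketch, assuming pop.run hands back the best genome as the comment above suggests:

     # Decode the returned genome and query it on one of the task inputs
     best_cppn = CPPN.create(solution)
     best_substrate = decode(best_cppn, sub_in_dims, sub_o_dims, sub_sh_dims)
     actual = best_substrate.activate((2.0, 1.0))[0] # Task input 2.0 plus bias 1.0
     print('2.0 ->', actual) # Ideally close to the expected output of 4.0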

Primary Modules

These modules are associated with the primary function of the DeepHyperNEAT (DHN) algorithm.

genome.py

Contains all functionality of the genome, a Compositional Pattern Producing Network (CPPN) and its mutation operators.

phenomes.py

Contains the feed-forward and recurrent neural network representations used for the CPPN and the Substrate.

population.py

Contains all functionality and information of the populations used in DHN.

activations.py

A library of activation functions that can be used for the CPPN and Substrate.
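
The exact contents of activations.py are not listed here, but an activation library of this kind is typically a small registry mapping names to functions so that a CPPN node or substrate layer can look its activation up by name. A rough, purely illustrative sketch (the names below are placeholders, not the module's actual identifiers):

     import math

     # Illustrative registry of activation functions keyed by name
     def sigmoid(x):
         return 1.0 / (1.0 + math.exp(-x))

     def gauss(x):
         return math.exp(-0.5 * x * x)

     activations = {
         'sigmoid': sigmoid,
         'gauss': gauss,
         'relu': lambda x: max(0.0, x),
         'sin': math.sin,
     }

     act = activations['sigmoid'] # Look an activation up by name
     print(act(0.0)) # 0.5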

reproduction.py

Contains all functionality needed for the reproductive behavior in DHN.

species.py

Contains all functionality needed for speciation in DHN.

stagnation.py

Contains all functionality needed for stagnation schemes used in speciation.

decode.py

Contains all functionality needed to decode a given CPPN into a Substrate.
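
decode is the bridge used in the task-file example above to turn a CPPN into a queryable substrate. Restating that call with each argument annotated (the values and argument order are the ones from the usage section):

     # cppn: a FeedForwardCPPN created from a genome, as in the task function above
     substrate = decode(cppn,          # CPPN to decode
                        sub_in_dims,   # input dimensions, e.g. [1,2]
                        sub_o_dims,    # output dimensions, e.g. 1
                        sub_sh_dims)   # sheet dimensions, e.g. [1,3] (listed as optional above)
     output = substrate.activate((2.0, 1.0))[0] # Query with an input value plus bias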

Secondary Modules

These modules provide secondary functionality such as reporting evolutionary statistics, visualizing the CPPN and Substrate, and supplying various utility functions used throughout the primary modules.

reporters.py

Contains various functions for reporting evolutionary statistics during and after an evolutionary run.

visualize.py

Contains functions for visualizing a CPPN or Substrate.
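
draw_net is the helper imported (optionally) in the task-file example above. Its exact signature is not shown in this README, so the call below is only a hedged sketch; in particular, the filename keyword is an assumption rather than a documented parameter:

     # Hypothetical usage: render the CPPN of the returned solution to a file
     cppn = CPPN.create(solution)
     draw_net(cppn, filename='solution_cppn')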

util.py

Contains common functions and iterators used throughout DHN.

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].