
henrygouk / dopt

Licence: BSD-3-Clause
A numerical optimisation and deep learning framework for D.

Projects that are alternatives to or similar to dopt

Owl
Owl - OCaml Scientific and Engineering Computing @ http://ocaml.xyz
Stars: ✭ 919 (+3182.14%)
Mutual labels:  optimization, automatic-differentiation
autodiff
A .NET library that provides fast, accurate and automatic differentiation (computes derivative / gradient) of mathematical functions.
Stars: ✭ 69 (+146.43%)
Mutual labels:  optimization, automatic-differentiation
Galacticoptim.jl
Local, global, and beyond optimization for scientific machine learning (SciML)
Stars: ✭ 155 (+453.57%)
Mutual labels:  optimization, automatic-differentiation
Pennylane
PennyLane is a cross-platform Python library for differentiable programming of quantum computers. Train a quantum computer the same way as a neural network.
Stars: ✭ 800 (+2757.14%)
Mutual labels:  optimization, automatic-differentiation
AbstractOperators.jl
Abstract operators for large scale optimization in Julia
Stars: ✭ 26 (-7.14%)
Mutual labels:  optimization, automatic-differentiation
Adcme.jl
Automatic Differentiation Library for Computational and Mathematical Engineering
Stars: ✭ 106 (+278.57%)
Mutual labels:  optimization, automatic-differentiation
Aerosandbox
Aircraft design optimization made fast through modern automatic differentiation. Plug-and-play analysis tools for aerodynamics, propulsion, structures, trajectory design, and much, much more.
Stars: ✭ 193 (+589.29%)
Mutual labels:  optimization, automatic-differentiation
PyDE
Differential evolution global optimization in Python.
Stars: ✭ 28 (+0%)
Mutual labels:  optimization
DotNet.SystemCollections.Analyzers
A set of code analyzers & code fix providers to help developers use the proper .NET Collection & API in their algorithms
Stars: ✭ 72 (+157.14%)
Mutual labels:  optimization
optimization
Routing optimization module for Itinero.
Stars: ✭ 47 (+67.86%)
Mutual labels:  optimization
gams.jl
A MathOptInterface Optimizer to solve JuMP models using GAMS
Stars: ✭ 27 (-3.57%)
Mutual labels:  optimization
osprey
🦅Hyperparameter optimization for machine learning pipelines 🦅
Stars: ✭ 71 (+153.57%)
Mutual labels:  optimization
noisyopt
Python library for optimizing noisy functions.
Stars: ✭ 73 (+160.71%)
Mutual labels:  optimization
structural-imbalance
Demo for analyzing the structural imbalance on a signed social network.
Stars: ✭ 22 (-21.43%)
Mutual labels:  optimization
ultraopt
Distributed asynchronous hyperparameter optimization, more powerful than HyperOpt.
Stars: ✭ 93 (+232.14%)
Mutual labels:  optimization
RcppNumerical
Rcpp Integration for Numerical Computing Libraries
Stars: ✭ 52 (+85.71%)
Mutual labels:  optimization
FwiFlow.jl
Elastic Full Waveform Inversion for Subsurface Flow Problems Using Intrusive Automatic Differentiation
Stars: ✭ 24 (-14.29%)
Mutual labels:  automatic-differentiation
SumOfSquares.jl
Sum of Squares Programming for Julia
Stars: ✭ 96 (+242.86%)
Mutual labels:  optimization
YaoBlocks.jl
Standard basic quantum circuit simulator building blocks. (Archived: moved into Yao.jl.)
Stars: ✭ 26 (-7.14%)
Mutual labels:  automatic-differentiation
RazorHtmlMinifier.Mvc5
↘️ Trivial compile-time Razor HTML Minifier for ASP.NET MVC 5.
Stars: ✭ 31 (+10.71%)
Mutual labels:  optimization

dopt

(Badges: DUB package, Travis-CI build status)

A numerical optimisation and deep learning framework for D.

Current features include:

  • Ability to construct symbolic representations of tensor-valued functions
  • Basic arithmetic and mathematical operations (add, sub, mul, div, abs, log, exp, ...)
  • Basic matrix operations (multiplication, transpose)
  • Reverse-mode automatic differentiation (see the sketch after this list)
  • Neural network primitives
  • Neural network construction utilities
  • Several prebuilt models (VGG, Wide ResNet)
  • Framework for adding third-party operations and their derivatives, with the ability to register implementations for both the CPU and CUDA backends
  • Online optimisation algorithms: SGD, ADAM, AMSGrad, and more to come (the update rule they share is illustrated after this list)
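
To make the first few items concrete, here is a minimal sketch of how constructing a symbolic function and differentiating it might look. The identifiers used below (float32, sum, grad) are assumptions about dopt's core API made for illustration only; consult the docs for the real names and signatures.

import std.stdio : writeln;
import dopt.core; // assumed to expose the symbolic graph API

void main()
{
    // A symbolic 3-element float32 variable (constructor name assumed).
    auto x = float32([3]);

    // A symbolic tensor-valued function f(x) = x*x + x + x, built purely
    // from the basic arithmetic operations listed above.
    auto f = x * x + x + x;

    // Reverse-mode automatic differentiation: grad is assumed to take a
    // scalar objective and return one derivative operation per variable.
    auto dfdx = grad(sum(f), [x])[0];

    writeln("built symbolic graphs for f and df/dx");
}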

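The online optimisation algorithms in the last item all refine the basic stochastic gradient descent update θ ← θ − η∇L(θ). The following self-contained snippet (plain D, no dopt involved) shows that update minimising a toy quadratic:

import std.stdio : writefln;

void main()
{
    double theta = 5.0;         // parameter being optimised
    immutable double eta = 0.1; // learning rate

    // Minimise L(θ) = (θ − 3)², whose gradient is 2(θ − 3).
    foreach (step; 0 .. 100)
    {
        immutable double gradient = 2.0 * (theta - 3.0);
        theta -= eta * gradient; // θ ← θ − η∇L(θ)
    }

    writefln("theta = %s (optimum is 3)", theta);
}
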
The project is still in the early stages, and some things might not work properly yet. Some planned future features include:

  • The ability to add optimisation passes to the CPU and CUDA backends
  • More utilities for training deep networks (data loaders, standard training loops, etc.)

Docs

Documentation can be found here. A brief outline of how to use this framework for deep learning is provided here.

Using

The easiest way to use dopt is by adding it as a dependency in your project's dub configuration file. See dub's getting started page for more information about how to do this.

If you want to take advantage of the CUDA backend (currently required for most neural network operations) then you should also ensure that the cuda configuration is used. This is the sort of thing you would end up putting in your dub.json file:

"dependencies": {
    "dopt": "~>0.3.17"
},
"subConfigurations": {
    "dopt": "cuda"
}
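
For context, a complete minimal dub.json embedding the fragment above might look like the following; the package name myproject is a placeholder.

{
    "name": "myproject",
    "dependencies": {
        "dopt": "~>0.3.17"
    },
    "subConfigurations": {
        "dopt": "cuda"
    }
}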

Example

Examples for training networks on MNIST, CIFAR-10, CIFAR-100, and SINS-10 are given in the examples/ folder.
