locuslab / Optnet

License: Apache-2.0
OptNet: Differentiable Optimization as a Layer in Neural Networks


This repository is by Brandon Amos and J. Zico Kolter and contains the PyTorch source code to reproduce the experiments in our ICML 2017 paper OptNet: Differentiable Optimization as a Layer in Neural Networks.

If you find this repository helpful in your publications, please consider citing our paper.

@InProceedings{amos2017optnet,
  title = {{O}pt{N}et: Differentiable Optimization as a Layer in Neural Networks},
  author = {Brandon Amos and J. Zico Kolter},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages = {136--145},
  year = {2017},
  volume = {70},
  series = {Proceedings of Machine Learning Research},
  publisher = {PMLR},
}

Informal Introduction

Mathematical optimization is a well-studied language for expressing solutions to many real-life problems that come up in machine learning and in many other fields, such as mechanics, economics, electrical engineering, operations research, control engineering, geophysics, and molecular modeling. As we build machine learning systems that interact with real data from these fields, we often cannot (though sometimes can) simply "learn away" the optimization sub-problems by adding more layers to our network. Well-defined optimization problems can be added by hand if you have a thorough understanding of your feature space, but often we lack this understanding and resort to automatic feature learning for our tasks.

Before this repository, no modern deep learning library provided a way to add a learnable optimization layer to a model formulation (other than unrolling an optimization procedure, which is inefficient and inexact), making it hard to quickly test whether an optimization layer is a good way of expressing the structure in your data.

See our paper OptNet: Differentiable Optimization as a Layer in Neural Networks and code at locuslab/optnet if you are interested in learning more about our initial exploration in this space of automatically learning quadratic program layers for signal denoising and Sudoku.
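The core idea can be sketched in a few lines. For a QP with only equality constraints, the KKT conditions form a single linear system, so both the solution and its gradient with respect to the problem data are available in closed form. The following is a minimal NumPy illustration of this implicit differentiation, not the repository's qpth implementation; the function names are ours:

```python
import numpy as np

def solve_qp(Q, p, A, b):
    # Equality-constrained QP: min_z 0.5 z^T Q z + p^T z  s.t.  A z = b.
    # The KKT conditions reduce to one linear system in (z, nu).
    n, m = Q.shape[0], A.shape[0]
    K = np.block([[Q, A.T], [A, np.zeros((m, m))]])
    sol = np.linalg.solve(K, np.concatenate([-p, b]))
    return sol[:n], K

def dz_dp(K, n):
    # Differentiating K [z; nu] = [-p; b] with respect to p gives
    # K [dz/dp; dnu/dp] = [-I; 0], so dz/dp = -(K^{-1})[:n, :n].
    return -np.linalg.inv(K)[:n, :n]

rng = np.random.default_rng(0)
n, m = 4, 2
L = rng.standard_normal((n, n))
Q = L @ L.T + n * np.eye(n)      # make Q positive definite
p = rng.standard_normal(n)
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

z, K = solve_qp(Q, p, A, b)
J = dz_dp(K, n)

# Sanity check: the analytic Jacobian matches a finite-difference
# estimate (z is linear in p, so the estimate is essentially exact).
eps = 1e-6
J_fd = np.column_stack([
    (solve_qp(Q, p + eps * e, A, b)[0] - z) / eps
    for e in np.eye(n)
])
```

qpth generalizes this idea to inequality-constrained QPs, batching, and GPU execution inside PyTorch's autograd.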

Setup and Dependencies

  • Python/numpy/PyTorch
  • qpth: Our fast QP solver for PyTorch released in conjunction with this paper.
  • bamos/block: Our intelligent block matrix library for numpy, PyTorch, and beyond.
  • Optional: bamos/setGPU: A small library to set CUDA_VISIBLE_DEVICES on multi-GPU systems.
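
Assuming the dependencies are published on PyPI under these names (check each project's README if installation fails), setup might look like:

```shell
pip install numpy torch qpth block
pip install setGPU   # optional, for multi-GPU systems only
```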

Denoising Experiments

denoising
├── create.py - Script to create the denoising dataset.
├── plot.py - Plot the results from any experiment.
├── main.py - Run the FC baseline and OptNet denoising experiments. (See arguments.)
├── main.tv.py - Run the TV baseline denoising experiment.
└── run-exps.sh - Run all experiments. (May need to uncomment some lines.)
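
The scripts are meant to be run from inside the experiment directory; the exact command-line arguments are defined in each script's argument parser and in run-exps.sh, so the bare invocations below are only illustrative:

```shell
cd denoising
python create.py     # generate the denoising dataset
python main.py       # train the FC baseline and OptNet models
python plot.py       # plot the results
```

The sudoku and cls directories follow a similar create/train/plot pattern (cls uses train.py and has no dataset-creation step).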

Sudoku Experiments

  • The dataset we used in our experiments is available in sudoku/data.
sudoku
├── create.py - Script to create the dataset.
├── plot.py - Plot the results from any experiment.
├── main.py - Run the FC baseline and OptNet Sudoku experiments. (See arguments.)
└── models.py - Models used for Sudoku.

Classification Experiments

cls
├── train.py - Run the FC baseline and OptNet classification experiments. (See arguments.)
├── plot.py - Plot the results from any experiment.
└── models.py - Models used for classification.

Acknowledgments

The rapid development of this work would not have been possible without the immense amount of help from the PyTorch team, particularly Soumith Chintala and Adam Paszke.

Licensing

Unless otherwise stated, the source code is copyright Carnegie Mellon University and licensed under the Apache 2.0 License.
