
ytopt-team / ytopt

License: BSD-2-Clause
ytopt: machine-learning-based search methods for autotuning

Programming Languages

C, SWIG, Python, Jupyter Notebook, Shell, Makefile

Projects that are alternatives of or similar to ytopt

deodorant
Deodorant: Solving the problems of Bayesian Optimization
Stars: ✭ 15 (-11.76%)
Mutual labels:  bayesian-optimization
paper-synthesizing-benchmarks
πŸ“ "Synthesizing Benchmarks for Predictive Modeling" (πŸ₯‡ CGO'17 Best Paper)
Stars: ✭ 21 (+23.53%)
Mutual labels:  autotuning
phoenics
Phoenics: Bayesian optimization for efficient experiment planning
Stars: ✭ 68 (+300%)
Mutual labels:  bayesian-optimization
ultraopt
Distributed Asynchronous Hyperparameter Optimization better than HyperOpt. ζ―”HyperOptζ›΄εΌΊηš„εˆ†εΈƒεΌεΌ‚ζ­₯θΆ…ε‚δΌ˜εŒ–εΊ“γ€‚
Stars: ✭ 93 (+447.06%)
Mutual labels:  bayesian-optimization
xgboost-lightgbm-hyperparameter-tuning
Bayesian Optimization and Grid Search for xgboost/lightgbm
Stars: ✭ 40 (+135.29%)
Mutual labels:  bayesian-optimization
bayex
Bayesian Optimization in JAX
Stars: ✭ 24 (+41.18%)
Mutual labels:  bayesian-optimization
mindware
An efficient open-source AutoML system for automating machine learning lifecycle, including feature engineering, neural architecture search, and hyper-parameter tuning.
Stars: ✭ 34 (+100%)
Mutual labels:  bayesian-optimization
sequential-gallery
Sequential Gallery for Interactive Visual Design Optimization [SIGGRAPH 2020]
Stars: ✭ 15 (-11.76%)
Mutual labels:  bayesian-optimization
hyper-engine
Python library for Bayesian hyper-parameters optimization
Stars: ✭ 80 (+370.59%)
Mutual labels:  bayesian-optimization
open-box
Generalized and Efficient Blackbox Optimization System [SIGKDD'21].
Stars: ✭ 174 (+923.53%)
Mutual labels:  bayesian-optimization
FLEXS
Fitness landscape exploration sandbox for biological sequence design.
Stars: ✭ 92 (+441.18%)
Mutual labels:  bayesian-optimization
Hyperopt.jl
Hyperparameter optimization in Julia.
Stars: ✭ 144 (+747.06%)
Mutual labels:  bayesian-optimization
halo
πŸ˜‡ Wholly Adaptive LLVM Optimizer
Stars: ✭ 22 (+29.41%)
Mutual labels:  autotuning
GPim
Gaussian processes and Bayesian optimization for images and hyperspectral data
Stars: ✭ 29 (+70.59%)
Mutual labels:  bayesian-optimization
AutoOED
AutoOED: Automated Optimal Experimental Design Platform
Stars: ✭ 87 (+411.76%)
Mutual labels:  bayesian-optimization
keras gpyopt
Using Bayesian Optimization to optimize hyper parameter in Keras-made neural network model.
Stars: ✭ 56 (+229.41%)
Mutual labels:  bayesian-optimization
CamelsOptimizer
Yes, it's a camel case.
Stars: ✭ 17 (+0%)
Mutual labels:  bayesian-optimization
Multiobjective EGO algorithms
The standard and parallel multiobjective EGO algorithms
Stars: ✭ 22 (+29.41%)
Mutual labels:  bayesian-optimization
SG MCMC
Implementation of Stochastic Gradient MCMC algorithms
Stars: ✭ 37 (+117.65%)
Mutual labels:  bayesian-optimization

What is ytopt?

ytopt is a machine-learning-based search software package that samples a small number of input parameter configurations, evaluates them, and progressively fits a surrogate model over the input-output space until the user-defined wall-clock time or maximum number of evaluations is exhausted. The package is built on Bayesian optimization, which applies to a broad class of optimization problems and is especially useful when the objective function is expensive to evaluate. It provides an interface that handles both unconstrained and constrained problems. The software is designed to operate in the manager-worker computational paradigm, where one manager node fits the surrogate model and generates promising input configurations, and worker nodes perform the computationally expensive evaluations and return the outputs to the manager node. The asynchronous aspect of the search allows it to proceed to the next iteration without waiting for all evaluation results. As soon as an evaluation finishes, its data is used to retrain the surrogate model, which is then used to bias the search toward promising configurations.
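The sample-evaluate-refit loop described above can be sketched in plain Python. This is a hedged illustration, not ytopt's actual API: the "surrogate" here is a deliberately trivial nearest-neighbor predictor standing in for the real model, and the objective is a toy function standing in for an expensive benchmark run.

```python
import random

def evaluate(x):
    # Stand-in for an expensive objective (e.g. a kernel's measured runtime).
    return (x - 0.3) ** 2

def surrogate_predict(history, x):
    # Toy surrogate: predict the output of the nearest evaluated input.
    nearest = min(history, key=lambda h: abs(h[0] - x))
    return nearest[1]

def search(n_init=5, n_iter=20, n_candidates=50, seed=0):
    rng = random.Random(seed)
    # 1. Sample a small number of initial configurations and evaluate them.
    history = [(x, evaluate(x)) for x in (rng.random() for _ in range(n_init))]
    for _ in range(n_iter):
        # 2. Score cheap candidate configurations with the surrogate ...
        candidates = [rng.random() for _ in range(n_candidates)]
        x = min(candidates, key=lambda c: surrogate_predict(history, c))
        # 3. ... evaluate the most promising one for real, and
        # 4. refit (here: simply extend) the surrogate with the new data.
        history.append((x, evaluate(x)))
    # Return the best configuration seen so far.
    return min(history, key=lambda h: h[1])

best_x, best_y = search()
print(best_x, best_y)
```

In ytopt the surrogate is a real model, candidate generation is driven by Bayesian optimization, and the evaluations in step 3 are dispatched asynchronously to worker nodes rather than run inline.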

Directory structure

docs/
    Sphinx documentation files
test/
    scripts for running benchmark problems in the problems directory
ytopt/
    scripts that contain the search implementations
ytopt/benchmark/
    a set of problems the user can use to compare our different search algorithms or as examples for building their own problems
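A benchmark problem essentially bundles a tunable input space with an expensive objective. The sketch below shows that shape in plain Python; the class and field names are hypothetical illustrations, not ytopt's actual classes.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Tuple

@dataclass
class Problem:
    # Tunable parameters and their allowed (low, high) integer ranges.
    input_space: Dict[str, Tuple[int, int]]
    # Objective: maps one configuration to a measured output (e.g. runtime).
    objective: Callable[[Dict[str, int]], float]

def run_time(config):
    # Stand-in for compiling and running a benchmark with these parameters;
    # here the "best" runtime is at block_size=32, unroll=4.
    return abs(config["block_size"] - 32) + abs(config["unroll"] - 4)

problem = Problem(
    input_space={"block_size": (1, 64), "unroll": (1, 8)},
    objective=run_time,
)
print(problem.objective({"block_size": 32, "unroll": 4}))
```

In the real package, the input space is expressed with ConfigSpace parameter objects rather than bare tuples, which is why ConfigSpace appears first in the install instructions below.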

Install instructions

The autotuning framework requires the following components: ConfigSpace, CConfigSpace (optional), scikit-optimize, autotune, and ytopt.

  • We recommend creating isolated Python environments on your local machine using conda, for example:
conda create --name ytune python=3.7
conda activate ytune
  • Create a directory for ytopt tutorial as follows:
mkdir ytopt
cd ytopt
git clone https://github.com/ytopt-team/ConfigSpace.git
cd ConfigSpace
pip install -e .
cd ..
git clone https://github.com/ytopt-team/scikit-optimize.git
cd scikit-optimize
pip install -e .
cd ..
git clone -b version1 https://github.com/ytopt-team/autotune.git
cd autotune
pip install -e . 
cd ..
git clone -b main https://github.com/ytopt-team/ytopt.git
cd ytopt
pip install -e .
  • If you encounter an installation error, install psutil, setproctitle, mpich, and mpi4py first, then retry:
conda install -c conda-forge psutil
conda install -c conda-forge setproctitle
conda install -c conda-forge mpich
conda install -c conda-forge mpi4py
pip install -e .
  • [Optional] Install CConfigSpace:
    • Prerequisites: autotools and the GNU Scientific Library (GSL)
      • Ubuntu

        sudo apt-get install autoconf automake libtool libgsl-dev
        
      • MacOS

        brew install autoconf automake libtool gsl
        
    • Build and install the library and Python bindings: the configure command can take an optional --prefix= parameter to specify an install path other than the default (/usr/local). Depending on the chosen location, you may need elevated privileges to run make install.
      git clone [email protected]:argonne-lcf/CCS.git
      cd CCS
      ./autogen.sh
      mkdir build
      cd build
      ../configure
      make
      make install
      cd ../bindings/python
      pip install parglare==0.12.0
      pip install -e .
      
    • Set up the environment: for the Python binding to find the CConfigSpace library, the path to the library install location (/usr/local/lib by default) must be appended to the LD_LIBRARY_PATH environment variable on Linux; on MacOS, the DYLD_LIBRARY_PATH environment variable serves the same purpose. Alternatively, the LIBCCONFIGSPACE_SO_ environment variable can be set to point to the installed libcconfigspace.so file on Linux or libcconfigspace.dylib on MacOS.
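For the default install prefix on Linux, the environment setup described above might look like the following (adjust /usr/local if you passed --prefix= to configure; the MacOS line is shown commented out):

```shell
# Linux: let the Python binding locate libcconfigspace.so.
export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH
# MacOS equivalent:
# export DYLD_LIBRARY_PATH=/usr/local/lib:$DYLD_LIBRARY_PATH
```

Adding these lines to your shell profile makes the setting persistent across sessions.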

Tutorials

Who is responsible?

The core ytopt team is at Argonne National Laboratory:

The convolution-2d tutorial (source and python scripts) is contributed by:

Publications

  • J. Koo, P. Balaprakash, M. Kruse, X. Wu, P. Hovland, and M. Hall, "Customized Monte Carlo Tree Search for LLVM/Polly's Composable Loop Optimization Transformations," in Proceedings of 12th IEEE International Workshop on Performance Modeling, Benchmarking and Simulation of High Performance Computer Systems (PMBS21), pages 82–93, 2021. DOI: 10.1109/PMBS54543.2021.00015
  • X. Wu, M. Kruse, P. Balaprakash, H. Finkel, P. Hovland, V. Taylor, and M. Hall, "Autotuning PolyBench benchmarks with LLVM Clang/Polly loop optimization pragmas using Bayesian optimization (extended version)," Concurrency and Computation: Practice and Experience, vol. 11, 2021. ISSN 1532-0626 DOI: 10.1002/cpe.6683
  • X. Wu, M. Kruse, P. Balaprakash, H. Finkel, P. Hovland, V. Taylor, and M. Hall, "Autotuning PolyBench Benchmarks with LLVM Clang/Polly Loop Optimization Pragmas Using Bayesian Optimization," in Proceedings of 11th IEEE International Workshop on Performance Modeling, Benchmarking and Simulation of High Performance Computer Systems (PMBS20), pages 61–70, 2020. DOI: 10.1109/PMBS51919.2020.00012
  • P. Balaprakash, J. Dongarra, T. Gamblin, M. Hall, J. K. Hollingsworth, B. Norris, and R. Vuduc, "Autotuning in High-Performance Computing Applications," Proceedings of the IEEE, vol. 106, no. 11, 2018. DOI: 10.1109/JPROC.2018.2841200
  • T. Nelson, A. Rivera, P. Balaprakash, M. Hall, P. Hovland, E. Jessup, and B. Norris, "Generating efficient tensor contractions for GPUs," in Proceedings of 44th International Conference on Parallel Processing, pages 969–978, 2015. DOI: 10.1109/ICPP.2015.106

Acknowledgements

  • YTune: Autotuning Compiler Technology for Cross-Architecture Transformation and Code Generation, U.S. Department of Energy Exascale Computing Project (2017--Present)
  • Scalable Data-Efficient Learning for Scientific Domains, U.S. Department of Energy 2018 Early Career Award funded by the Advanced Scientific Computing Research program within the DOE Office of Science (2018--Present)
  • PROTEAS-TUNE, U.S. Department of Energy ASCR Exascale Computing Project (2018--Present)

Copyright and license

TBD
