
thomas-young-2013 / mindware

License: MIT License
An efficient open-source AutoML system for automating the machine learning lifecycle, including feature engineering, neural architecture search, and hyper-parameter tuning.

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to mindware

Auto Sklearn
Automated Machine Learning with scikit-learn
Stars: ✭ 5,916 (+17300%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning, automl, automated-machine-learning
Nni
An open source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression and hyper-parameter tuning.
Stars: ✭ 10,698 (+31364.71%)
Mutual labels:  hyperparameter-optimization, feature-engineering, bayesian-optimization, automl, automated-machine-learning
Smac3
Sequential Model-based Algorithm Configuration
Stars: ✭ 564 (+1558.82%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning, automl, automated-machine-learning
Hyperactive
A hyperparameter optimization and data collection toolbox for convenient and fast prototyping of machine-learning models.
Stars: ✭ 182 (+435.29%)
Mutual labels:  hyperparameter-optimization, feature-engineering, bayesian-optimization, automated-machine-learning
Auptimizer
An automatic ML model optimization tool.
Stars: ✭ 166 (+388.24%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning, automl, automated-machine-learning
Lale
Library for Semi-Automated Data Science
Stars: ✭ 198 (+482.35%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning, automl, automated-machine-learning
Auto ml
[UNMAINTAINED] Automated machine learning for analytics & production
Stars: ✭ 1,559 (+4485.29%)
Mutual labels:  hyperparameter-optimization, feature-engineering, automl, automated-machine-learning
Hpbandster
A distributed Hyperband implementation on steroids
Stars: ✭ 456 (+1241.18%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, automl, automated-machine-learning
Autogluon
AutoGluon: AutoML for Text, Image, and Tabular Data
Stars: ✭ 3,920 (+11429.41%)
Mutual labels:  hyperparameter-optimization, ensemble-learning, automl, automated-machine-learning
Mljar Supervised
Automated Machine Learning Pipeline with Feature Engineering and Hyper-Parameters Tuning 🚀
Stars: ✭ 961 (+2726.47%)
Mutual labels:  hyperparameter-optimization, feature-engineering, automl, automated-machine-learning
Tpot
A Python Automated Machine Learning tool that optimizes machine learning pipelines using genetic programming.
Stars: ✭ 8,378 (+24541.18%)
Mutual labels:  hyperparameter-optimization, feature-engineering, automl, automated-machine-learning
Automl alex
State-of-the-art Automated Machine Learning Python library for Tabular Data
Stars: ✭ 132 (+288.24%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning, automl
syne-tune
Large scale and asynchronous Hyperparameter Optimization at your fingertips.
Stars: ✭ 105 (+208.82%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
ultraopt
Distributed Asynchronous Hyperparameter Optimization better than HyperOpt.
Stars: ✭ 93 (+173.53%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, automl
Milano
Milano is a tool for automating hyper-parameters search for your models on a backend of your choice.
Stars: ✭ 140 (+311.76%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning, automl
Hypernets
A General Automated Machine Learning framework to simplify the development of End-to-end AutoML toolkits in specific domains.
Stars: ✭ 221 (+550%)
Mutual labels:  hyperparameter-optimization, hyperparameter-tuning, automl
mango
Parallel Hyperparameter Tuning in Python
Stars: ✭ 241 (+608.82%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
EvolutionaryForest
An open source Python library for automated feature engineering based on Genetic Programming
Stars: ✭ 56 (+64.71%)
Mutual labels:  feature-engineering, automl, automated-machine-learning
differential-privacy-bayesian-optimization
This repo contains the underlying code for all the experiments from the paper: "Automatic Discovery of Privacy-Utility Pareto Fronts"
Stars: ✭ 22 (-35.29%)
Mutual labels:  hyperparameter-optimization, bayesian-optimization, hyperparameter-tuning
AutoPrognosis
Codebase for "AutoPrognosis: Automated Clinical Prognostic Modeling via Bayesian Optimization", ICML 2018.
Stars: ✭ 47 (+38.24%)
Mutual labels:  bayesian-optimization, automl, automated-machine-learning

[Badges: license · build status · issues · bugs · pull requests · version · chat (https://gitter.im/volcano-ml) · documentation status]

MindWare Documentation | MindWare Documentation (Chinese)

MindWare: Efficient Open-source AutoML System.

MindWare is an efficient open-source system that helps users automate the process of 1) data pre-processing, 2) feature engineering, 3) algorithm selection, 4) architecture design, 5) hyper-parameter tuning, and 6) model ensembling. It improves its AutoML performance by decomposing the entire large AutoML search space into smaller sub-spaces and solving each sub-problem jointly and efficiently. MindWare is designed and developed by the AutoML team from the DAIR Lab at Peking University. The goal is to make machine learning easier to apply both in industry and academia, and to help facilitate data science.

Who Should Consider MindWare

  • Non-expert users who want to use machine learning techniques in their applications.
  • ML Platform owners who want to support AutoML in their platform.
  • Researchers and data scientists who want to experiment with new AutoML algorithms, be it a hyper-parameter tuning algorithm, a neural architecture search method, etc.

Design Principles

  • User friendliness. MindWare requires little human assistance. To use MindWare, users can define a task by writing only a few lines of code, regardless of the technical details of how the system executes it.
  • High extensibility. New state-of-the-art ML algorithms or feature engineering operations can be added to the system. The decomposition techniques in MindWare ensure that the best configurations can still be found efficiently over the enlarged search space.
  • Advanced characteristics. MindWare provides special support for large datasets. In addition, MindWare leverages transfer-learning and meta-learning techniques to give AutoML more intelligent behaviors.

MindWare Capability at a Glance

Installation

MindWare requires Python >= 3.6. There are two ways to install MindWare:

Installation via pip

MindWare is available on PyPI. You can install it by typing:

pip install mindware
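
If you want to keep MindWare and its dependencies isolated from your system packages, a common approach is to install it into a virtual environment first (standard Python tooling; the environment name below is just an example):

python -m venv mindware-env
source mindware-env/bin/activate   # on Windows: mindware-env\Scripts\activate
pip install mindware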

Manual installation

If you want to try the latest version, please manually install MindWare from source code by:

git clone https://github.com/thomas-young-2013/mindware.git && cd mindware
cat requirements/main.txt | xargs -n 1 -L 1 pip install
python setup.py install --user
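
A quick way to check that the installation succeeded is to import the package from the command line (this only verifies that the module is importable; it is not an official MindWare command):

python -c "import mindware"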

For more detailed installation instructions, please refer to our Installation Guide Document.

Example

Here is a brief example that uses the package.

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from mindware.utils.data_manager import DataManager
from mindware.estimators import Classifier

# Load the dataset and hold out a stratified test split.
iris = load_iris()
X, y = iris.data, iris.target
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.33, random_state=1, stratify=y)

# Wrap the arrays into MindWare data nodes.
dm = DataManager(X_train, y_train)
train_data = dm.get_data_node(X_train, y_train)
test_data = dm.get_data_node(X_test, y_test)

# Search for the best model within the given time budget.
clf = Classifier(time_limit=3600)
clf.fit(train_data)

# Predict labels for the held-out test data.
pred = clf.predict(test_data)
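
Since clf.predict returns one prediction per row of the test data node, you can evaluate the result with ordinary scikit-learn utilities. The snippet below is an illustrative follow-up, not part of the MindWare API:

from sklearn.metrics import accuracy_score

# Compare predictions against the held-out labels from the split above.
print('Test accuracy: %.3f' % accuracy_score(y_test, pred))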

For more details and features, please read the documentation and examples.

Releases and Contributing

MindWare has a frequent release cycle. Please let us know if you encounter a bug by filing an issue.

We appreciate all contributions. If you are planning to contribute bug-fixes, please do so without further discussion.

If you plan to contribute new features, new modules, etc., please first open an issue or reuse an existing issue, and discuss the feature with us.

To learn more about making a contribution to MindWare, please refer to our How-to contribution page.

We appreciate all contributions and thank all the contributors!

Feedback

Related Projects

Aiming for openness and advancing state-of-the-art technology, we have also released several other open-source projects.

  • OpenBOX: an open source system and service to efficiently solve generalized blackbox optimization problems.

We encourage researchers to leverage the project to accelerate AI development and research.

Related Publications

VolcanoML: Speeding up End-to-End AutoML via Scalable Search Space Decomposition
Yang Li, Yu Shen, Wentao Zhang, Jiawei Jiang, Bolin Ding, Yaliang Li, Jingren Zhou, Zhi Yang, Wentao Wu, Ce Zhang and Bin Cui
International Conference on Very Large Data Bases (VLDB 2021). https://arxiv.org/abs/2107.08861

Efficient Automatic CASH via Rising Bandits
Yang Li, Jiawei Jiang, Jinyang Gao, Yingxia Shao, Ce Zhang and Bin Cui
Proceedings of the AAAI Conference on Artificial Intelligence (AAAI 2020). https://ojs.aaai.org/index.php/AAAI/article/view/5910

MFES-HB: Efficient Hyperband with Multi-Fidelity Quality Measurements
Yang Li, Yu Shen, Jiawei Jiang, Jinyang Gao, Ce Zhang and Bin Cui
Proceedings of the AAAI Conference on Artificial Intelligence (AAAI 2021). https://arxiv.org/abs/2012.03011

OpenBox: A Generalized Black-box Optimization Service
Yang Li, Yu Shen, Wentao Zhang, Yuanwei Chen, Huaijun Jiang, Mingchao Liu, Jiawei Jiang, Jinyang Gao, Wentao Wu, Zhi Yang, Ce Zhang and Bin Cui
ACM SIGKDD Conference on Knowledge Discovery and Data Mining (SIGKDD 2021). https://arxiv.org/abs/2106.00421

License

The entire codebase is released under the MIT license.
