
ypeleg / Hungabunga

License: MIT
HungaBunga: Brute-Force all sklearn models with all parameters using .fit .predict!

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Hungabunga

Machinejs
[UNMAINTAINED] Automated machine learning- just give it a data file! Check out the production-ready version of this project at ClimbsRocks/auto_ml
Stars: ✭ 412 (-32.9%)
Mutual labels:  kaggle, scikit-learn, automl
Sklearn Classification
Data Science Notebook on a Classification Task, using sklearn and Tensorflow.
Stars: ✭ 518 (-15.64%)
Mutual labels:  learning, sklearn, machine
Igel
a delightful machine learning tool that allows you to train, test, and use models without writing code
Stars: ✭ 2,956 (+381.43%)
Mutual labels:  scikit-learn, automl, sklearn
Amazon Sagemaker Examples
Example 📓 Jupyter notebooks that demonstrate how to build, train, and deploy machine learning models using 🧠 Amazon SageMaker.
Stars: ✭ 6,346 (+933.55%)
Mutual labels:  learning, machine
lobe
Lobe is the world's first AI paralegal.
Stars: ✭ 22 (-96.42%)
Mutual labels:  learning, machine
sklearn-feature-engineering
Feature engineering with sklearn
Stars: ✭ 114 (-81.43%)
Mutual labels:  sklearn, kaggle
Kaio-machine-learning-human-face-detection
Machine Learning project a case study focused on the interaction with digital characters, using a character called "Kaio", which, based on the automatic detection of facial expressions and classification of emotions, interacts with humans by classifying emotions and imitating expressions
Stars: ✭ 18 (-97.07%)
Mutual labels:  scikit-learn, sklearn
Sklearn Evaluation
Machine learning model evaluation made easy: plots, tables, HTML reports, experiment tracking and Jupyter notebook analysis.
Stars: ✭ 294 (-52.12%)
Mutual labels:  scikit-learn, sklearn
Sharplearning
Machine learning for C# .Net
Stars: ✭ 294 (-52.12%)
Mutual labels:  learning, machine
Autoviz
Automatically Visualize any dataset, any size with a single line of code. Created by Ram Seshadri. Collaborators Welcome. Permission Granted upon Request.
Stars: ✭ 310 (-49.51%)
Mutual labels:  scikit-learn, automl
Data Science Ipython Notebooks
Data science Python notebooks: Deep learning (TensorFlow, Theano, Caffe, Keras), scikit-learn, Kaggle, big data (Spark, Hadoop MapReduce, HDFS), matplotlib, pandas, NumPy, SciPy, Python essentials, AWS, and various command lines.
Stars: ✭ 22,048 (+3490.88%)
Mutual labels:  kaggle, scikit-learn
Moviebox
Machine learning movie recommending system
Stars: ✭ 504 (-17.92%)
Mutual labels:  learning, machine
Recipe
Automated machine learning (AutoML) with grammar-based genetic programming
Stars: ✭ 42 (-93.16%)
Mutual labels:  scikit-learn, automl
HumanOrRobot
a solution for competition of kaggle `Human or Robot`
Stars: ✭ 16 (-97.39%)
Mutual labels:  sklearn, kaggle
python3-docker-devenv
Docker Start Guide with Python Development Environment
Stars: ✭ 13 (-97.88%)
Mutual labels:  scikit-learn, sklearn
skippa
SciKIt-learn Pipeline in PAndas
Stars: ✭ 33 (-94.63%)
Mutual labels:  scikit-learn, sklearn
Autogluon
AutoGluon: AutoML for Text, Image, and Tabular Data
Stars: ✭ 3,920 (+538.44%)
Mutual labels:  scikit-learn, automl
codeflare
Simplifying the definition and execution, scaling and deployment of pipelines on the cloud.
Stars: ✭ 163 (-73.45%)
Mutual labels:  sklearn, automl
simple-image-classifier
Simple image classifier microservice using tensorflow and sanic
Stars: ✭ 22 (-96.42%)
Mutual labels:  learning, machine
Profanity Check
A fast, robust Python library to check for offensive language in strings.
Stars: ✭ 354 (-42.35%)
Mutual labels:  scikit-learn, sklearn

Hunga-Bunga

Brute-force all scikit-learn models and all scikit-learn parameters with fit and predict.


Let's brute-force all sklearn models with all of sklearn's parameters! Ahhh Hunga Bunga!!
from hunga_bunga import HungaBungaClassifier, HungaBungaRegressor
And then simply:
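clf = HungaBungaClassifier()
clf.fit(x, y)
clf.predict(x)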


What?

Yes.

No! Really! What?

Many believe that

most of the work of supervised (non-deep) Machine Learning lies in feature engineering, whereas the model-selection process is just running through all the models or just taking xgboost.

So here is an automation for that.

HOW IT WORKS

Runs through all sklearn models (both classification and regression) with all possible hyperparameters, and ranks them using cross-validation.
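
In spirit, this is an exhaustive search over (model, hyperparameter) combinations scored by cross-validation. A rough standalone sketch of the idea (not the package's actual code; the two candidate grids and the iris dataset here are purely illustrative):

from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier

x, y = load_iris(return_X_y=True)

# Illustrative candidate grids; the real package covers far more models and parameters.
candidates = [
    (LogisticRegression(max_iter=1000), {'C': [0.1, 1.0, 10.0]}),
    (RandomForestClassifier(), {'n_estimators': [10, 100], 'max_depth': [None, 5]}),
]

results = []
for model, grid in candidates:
    search = GridSearchCV(model, grid, cv=5)  # cross-validated grid search per model
    search.fit(x, y)
    results.append((search.best_score_, search.best_estimator_))

best_score, best_model = max(results, key=lambda r: r[0])
print(best_model, best_score)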

MODELS

Runs all the models available in scikit-learn for supervised learning; a short sketch of enumerating that pool follows the list below. The categories are:

  • Generalized Linear Models
  • Kernel Ridge
  • Support Vector Machines
  • Nearest Neighbors
  • Gaussian Processes
  • Naive Bayes
  • Trees
  • Neural Networks
  • Ensemble methods
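
For reference, recent scikit-learn releases let you enumerate this pool of supervised estimators directly; a minimal sketch (only the type filtering shown here is assumed):

from sklearn.utils import all_estimators

# All classifiers and regressors shipped with scikit-learn,
# i.e. roughly the pool of models being brute-forced.
classifiers = all_estimators(type_filter='classifier')
regressors = all_estimators(type_filter='regressor')

for name, Estimator in classifiers:
    print(name)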

Note: A few models were dropped (very few of them..), and some still crash or raise exceptions from time to time. It takes REALLY long to test this out, so clearing out the exceptions took me a while.

Installation

pip install hunga-bunga

Dependencies


- Python (>= 2.7)
- NumPy (>= 1.11.0)
- SciPy (>= 0.17.0)
- joblib (>= 0.11)
- scikit-learn (>= 0.20.0)
- tabulate (>= 0.8.2)
- tqdm (>= 4.28.1)

Option I (Recommended): brain = False

Use it as you would any other sklearn model:

clf = HungaBungaClassifier()
clf.fit(x, y)
clf.predict(x)

And import it from here:

from hunga_bunga import HungaBungaClassifier, HungaBungaRegressor
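
An end-to-end run might look like this (a sketch using a standard sklearn toy dataset; the classifier interface is exactly the one shown above):

from sklearn.datasets import load_iris
from hunga_bunga import HungaBungaClassifier

x, y = load_iris(return_X_y=True)

clf = HungaBungaClassifier()   # tries all models and hyperparameters, ranked by cross-validation
clf.fit(x, y)
predictions = clf.predict(x)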

Option II: brain = True

Use it as you would any other sklearn model:

clf = HungaBungaClassifier(brain=True)
clf.fit(x, y)

The output looks like this:

Model                          Accuracy    Time/clf (s)
SGDClassifier                  0.967       0.001
LogisticRegression             0.940       0.001
Perceptron                     0.900       0.001
PassiveAggressiveClassifier    0.967       0.001
MLPClassifier                  0.827       0.018
KMeans                         0.580       0.010
KNeighborsClassifier           0.960       0.000
NearestCentroid                0.933       0.000
RadiusNeighborsClassifier      0.927       0.000
SVC                            0.960       0.000
NuSVC                          0.980       0.001
LinearSVC                      0.940       0.005
RandomForestClassifier         0.980       0.015
DecisionTreeClassifier         0.960       0.000
ExtraTreesClassifier           0.993       0.002

The winner is: ExtraTreesClassifier with score 0.993.
