
xuyxu / Ensemble Pytorch

License: BSD-3-Clause
A unified ensemble framework for PyTorch to improve the performance and robustness of your deep learning model.

Programming Languages

Python
139,335 projects - #7 most used programming language

Projects that are alternatives to or similar to Ensemble Pytorch

Blinkdl
A minimalist deep learning library in JavaScript using WebGL + asm.js. Run convolutional neural networks in your browser.
Stars: ✭ 69 (-54.9%)
Mutual labels:  neural-networks, deeplearning
Deeplearning4j
All DeepLearning4j projects go here.
Stars: ✭ 68 (-55.56%)
Mutual labels:  neural-networks, deeplearning
Mit Deep Learning
Tutorials, assignments, and competitions for MIT Deep Learning related courses.
Stars: ✭ 8,912 (+5724.84%)
Mutual labels:  neural-networks, deeplearning
Deep Kernel Gp
Deep Kernel Learning. Gaussian Process Regression where the input is a neural network mapping of x that maximizes the marginal likelihood
Stars: ✭ 58 (-62.09%)
Mutual labels:  neural-networks, deeplearning
Bidaf Keras
Bidirectional Attention Flow for Machine Comprehension implemented in Keras 2
Stars: ✭ 60 (-60.78%)
Mutual labels:  neural-networks, deeplearning
Machine Learning Tutorials
machine learning and deep learning tutorials, articles and other resources
Stars: ✭ 11,692 (+7541.83%)
Mutual labels:  neural-networks, deeplearning
Machine Learning
A repository created to help machine learning beginners and those preparing a study group. (This repository is intended to help anyone interested in studying machine learning.)
Stars: ✭ 705 (+360.78%)
Mutual labels:  pytorch-tutorial, neural-networks
Pytorchnlpbook
Code and data accompanying Natural Language Processing with PyTorch published by O'Reilly Media https://nlproc.info
Stars: ✭ 1,390 (+808.5%)
Mutual labels:  pytorch-tutorial, neural-networks
Ssd Pytorch
SSD: Single Shot MultiBox Detector PyTorch implementation focusing on simplicity
Stars: ✭ 107 (-30.07%)
Mutual labels:  neural-networks, deeplearning
Faceswap
Deepfakes Software For All
Stars: ✭ 39,911 (+25985.62%)
Mutual labels:  neural-networks, deeplearning
Artificialintelligenceengines
Computer code collated for use with Artificial Intelligence Engines book by JV Stone
Stars: ✭ 35 (-77.12%)
Mutual labels:  neural-networks, deeplearning
Basic reinforcement learning
An introductory series to Reinforcement Learning (RL) with comprehensive step-by-step tutorials.
Stars: ✭ 826 (+439.87%)
Mutual labels:  neural-networks, deeplearning
Aorun
Deep Learning over PyTorch
Stars: ✭ 61 (-60.13%)
Mutual labels:  neural-networks, deeplearning
Quickdraw
Implementation of Quickdraw - an online game developed by Google
Stars: ✭ 805 (+426.14%)
Mutual labels:  neural-networks, deeplearning
Get started with deep learning for text with allennlp
Getting started with AllenNLP and PyTorch by training a tweet classifier
Stars: ✭ 69 (-54.9%)
Mutual labels:  pytorch-tutorial, neural-networks
Pytorch 101 Tutorial Series
PyTorch 101 series covering everything from the basic building blocks all the way to building custom architectures.
Stars: ✭ 136 (-11.11%)
Mutual labels:  pytorch-tutorial, neural-networks
Learn Data Science For Free
This repository is a combination of different resources lying scattered all over the internet. The reason for making such a repository is to combine all the valuable resources in a sequential manner, so that it helps every beginner who is in search of free and structured learning resources for Data Science. For constant updates, follow me in …
Stars: ✭ 4,757 (+3009.15%)
Mutual labels:  neural-networks, deeplearning
Deepfacelab
DeepFaceLab is the leading software for creating deepfakes.
Stars: ✭ 30,308 (+19709.15%)
Mutual labels:  neural-networks, deeplearning
Mit Deep Learning Book Pdf
MIT Deep Learning Book in PDF format (complete and parts) by Ian Goodfellow, Yoshua Bengio and Aaron Courville
Stars: ✭ 9,859 (+6343.79%)
Mutual labels:  neural-networks, deeplearning

.. image:: ./docs/_images/badge_small.png

|github|_ |readthedocs|_ |codecov|_ |python|_ |pypi|_ |license|_

.. |github| image:: https://github.com/xuyxu/Ensemble-Pytorch/workflows/torchensemble-CI/badge.svg
.. _github: https://github.com/xuyxu/Ensemble-Pytorch/actions

.. |readthedocs| image:: https://readthedocs.org/projects/ensemble-pytorch/badge/?version=latest
.. _readthedocs: https://ensemble-pytorch.readthedocs.io/en/latest/index.html

.. |codecov| image:: https://codecov.io/gh/xuyxu/Ensemble-Pytorch/branch/master/graph/badge.svg?token=2FXCFRIDTV
.. _codecov: https://codecov.io/gh/xuyxu/Ensemble-Pytorch

.. |python| image:: https://img.shields.io/badge/python-3.6+-blue?logo=python
.. _python: https://www.python.org/

.. |pypi| image:: https://img.shields.io/pypi/v/torchensemble
.. _pypi: https://pypi.org/project/torchensemble/

.. |license| image:: https://img.shields.io/github/license/xuyxu/Ensemble-Pytorch
.. _license: https://github.com/xuyxu/Ensemble-Pytorch/blob/master/LICENSE

Ensemble PyTorch

Implementation of ensemble methods in PyTorch to improve the performance and robustness of your deep learning model. Please refer to our `documentation <https://ensemble-pytorch.readthedocs.io/>`__ for details.

Installation

Stable Version


The stable version is available at `PyPI <https://pypi.org/project/torchensemble/>`__. You can install it using:

.. code:: bash

   $ pip install torchensemble

Latest Version

To use the latest version, you need to install the package from source:

.. code:: bash

   $ git clone https://github.com/xuyxu/Ensemble-Pytorch.git
   $ cd Ensemble-Pytorch
   $ pip install -r requirements.txt  # optional
   $ python setup.py install
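
Either way, a quick import check confirms that the package is visible to your Python environment (this only verifies the import, nothing more):

.. code:: python

   import torchensemble

   print(torchensemble.__file__)  # path of the installed package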

Minimal Example on How to Use

.. code:: python

   from torch.utils.data import DataLoader             # PyTorch data loading utilities
   from torchensemble import ensemble_method           # import an ensemble (e.g., VotingClassifier)

   # Load your dataset
   train_loader = DataLoader(...)
   test_loader = DataLoader(...)

   # Define the ensemble
   model = ensemble_method(estimator=base_estimator,   # your deep learning model
                           n_estimators=10)            # the number of base estimators

   # Set the optimizer
   model.set_optimizer("Adam",                         # parameter optimizer
                       lr=learning_rate,               # learning rate of the optimizer
                       weight_decay=weight_decay)      # weight decay of the optimizer

   # Train
   model.fit(train_loader,
             epochs=epochs)                            # the number of training epochs

   # Evaluate
   accuracy = model.predict(test_loader)               # evaluate the ensemble
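
The snippet above uses placeholders (``ensemble_method``, ``base_estimator``, and so on). Below is a concrete sketch of the same workflow on MNIST, using ``VotingClassifier`` and a small fully-connected network defined here purely for illustration; the hyperparameter values are arbitrary, device placement is left to the library defaults, and the exact keyword arguments may differ slightly across ``torchensemble`` versions.

.. code:: python

   import torch.nn as nn
   from torch.utils.data import DataLoader
   from torchvision import datasets, transforms

   from torchensemble import VotingClassifier


   # A small base estimator defined here only for illustration.
   class MLP(nn.Module):
       def __init__(self):
           super().__init__()
           self.net = nn.Sequential(
               nn.Flatten(),
               nn.Linear(28 * 28, 128),
               nn.ReLU(),
               nn.Linear(128, 10),
           )

       def forward(self, x):
           return self.net(x)


   transform = transforms.ToTensor()
   train_loader = DataLoader(
       datasets.MNIST("./data", train=True, download=True, transform=transform),
       batch_size=128, shuffle=True,
   )
   test_loader = DataLoader(
       datasets.MNIST("./data", train=False, download=True, transform=transform),
       batch_size=128,
   )

   # Ten MLPs whose predictions are aggregated by voting.
   model = VotingClassifier(estimator=MLP, n_estimators=10)

   # Same calls as in the minimal example above.
   model.set_optimizer("Adam", lr=1e-3, weight_decay=5e-4)
   model.fit(train_loader, epochs=5)
   accuracy = model.predict(test_loader)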

Supported Ensemble

+--------+----------------------+-------------------+
| ID     | Ensemble Name        | Ensemble Type     |
+========+======================+===================+
| 1      | Fusion               | Mixed             |
+--------+----------------------+-------------------+
| 2      | Voting               | Parallel          |
+--------+----------------------+-------------------+
| 3      | Bagging              | Parallel          |
+--------+----------------------+-------------------+
| 4      | Gradient Boosting    | Sequential        |
+--------+----------------------+-------------------+
| 5      | Snapshot Ensemble    | Sequential        |
+--------+----------------------+-------------------+
| 6      | Adversarial Training | Parallel          |
+--------+----------------------+-------------------+
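
Each method comes in a classifier and a regressor flavour. The import below sketches the naming pattern for the variants that appear in the experiment tables that follow; the remaining methods are expected to follow the same convention, but their class names are not shown in this document.

.. code:: python

   # Classifier / regressor variants used in the experiments below.
   from torchensemble import (
       FusionClassifier, FusionRegressor,
       VotingClassifier, VotingRegressor,
       BaggingClassifier, BaggingRegressor,
       GradientBoostingClassifier, GradientBoostingRegressor,
   )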

Experiment

Classification on CIFAR-10


-  The table below presents the classification accuracy of different
   ensemble classifiers on the testing data of **CIFAR-10**
-  Each classifier uses **10** LeNet-5 models (with ReLU activation and
   Dropout) as the base estimators
-  Each base estimator is trained over **100** epochs, with batch size
   **128**, learning rate **1e-3**, and weight decay **5e-4**
-  Experiment results can be reproduced by running
   ``./examples/classification_cifar10_cnn.py``; a configuration sketch
   follows the results table below

+----------------------------------+---------------+-------------------+-------------------+
| Model Name                       | Params (MB)   | Testing Acc (%)   | Improvement (%)   |
+==================================+===============+===================+===================+
| **Single LeNet-5**               | 0.32          | 73.04             | ~                 |
+----------------------------------+---------------+-------------------+-------------------+
| **FusionClassifier**             | 3.17          | 78.75             | +5.71             |
+----------------------------------+---------------+-------------------+-------------------+
| **VotingClassifier**             | 3.17          | 80.08             | +7.04             |
+----------------------------------+---------------+-------------------+-------------------+
| **BaggingClassifier**            | 3.17          | 78.75             | +5.71             |
+----------------------------------+---------------+-------------------+-------------------+
| **GradientBoostingClassifier**   | 3.17          | 80.82             | **+7.78**         |
+----------------------------------+---------------+-------------------+-------------------+
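
As referenced above, the experiment's hyperparameters map onto the library's API roughly as follows. This is a configuration sketch rather than the contents of ``./examples/classification_cifar10_cnn.py``: the LeNet-5 definition and its Dropout rate are written here for illustration, and the choice of Adam as the optimizer is an assumption (the settings above only fix the learning rate and weight decay).

.. code:: python

   import torch.nn as nn
   from torch.utils.data import DataLoader
   from torchvision import datasets, transforms

   from torchensemble import VotingClassifier


   # A LeNet-5-style CNN with ReLU and Dropout, written here for illustration;
   # the example script ships its own definition (the Dropout rate is a placeholder).
   class LeNet5(nn.Module):
       def __init__(self):
           super().__init__()
           self.features = nn.Sequential(
               nn.Conv2d(3, 6, kernel_size=5), nn.ReLU(), nn.MaxPool2d(2),
               nn.Conv2d(6, 16, kernel_size=5), nn.ReLU(), nn.MaxPool2d(2),
           )
           self.classifier = nn.Sequential(
               nn.Flatten(),
               nn.Linear(16 * 5 * 5, 120), nn.ReLU(), nn.Dropout(0.5),
               nn.Linear(120, 84), nn.ReLU(), nn.Dropout(0.5),
               nn.Linear(84, 10),
           )

       def forward(self, x):
           return self.classifier(self.features(x))


   transform = transforms.ToTensor()
   train_set = datasets.CIFAR10("./data", train=True, download=True, transform=transform)
   test_set = datasets.CIFAR10("./data", train=False, download=True, transform=transform)
   train_loader = DataLoader(train_set, batch_size=128, shuffle=True)  # batch size 128 as above
   test_loader = DataLoader(test_set, batch_size=128)

   model = VotingClassifier(estimator=LeNet5, n_estimators=10)  # 10 base estimators
   model.set_optimizer("Adam", lr=1e-3, weight_decay=5e-4)      # Adam is an assumption
   model.fit(train_loader, epochs=100)                          # 100 epochs as above
   accuracy = model.predict(test_loader)                        # testing accuracy
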

Regression on YearPredictionMSD

-  The table below presents the mean squared error (MSE) of different
   ensemble regressors on the testing data of **YearPredictionMSD**
-  Each regressor uses **10** multi-layer perceptron (MLP) models (with ReLU
   activation and Dropout) as the base estimators, and the network
   architecture is fixed as Input-128-128-Output; a sketch of such a base
   estimator follows the results table below
-  Each base estimator is trained over **50** epochs, with batch size
   **256**, learning rate **1e-3**, and weight decay **5e-4**
-  Experiment results can be reproduced by running
   ``./examples/regression_YearPredictionMSD_mlp.py``

+---------------------------------+---------------+---------------+---------------+
| Model Name                      | Params (MB)   | Testing MSE   | Improvement   |
+=================================+===============+===============+===============+
| **Single MLP**                  | 0.11          | 0.83          | ~             |
+---------------------------------+---------------+---------------+---------------+
| **FusionRegressor**             | 1.08          | 0.73          | -0.10         |
+---------------------------------+---------------+---------------+---------------+
| **VotingRegressor**             | 1.08          | 0.69          | -0.14         |
+---------------------------------+---------------+---------------+---------------+
| **BaggingRegressor**            | 1.08          | 0.70          | -0.13         |
+---------------------------------+---------------+---------------+---------------+
| **GradientBoostingRegressor**   | 1.08          | 0.71          | -0.12         |
+---------------------------------+---------------+---------------+---------------+
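
As referenced above, the fixed Input-128-128-Output base estimator can be written as a small ``torch.nn`` module. The definition below is an illustrative sketch rather than the one used in ``./examples/regression_YearPredictionMSD_mlp.py``; the input dimension of 90 is an assumption about the YearPredictionMSD feature count, and the Dropout rate is a placeholder.

.. code:: python

   import torch.nn as nn


   class MLP(nn.Module):
       """Input-128-128-Output MLP with ReLU and Dropout (illustrative sketch)."""

       def __init__(self, n_features=90):  # 90 features per song is an assumption
           super().__init__()
           self.net = nn.Sequential(
               nn.Linear(n_features, 128), nn.ReLU(), nn.Dropout(0.5),
               nn.Linear(128, 128), nn.ReLU(), nn.Dropout(0.5),
               nn.Linear(128, 1),          # single output: the predicted year
           )

       def forward(self, x):
           return self.net(x)

   # This module would be passed as ``estimator`` to, e.g., VotingRegressor,
   # exactly as in the classification sketches above.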

Package Dependency

-  joblib>=0.11
-  scikit-learn>=0.23.0
-  torch>=0.4.1
-  torchvision>=0.2.2