
zeke-xie / artificial-neural-variability-for-deep-learning

License: MIT
The PyTorch implementation of Variable Optimizers / Neural Variable Risk Minimization (NVRM) proposed in our Neural Computation paper: Artificial Neural Variability for Deep Learning: On Overfitting, Noise Memorization, and Catastrophic Forgetting.

Programming Languages

Python
139335 projects - #7 most used programming language
Jupyter Notebook
11667 projects

Projects that are alternatives of or similar to artificial-neural-variability-for-deep-learning

Brainflow
BrainFlow is a library intended to obtain, parse and analyze EEG, EMG, ECG and other kinds of data from biosensors
Stars: ✭ 170 (+400%)
Mutual labels:  neuroscience
Neurodocker
Generate custom Docker and Singularity images, and minimize existing containers
Stars: ✭ 198 (+482.35%)
Mutual labels:  neuroscience
Open Computational Neuroscience Resources
A publicly-editable collection of open computational neuroscience resources
Stars: ✭ 234 (+588.24%)
Mutual labels:  neuroscience
Visbrain
A multi-purpose GPU-accelerated open-source suite for brain data visualization
Stars: ✭ 172 (+405.88%)
Mutual labels:  neuroscience
Brainrender
A Python-based software for visualization of neuroanatomical and morphological data.
Stars: ✭ 183 (+438.24%)
Mutual labels:  neuroscience
Moabb
Mother of All BCI Benchmarks
Stars: ✭ 214 (+529.41%)
Mutual labels:  neuroscience
Fooof
Parameterizing neural power spectra into periodic & aperiodic components.
Stars: ✭ 162 (+376.47%)
Mutual labels:  neuroscience
neth-proxy
Stratum <-> Stratum Proxy and optimizer for ethminer
Stars: ✭ 35 (+2.94%)
Mutual labels:  optimizer
Neurolib
Easy whole-brain modeling for computational neuroscientists 🧠💻👩🏿‍🔬
Stars: ✭ 188 (+452.94%)
Mutual labels:  neuroscience
Brainiak
Brain Imaging Analysis Kit
Stars: ✭ 232 (+582.35%)
Mutual labels:  neuroscience
Opensesame
Graphical experiment builder for the social sciences
Stars: ✭ 173 (+408.82%)
Mutual labels:  neuroscience
Bmtk
Brain Modeling Toolkit
Stars: ✭ 177 (+420.59%)
Mutual labels:  neuroscience
Awesome Computational Neuroscience
A list of schools and researchers in computational neuroscience
Stars: ✭ 230 (+576.47%)
Mutual labels:  neuroscience
Eegrunt
A collection of Python EEG (+ ECG) analysis utilities for OpenBCI and Muse
Stars: ✭ 171 (+402.94%)
Mutual labels:  neuroscience
Pyphi
A toolbox for integrated information theory.
Stars: ✭ 246 (+623.53%)
Mutual labels:  neuroscience
Erplab
ERPLAB Toolbox is a free, open-source Matlab package for analyzing ERP data. It is tightly integrated with EEGLAB Toolbox, extending EEGLAB’s capabilities to provide robust, industrial-strength tools for ERP processing, visualization, and analysis. A graphical user interface makes it easy for beginners to learn, and Matlab scripting provides enormous power for intermediate and advanced users.
Stars: ✭ 166 (+388.24%)
Mutual labels:  neuroscience
Neurotech Course
CS198-96: Intro to Neurotechnology @ UC Berkeley
Stars: ✭ 202 (+494.12%)
Mutual labels:  neuroscience
AshBF
Over-engineered Brainfuck optimizing compiler and interpreter
Stars: ✭ 14 (-58.82%)
Mutual labels:  optimizer
twpca
🕝 Time-warped principal components analysis (twPCA)
Stars: ✭ 118 (+247.06%)
Mutual labels:  neuroscience
Brayns
Visualizer for large-scale and interactive ray-tracing of neurons
Stars: ✭ 232 (+582.35%)
Mutual labels:  neuroscience

artificial-neural-variability-for-deep-learning

The PyTorch implementation of Variable Optimizers / Neural Variable Risk Minimization (NVRM).

The algorithms are proposed in our paper Artificial Neural Variability for Deep Learning: On Overfitting, Noise Memorization, and Catastrophic Forgetting, published in Neural Computation.

Why Artificial Neural Variability?

We introduce a neuroscience concept, called neural variability, into deep learning.

It helps DNNs learn from neuroscience.
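
As a rough intuition, neural variability can be pictured as training under small zero-mean perturbations of the weights, so the network is optimized to perform well even when its weights are noisy. The snippet below is a minimal conceptual sketch only (a hypothetical helper, not the repository's implementation); the noise scale "variability" mirrors the optimizer argument shown later.

import torch

def perturbed_step(net, loss_fn, data, target, optimizer, variability=0.01):
    # Conceptual sketch: temporarily perturb every weight with zero-mean noise,
    # compute the gradient at the perturbed weights, then restore and update.
    noises = []
    for p in net.parameters():
        noise = (torch.rand_like(p) - 0.5) * 2 * variability  # uniform in [-variability, variability]
        p.data.add_(noise)
        noises.append(noise)
    optimizer.zero_grad()
    loss = loss_fn(net(data), target)
    loss.backward()
    for p, noise in zip(net.parameters(), noises):
        p.data.sub_(noise)  # remove the perturbation before the actual update
    optimizer.step()
    return loss.item()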

At negligible computational and coding costs, our neuroscience-inspired optimization method can

(1) enhance the robustness to weight perturbation;

(2) improve generalizability;

(3) relieve the memorization of noisy labels;

(4) mitigate catastrophic forgetting.

How good is Artificial Neural Variability?

Figure 1. The learning curves of ResNet-34 on CIFAR-10 with 40% asymmetric label noise. NVRM effectively prevents overfitting to noisy labels, while SGD memorizes almost all noisy labels.

The environment is as below:

Ubuntu 18.04.4 LTS

Python >= 3.7.3

PyTorch >= 1.4.0

Code Example:

You may use it as a standard PyTorch optimizer.

from variable_optim import VSGD

optimizer = VSGD(net.parameters(), lr=lr, variability=variability, num_iters=num_iters, weight_decay=weight_decay)
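
For a fuller picture, here is a hedged end-to-end sketch of how VSGD might be wired into an ordinary training loop. The model choice, the synthetic data standing in for CIFAR-10, and the hyperparameter values are placeholders to make the sketch self-contained; replace them with your own setup.

import torch
import torch.nn as nn
import torchvision
from torch.utils.data import DataLoader, TensorDataset
from variable_optim import VSGD

# Hypothetical hyperparameters; tune them for your task.
lr, variability, weight_decay, num_epochs = 0.1, 0.01, 5e-4, 100

# Synthetic data standing in for CIFAR-10 (3x32x32 images, 10 classes).
train_loader = DataLoader(
    TensorDataset(torch.randn(256, 3, 32, 32), torch.randint(0, 10, (256,))),
    batch_size=64, shuffle=True)

net = torchvision.models.resnet34(num_classes=10)
criterion = nn.CrossEntropyLoss()

# num_iters: assumed here to be the total number of optimization steps.
num_iters = num_epochs * len(train_loader)
optimizer = VSGD(net.parameters(), lr=lr, variability=variability,
                 num_iters=num_iters, weight_decay=weight_decay)

for epoch in range(num_epochs):
    for data, target in train_loader:
        optimizer.zero_grad()
        loss = criterion(net(data), target)
        loss.backward()
        optimizer.step()  # weight variability is handled inside VSGD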

Citing

If you use artificial neural variability / NVRM in your work, please cite

@article{xie2021artificial,
  title={Artificial Neural Variability for Deep Learning: On Overfitting, Noise Memorization, and Catastrophic Forgetting},
  author={Xie, Zeke and He, Fengxiang and Fu, Shaopeng and Sato, Issei and Tao, Dacheng and Sugiyama, Masashi},
  journal={Neural Computation},
  year={2021},
  volume={33},
  number={8},
  pages={2163--2192},
  publisher={MIT Press}
}