
gmh14 / Robnets

License: MIT
[CVPR 2020] When NAS Meets Robustness: In Search of Robust Architectures against Adversarial Attacks

Programming Languages

Python

Projects that are alternatives to or similar to Robnets

Autogan
[ICCV 2019] "AutoGAN: Neural Architecture Search for Generative Adversarial Networks" by Xinyu Gong, Shiyu Chang, Yifan Jiang and Zhangyang Wang
Stars: ✭ 388 (+308.42%)
Mutual labels:  neural-architecture-search
Devol
Genetic neural architecture search with Keras
Stars: ✭ 925 (+873.68%)
Mutual labels:  neural-architecture-search
Awesome Architecture Search
A curated list of awesome architecture search resources
Stars: ✭ 1,078 (+1034.74%)
Mutual labels:  neural-architecture-search
Hpbandster
A distributed Hyperband implementation on steroids
Stars: ✭ 456 (+380%)
Mutual labels:  neural-architecture-search
Awesome Automl And Lightweight Models
A list of high-quality (newest) AutoML works and lightweight models including 1.) Neural Architecture Search, 2.) Lightweight Structures, 3.) Model Compression, Quantization and Acceleration, 4.) Hyperparameter Optimization, 5.) Automated Feature Engineering.
Stars: ✭ 691 (+627.37%)
Mutual labels:  neural-architecture-search
Neural Architecture Search With Rl
Minimal Tensorflow implementation of the paper "Neural Architecture Search With Reinforcement Learning" presented at ICLR 2017
Stars: ✭ 37 (-61.05%)
Mutual labels:  neural-architecture-search
Adanet
Fast and flexible AutoML with learning guarantees.
Stars: ✭ 3,340 (+3415.79%)
Mutual labels:  neural-architecture-search
Autodl Projects
Automated deep learning algorithms implemented in PyTorch.
Stars: ✭ 1,187 (+1149.47%)
Mutual labels:  neural-architecture-search
Slimmable networks
Slimmable Networks, AutoSlim, and Beyond, ICLR 2019, and ICCV 2019
Stars: ✭ 708 (+645.26%)
Mutual labels:  neural-architecture-search
Nsganetv2
[ECCV2020] NSGANetV2: Evolutionary Multi-Objective Surrogate-Assisted Neural Architecture Search
Stars: ✭ 52 (-45.26%)
Mutual labels:  neural-architecture-search
Awesome Federated Learning
Federated Learning Library: https://fedml.ai
Stars: ✭ 624 (+556.84%)
Mutual labels:  neural-architecture-search
Paddleslim
PaddleSlim is an open-source library for deep model compression and architecture search.
Stars: ✭ 677 (+612.63%)
Mutual labels:  neural-architecture-search
Efficientnas
Towards Automated Deep Learning: Efficient Joint Neural Architecture and Hyperparameter Search https://arxiv.org/abs/1807.06906
Stars: ✭ 44 (-53.68%)
Mutual labels:  neural-architecture-search
Fasterseg
[ICLR 2020] "FasterSeg: Searching for Faster Real-time Semantic Segmentation" by Wuyang Chen, Xinyu Gong, Xianming Liu, Qian Zhang, Yuan Li, Zhangyang Wang
Stars: ✭ 438 (+361.05%)
Mutual labels:  neural-architecture-search
Mtlnas
[CVPR 2020] MTL-NAS: Task-Agnostic Neural Architecture Search towards General-Purpose Multi-Task Learning
Stars: ✭ 58 (-38.95%)
Mutual labels:  neural-architecture-search
Neural Architecture Search
Basic implementation of [Neural Architecture Search with Reinforcement Learning](https://arxiv.org/abs/1611.01578).
Stars: ✭ 352 (+270.53%)
Mutual labels:  neural-architecture-search
Morph Net
Fast & Simple Resource-Constrained Learning of Deep Network Structure
Stars: ✭ 937 (+886.32%)
Mutual labels:  neural-architecture-search
Hydra
Multi-Task Learning Framework on PyTorch. State-of-the-art methods are implemented to effectively train models on multiple tasks.
Stars: ✭ 87 (-8.42%)
Mutual labels:  neural-architecture-search
Tenas
[ICLR 2021] "Neural Architecture Search on ImageNet in Four GPU Hours: A Theoretically Inspired Perspective" by Wuyang Chen, Xinyu Gong, Zhangyang Wang
Stars: ✭ 63 (-33.68%)
Mutual labels:  neural-architecture-search
Autokeras
AutoML library for deep learning
Stars: ✭ 8,269 (+8604.21%)
Mutual labels:  neural-architecture-search

When NAS Meets Robustness: In Search of Robust Architectures against Adversarial Attacks

This repository contains the implementation code for the paper When NAS Meets Robustness: In Search of Robust Architectures against Adversarial Attacks (CVPR 2020). Also check out the project page.

In this work, we take an architectural perspective and investigate the patterns of network architectures that are resilient to adversarial attacks. We discover a family of robust architectures (RobNets), which exhibit superior robustness compared to other widely used architectures.

(Figure: overview of RobNets)

Installation

Prerequisites

  • Data: Download the CIFAR10, SVHN, and ImageNet datasets and move the test/validation sets to the folder data/.

  • Model: Download the pre-trained models and unzip them into the folder checkpoint/.

Dependencies for RobNets

You can install the dependencies for RobNets using

pip install -r requirements.txt

Experiments

Test

All the configurations of the experiments are provided in the files experiments/*/config.py, covering the different datasets and RobNet architectures. You can modify them directly to suit your needs.
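Since each config is a plain Python file, main.py presumably loads it as a module from the path given by --config. A minimal sketch of this kind of path-based loading (illustrative only, not the repo's actual code):

import importlib.util

def load_config(path):
    # Load a config.py file as a Python module, given its filesystem path.
    spec = importlib.util.spec_from_file_location('config', path)
    cfg = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(cfg)
    return cfg

cfg = load_config('./experiments/RobNet_free_cifar10/config.py')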

To conduct a specific experiment, e.g. RobNet_free on CIFAR10, run

python main.py --config='./experiments/RobNet_free_cifar10/config.py' --eval_only

With the --eval_only flag, you can evaluate any of the experiments under experiments/ in the same way.

Train (NEW)

We also provide a training interface for RobNets. For now, only training on CIFAR10 is supported; training on ImageNet is work in progress.

We use PyTorch distributed training with Slurm and the NCCL backend. You can train RobNet_large on CIFAR10 by running

GPUS_PER_NODE=8 GPUS=32 bash slurm_train.sh **PartitionName** './experiments/RobNet_large_cifar10/config.py'

RobNet_free_cifar10 and RobNet_large_v1_cifar10 in checkpoint/ were trained with a total batch size of 1536, while RobNet_large_v2_cifar10 used a batch size of 1024. Make sure to scale the learning rate linearly if you use a different batch size (see the sketch below). (In fact, the hyper-parameters here have not been tuned extensively by trial and error; if you find a better combination, we welcome a PR!)
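As a concrete illustration of the linear scaling rule (the numbers below are placeholders for illustration, not values taken from this repo's configs):

# Hypothetical numbers; take the actual base learning rate from the
# corresponding experiments/*/config.py.
base_lr = 0.1            # learning rate tuned for the reference batch size
base_batch_size = 1536   # total batch size used for RobNet_free_cifar10
my_batch_size = 256      # e.g. a smaller single-node run

scaled_lr = base_lr * my_batch_size / base_batch_size
print(f'use lr={scaled_lr:.4f} for batch size {my_batch_size}')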

Note: Some of the training configurations differ slightly from the original paper, such as the learning rate scheduler. However, the configurations in this repo can yield even better results than those in the paper. Check the training log of RobNet_large_v1_cifar10 here, produced with the script in this repo. Testing this checkpoint gives ~83.5% clean accuracy and ~52.1% adversarial accuracy under a PGD-20 attack!
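For reference, PGD-20 denotes a 20-step projected gradient descent attack under an L-infinity perturbation budget. Below is a minimal PyTorch sketch of such an attack, not the repo's own evaluation code; eps=8/255 and step size 2/255 are common CIFAR10 settings and are assumptions here:

import torch
import torch.nn.functional as F

def pgd_attack(model, x, y, eps=8/255, alpha=2/255, steps=20):
    # Untargeted L-infinity PGD with a random start inside the eps-ball;
    # inputs x are assumed to lie in [0, 1].
    x_adv = (x + torch.empty_like(x).uniform_(-eps, eps)).clamp(0, 1)
    for _ in range(steps):
        x_adv.requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)
        grad = torch.autograd.grad(loss, x_adv)[0]
        # Ascend the loss, then project back onto the eps-ball around x.
        x_adv = x_adv.detach() + alpha * grad.sign()
        x_adv = torch.min(torch.max(x_adv, x - eps), x + eps).clamp(0, 1)
    return x_adv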

Use RobNet Architectures

To use one of the searched RobNet architectures, for example RobNet_free on CIFAR10:

import models
import architecture_code
import utils

# use RobNet architecture
net = models.robnet(architecture_code.robnet_free)
net = net.cuda()
# load pre-trained model
utils.load_state('./checkpoint/RobNet_free_cifar10.pth.tar', net)

For other models, the loading process is similar; just pass the corresponding parameters (found in the variable model_param in each experiments/*/config.py) to the function models.robnet().
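For instance, loading RobNet_large_v1 might look like the sketch below. The name architecture_code.robnet_large_v1 and the exact model_param keys are assumptions inferred from the checkpoint names above; check the repo's architecture_code.py and config files for the real identifiers:

import models
import architecture_code
import utils

# Fill this with the contents of model_param from
# experiments/RobNet_large_v1_cifar10/config.py (keys are repo-specific).
model_param = {}

# architecture_code.robnet_large_v1 is an assumed name, inferred from the
# checkpoint naming above.
net = models.robnet(architecture_code.robnet_large_v1, **model_param)
net = net.cuda()
utils.load_state('./checkpoint/RobNet_large_v1_cifar10.pth.tar', net)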

Acknowledgements

The implementation of RobNets is partly based on this work.

Citation

If you find the idea or code useful for your research, please cite our paper:

@article{guo2019meets,
  title={When NAS Meets Robustness: In Search of Robust Architectures against Adversarial Attacks},
  author={Guo, Minghao and Yang, Yuzhe and Xu, Rui and Liu, Ziwei and Lin, Dahua},
  journal={arXiv preprint arXiv:1911.10695},
  year={2019}
}

Contact

Please contact [email protected] and [email protected] if you have any questions. Enjoy!
