
terarachang / ACCV_TinyGAN

Licence: other
BigGAN; Knowledge Distillation; Black-Box; Fast Training; 16x compression

Programming Languages

Jupyter Notebook
Python
Shell

Projects that are alternatives to or similar to ACCV_TinyGAN

coursera-gan-specialization
Programming assignments and quizzes from all courses within the GANs specialization offered by deeplearning.ai
Stars: ✭ 277 (+346.77%)
Mutual labels:  gans, biggan
Biggan Pytorch
The author's officially unofficial PyTorch BigGAN implementation.
Stars: ✭ 2,459 (+3866.13%)
Mutual labels:  gans, biggan
BigGAN-tensorflow
Reimplementation of the Paper: Large Scale GAN Training for High Fidelity Natural Image Synthesis
Stars: ✭ 27 (-56.45%)
Mutual labels:  biggan
pdf-scripts
📑 Scripts to repair, verify, OCR, compress, wrangle, crop (etc.) PDFs
Stars: ✭ 33 (-46.77%)
Mutual labels:  compress
react-native-compressor
A lightweight library to compress images, video, and audio with an awesome experience
Stars: ✭ 157 (+153.23%)
Mutual labels:  compress
minizip-asm.js
Minizip in JavaScript; works with passwords.
Stars: ✭ 38 (-38.71%)
Mutual labels:  compress
generative deep learning
Generative Deep Learning Sessions led by Anugraha Sinha (Machine Learning Tokyo)
Stars: ✭ 24 (-61.29%)
Mutual labels:  gans
img-master
A multifunctional, unrestricted image batch-processing tool
Stars: ✭ 63 (+1.61%)
Mutual labels:  compress
Machine-Learning
Machine learning projects I've built with PyTorch, Keras, TensorFlow, scikit-learn, and Python.
Stars: ✭ 54 (-12.9%)
Mutual labels:  gans
cool-papers-in-pytorch
Reimplementing cool papers in PyTorch...
Stars: ✭ 21 (-66.13%)
Mutual labels:  knowledge-distillation
Efficient-Computing
Efficient-Computing
Stars: ✭ 474 (+664.52%)
Mutual labels:  knowledge-distillation
GNNs-in-Network-Neuroscience
A review of papers published in 2017-2020 that propose novel GNN methods with applications to brain connectivity.
Stars: ✭ 92 (+48.39%)
Mutual labels:  gans
Selfie2Anime-with-TFLite
How to bring Selfie2Anime to Android with a TFLite model.
Stars: ✭ 70 (+12.9%)
Mutual labels:  gans
ProSelfLC-2021
noisy labels; missing labels; semi-supervised learning; entropy; uncertainty; robustness and generalisation.
Stars: ✭ 45 (-27.42%)
Mutual labels:  knowledge-distillation
FKD
A Fast Knowledge Distillation Framework for Visual Recognition
Stars: ✭ 49 (-20.97%)
Mutual labels:  knowledge-distillation
anime2clothing
Official PyTorch implementation of Anime to Real Clothing: Cosplay Costume Generation via Image-to-Image Translation.
Stars: ✭ 65 (+4.84%)
Mutual labels:  gans
bert-AAD
Adversarial Adaptation with Distillation for BERT Unsupervised Domain Adaptation
Stars: ✭ 27 (-56.45%)
Mutual labels:  knowledge-distillation
sRender
Facial Sketch Render, ICASSP 2021
Stars: ✭ 20 (-67.74%)
Mutual labels:  gans
SemCKD
This is the official implementation for the AAAI-2021 paper (Cross-Layer Distillation with Semantic Calibration).
Stars: ✭ 42 (-32.26%)
Mutual labels:  knowledge-distillation
FGD
Focal and Global Knowledge Distillation for Detectors (CVPR 2022)
Stars: ✭ 124 (+100%)
Mutual labels:  knowledge-distillation

TinyGAN

BigGAN; Knowledge Distillation; Black-Box; Fast Training; 16x compression

Requirements: Python 3.7, PyTorch 1.2.0

This repository contains the official PyTorch implementation of the following paper:

TinyGAN: Distilling BigGAN for Conditional Image Generation (ACCV 2020)
Ting-Yun Chang and Chi-Jen Lu

https://arxiv.org/abs/2009.13829

https://www.youtube.com/watch?v=EsUxQT1su6s

Abstract: Generative Adversarial Networks (GANs) have become a powerful approach for generative image modeling. However, GANs are notorious for their training instability, especially on large-scale, complex datasets. While the recent work of BigGAN has significantly improved the quality of image generation on ImageNet, it requires a huge model, making it hard to deploy on resource-constrained devices. To reduce the model size, we propose a black-box knowledge distillation framework for compressing GANs, which highlights a stable and efficient training process. Given BigGAN as the teacher network, we manage to train a much smaller student network to mimic its functionality, achieving competitive performance on Inception and FID scores but with the generator having 16 times fewer parameters.
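
In the black-box setting, the student never sees BigGAN's weights, gradients, or intermediate features, only its input-output behavior, so distillation reduces to supervised regression on pre-collected (noise, class, image) triples plus adversarial terms. The sketch below illustrates just the pixel-level distillation step; the StudentG architecture, shapes, and hyperparameters are illustrative assumptions, not this repository's actual code.

import torch
import torch.nn as nn
import torch.nn.functional as F

# Tiny stand-in for the student generator (illustrative only).
class StudentG(nn.Module):
    def __init__(self, z_dim=128, n_classes=1000):
        super().__init__()
        self.embed = nn.Embedding(n_classes, z_dim)
        self.fc = nn.Linear(2 * z_dim, 64 * 4 * 4)
        self.up = nn.Sequential(
            nn.Upsample(scale_factor=32),           # 4x4 -> 128x128
            nn.Conv2d(64, 3, 3, padding=1),
            nn.Tanh(),                              # images in [-1, 1]
        )

    def forward(self, z, y):
        h = self.fc(torch.cat([z, self.embed(y)], dim=1))
        return self.up(h.view(-1, 64, 4, 4))

G = StudentG()
opt = torch.optim.Adam(G.parameters(), lr=2e-4, betas=(0.0, 0.999))

# One pre-collected batch: the noise z, the label y, and the teacher's output
# image are the only signals available from BigGAN in the black-box setting.
z = torch.randn(8, 128)
y = torch.randint(0, 1000, (8,))
teacher_imgs = torch.rand(8, 3, 128, 128) * 2 - 1   # placeholder for BigGAN(z, y)

opt.zero_grad()
pixel_loss = F.l1_loss(G(z, y), teacher_imgs)       # pixel-level distillation loss
pixel_loss.backward()
opt.step()

In the paper this pixel loss is combined with adversarial and feature-level losses from the student's discriminator; the key point is that no gradient ever flows through the teacher.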

The trained model (73 MB) is in gan/models and can be downloaded directly from GitHub.
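
A minimal sampling sketch follows; the checkpoint filename, the Generator import path, and its constructor signature are assumptions, not verified against this repository.

import torch
from torchvision.utils import save_image

from gan.model import Generator           # assumed import path

netG = Generator()                        # assumed default constructor
state = torch.load('gan/models/netG.pth', map_location='cpu')   # filename assumed
netG.load_state_dict(state)
netG.eval()

with torch.no_grad():
    z = torch.randn(16, 128)              # z_dim = 128 assumed, as in BigGAN
    y = torch.randint(0, 1000, (16,))     # ImageNet-1k class labels
    samples = netG(z, y)                  # assumed Tanh output in [-1, 1]

save_image(samples, 'samples.png', normalize=True)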

Training

$ bash train.sh

Evaluation

$ bash eval.sh
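
eval.sh reports the Inception and FID scores mentioned in the abstract. As an independent illustration (not this repository's evaluation code), FID can be computed with torchmetrics; the random tensors below stand in for real ImageNet images and TinyGAN samples.

# pip install torchmetrics[image]
import torch
from torchmetrics.image.fid import FrechetInceptionDistance

fid = FrechetInceptionDistance(feature=2048)   # standard 2048-d InceptionV3 features

# uint8 batches of shape (N, 3, H, W) with values in 0-255.
real = torch.randint(0, 256, (32, 3, 299, 299), dtype=torch.uint8)
fake = torch.randint(0, 256, (32, 3, 299, 299), dtype=torch.uint8)

fid.update(real, real=True)
fid.update(fake, real=False)
print(float(fid.compute()))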


Citation

@InProceedings{Chang_2020_ACCV,
    author    = {Chang, Ting-Yun and Lu, Chi-Jen},
    title     = {TinyGAN: Distilling BigGAN for Conditional Image Generation},
    booktitle = {Proceedings of the Asian Conference on Computer Vision (ACCV)},
    month     = {November},
    year      = {2020}
}
