
ivclab / NeuralMerger

License: MIT
Yi-Min Chou, Yi-Ming Chan, Jia-Hong Lee, Chih-Yi Chiu, Chu-Song Chen, "Unifying and Merging Well-trained Deep Neural Networks for Inference Stage," International Joint Conference on Artificial Intelligence (IJCAI), 2018

Programming Languages

Python

Projects that are alternatives of or similar to NeuralMerger

agegenderLMTCNN
Jia-Hong Lee, Yi-Ming Chan, Ting-Yen Chen, and Chu-Song Chen, "Joint Estimation of Age and Gender from Unconstrained Face Images using Lightweight Multi-task CNN for Mobile Applications," IEEE International Conference on Multimedia Information Processing and Retrieval, MIPR 2018
Stars: ✭ 39 (+95%)
Mutual labels:  multi-task-learning, efficient-inference
HyperFace-TensorFlow-implementation
HyperFace
Stars: ✭ 68 (+240%)
Mutual labels:  multi-task-learning
Pytorch-PCGrad
Pytorch reimplementation for "Gradient Surgery for Multi-Task Learning"
Stars: ✭ 179 (+795%)
Mutual labels:  multi-task-learning
MNIST-multitask
6️⃣6️⃣6️⃣ Reproduce ICLR '18 under-reviewed paper "MULTI-TASK LEARNING ON MNIST IMAGE DATASETS"
Stars: ✭ 34 (+70%)
Mutual labels:  multi-task-learning
temporal-depth-segmentation
Source code (train/test) accompanying the paper entitled "Veritatem Dies Aperit - Temporally Consistent Depth Prediction Enabled by a Multi-Task Geometric and Semantic Scene Understanding Approach" in CVPR 2019 (https://arxiv.org/abs/1903.10764).
Stars: ✭ 20 (+0%)
Mutual labels:  multi-task-learning
mtlearn
Multi-Task Learning package built with TensorFlow 2 (Multi-Gate Mixture of Experts, Cross-Stitch, Uncertainty Weighting)
Stars: ✭ 45 (+125%)
Mutual labels:  multi-task-learning
FOCAL-ICLR
Code for FOCAL Paper Published at ICLR 2021
Stars: ✭ 35 (+75%)
Mutual labels:  multi-task-learning
Mt Dnn
Multi-Task Deep Neural Networks for Natural Language Understanding
Stars: ✭ 1,871 (+9255%)
Mutual labels:  multi-task-learning
CPG
Steven C. Y. Hung, Cheng-Hao Tu, Cheng-En Wu, Chien-Hung Chen, Yi-Ming Chan, and Chu-Song Chen, "Compacting, Picking and Growing for Unforgetting Continual Learning," Thirty-third Conference on Neural Information Processing Systems, NeurIPS 2019
Stars: ✭ 91 (+355%)
Mutual labels:  multi-task-learning
torchMTL
A lightweight module for Multi-Task Learning in pytorch.
Stars: ✭ 84 (+320%)
Mutual labels:  multi-task-learning
EasyRec
A framework for large scale recommendation algorithms.
Stars: ✭ 599 (+2895%)
Mutual labels:  multi-task-learning
multi-task-learning
Multi-task learning smile detection, age and gender classification on GENKI4k, IMDB-Wiki dataset.
Stars: ✭ 154 (+670%)
Mutual labels:  multi-task-learning
Fine-Grained-or-Not
Code release for Your “Flamingo” is My “Bird”: Fine-Grained, or Not (CVPR 2021 Oral)
Stars: ✭ 32 (+60%)
Mutual labels:  multi-task-learning
Multi-task-Conditional-Attention-Networks
A prototype version of our submitted paper: Conversion Prediction Using Multi-task Conditional Attention Networks to Support the Creation of Effective Ad Creatives.
Stars: ✭ 21 (+5%)
Mutual labels:  multi-task-learning
emmental
A deep learning framework for building multimodal multi-task learning systems.
Stars: ✭ 93 (+365%)
Mutual labels:  multi-task-learning
cups-rl
Customisable Unified Physical Simulations (CUPS) for Reinforcement Learning. Experiments run on the ai2thor environment (http://ai2thor.allenai.org/) e.g. using A3C, RainbowDQN and A3C_GA (Gated Attention multi-modal fusion) for Task-Oriented Language Grounding (tasks specified by natural language instructions) e.g. "Pick up the Cup or else"
Stars: ✭ 38 (+90%)
Mutual labels:  multi-task-learning
DeepSegmentor
A Pytorch implementation of DeepCrack and RoadNet projects.
Stars: ✭ 152 (+660%)
Mutual labels:  multi-task-learning
PCC-Net
PCC Net: Perspective Crowd Counting via Spatial Convolutional Network
Stars: ✭ 63 (+215%)
Mutual labels:  multi-task-learning
DS-Net
(CVPR 2021, Oral) Dynamic Slimmable Network
Stars: ✭ 204 (+920%)
Mutual labels:  efficient-inference
OmiEmbed
Multi-task deep learning framework for multi-omics data analysis
Stars: ✭ 16 (-20%)
Mutual labels:  multi-task-learning

NeuralMerger

Official implementation of "Unifying and Merging Well-trained Deep Neural Networks for Inference Stage" (IJCAI 2018).
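The core idea of the paper is to merge two well-trained networks by sharing a single quantization codebook across their weights, so that both can be served from one compact lookup table at inference time. The toy sketch below (plain NumPy, scalar weights, made-up function names) illustrates that shared-codebook idea only; it is not the repository's actual implementation, which operates on sub-vectors and fine-tunes the merged model afterwards.

```python
# Illustrative sketch: jointly quantize two layers' weights against one
# shared codebook (1-D k-means), so both layers index the same table.
# Names and details here are invented for the example.
import numpy as np

def build_shared_codebook(w1, w2, k=16, iters=20, seed=0):
    """Cluster the scalar weights of both matrices into k shared centroids."""
    rng = np.random.default_rng(seed)
    vecs = np.concatenate([w1.reshape(-1, 1), w2.reshape(-1, 1)])
    # initialize centroids from the pooled weights of both networks
    codebook = rng.choice(vecs.ravel(), size=k, replace=False).reshape(k, 1)
    for _ in range(iters):
        # assign each weight to its nearest centroid, then recompute means
        idx = np.argmin(np.abs(vecs - codebook.T), axis=1)
        for c in range(k):
            members = vecs[idx == c]
            if len(members):
                codebook[c] = members.mean()
    return codebook

def quantize(w, codebook):
    """Replace each weight by the index of its nearest shared centroid."""
    return np.argmin(np.abs(w.reshape(-1, 1) - codebook.T), axis=1).reshape(w.shape)

def dequantize(idx, codebook):
    """Reconstruct approximate weights from centroid indices."""
    return codebook[idx].reshape(idx.shape)
```

After quantization, both networks store only small integer index maps plus the one shared codebook, which is what makes the merged model cheaper to store and run.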

Created by Yi-Min Chou, Yi-Ming Chan, Jia-Hong Lee, Chih-Yi Chiu, and Chu-Song Chen

Usage

Fine-tuning: Fine-tune the merged model built from two well-trained neural networks (TensorFlow implementation).

Inference: Test the speed of the merged model (C implementation).

NeuralMerger
    ├─────── Fine-tuning
    └─────── Inference

1. Clone the NeuralMerger repository:

$ git clone --recursive https://github.com/ivclab/NeuralMerger.git

2. Follow the instructions in Fine-tuning to obtain the well-trained merged model.

3. Test the well-trained merged model with Inference.

Citation

Please cite the following papers if this code helps your research:

@inproceedings{chou2018unifying,
  title={Unifying and merging well-trained deep neural networks for inference stage},
  author={Chou, Yi-Min and Chan, Yi-Ming and Lee, Jia-Hong and Chiu, Chih-Yi and Chen, Chu-Song},
  booktitle={Proceedings of the 27th International Joint Conference on Artificial Intelligence},
  pages={2049--2056},
  year={2018},
  organization={AAAI Press}
}

@inproceedings{chou2018merging,
  title={Merging Deep Neural Networks for Mobile Devices},
  author={Chou, Yi-Min and Chan, Yi-Ming and Lee, Jia-Hong and Chiu, Chih-Yi and Chen, Chu-Song},
  booktitle={Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops},
  pages={1686--1694},
  year={2018}
}

Contact

Please feel free to send suggestions or comments to Yi-Min Chou ([email protected]), Yi-Ming Chan ([email protected]), Jia-Hong Lee ([email protected]), Chih-Yi Chiu ([email protected]), or Chu-Song Chen ([email protected]).

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].