
AvivNavon / AuxiLearn

License: MIT
Official implementation of Auxiliary Learning by Implicit Differentiation [ICLR 2021]


Projects that are alternatives of or similar to AuxiLearn

GEANet-BioMed-Event-Extraction
Code for the paper Biomedical Event Extraction with Hierarchical Knowledge Graphs
Stars: ✭ 52 (-26.76%)
Mutual labels:  multitask-learning
modular semantic segmentation
Corresponding implementations for the IROS 2018 paper "Modular Sensor Fusion for Semantic Segmentation"
Stars: ✭ 24 (-66.2%)
Mutual labels:  semantic-segmentation
IAST-ECCV2020
IAST: Instance Adaptive Self-training for Unsupervised Domain Adaptation (ECCV 2020) https://teacher.bupt.edu.cn/zhuchuang/en/index.htm
Stars: ✭ 84 (+18.31%)
Mutual labels:  semantic-segmentation
pytorch-UNet
2D and 3D UNet implementation in PyTorch.
Stars: ✭ 107 (+50.7%)
Mutual labels:  semantic-segmentation
multichannel-semseg-with-uda
Multichannel Semantic Segmentation with Unsupervised Domain Adaptation
Stars: ✭ 19 (-73.24%)
Mutual labels:  semantic-segmentation
DocuNet
Code and dataset for the IJCAI 2021 paper "Document-level Relation Extraction as Semantic Segmentation".
Stars: ✭ 84 (+18.31%)
Mutual labels:  semantic-segmentation
Remote-sensing-image-semantic-segmentation-tf2
A remote sensing image semantic segmentation repository based on tf.keras, including backbone networks such as ResNet, DenseNet, and MobileNet, and segmentation networks such as DeepLabv3+, PSPNet, PANet, and RefineNet.
Stars: ✭ 54 (-23.94%)
Mutual labels:  semantic-segmentation
dnn.cool
A framework for multi-task learning in which you can precondition tasks and compose them into bigger tasks, with conditional objectives and per-task evaluations and interpretations.
Stars: ✭ 44 (-38.03%)
Mutual labels:  multitask-learning
MIRT.jl
MIRT: Michigan Image Reconstruction Toolbox (Julia version)
Stars: ✭ 80 (+12.68%)
Mutual labels:  optimization-algorithms
kits19-challenge
Kidney Tumor Segmentation Challenge 2019
Stars: ✭ 44 (-38.03%)
Mutual labels:  semantic-segmentation
FoggySynscapes
Semantic Understanding of Foggy Scenes with Purely Synthetic Data
Stars: ✭ 37 (-47.89%)
Mutual labels:  semantic-segmentation
Adversarial-Semisupervised-Semantic-Segmentation
Pytorch Implementation of "Adversarial Learning For Semi-Supervised Semantic Segmentation" for ICLR 2018 Reproducibility Challenge
Stars: ✭ 151 (+112.68%)
Mutual labels:  semantic-segmentation
Baidu Lane Segmentation
4th place solution in Baidu Autonomous Driving Lane Segmentation
Stars: ✭ 19 (-73.24%)
Mutual labels:  semantic-segmentation
Paper-Notes
Paper notes in deep learning/machine learning and computer vision
Stars: ✭ 37 (-47.89%)
Mutual labels:  semantic-segmentation
food-detection-yolov5
🍔🍟🍗 Food analysis baseline with Theseus. Integrate object detection, image classification and multi-class semantic segmentation. 🍞🍖🍕
Stars: ✭ 68 (-4.23%)
Mutual labels:  semantic-segmentation
caffe
Caffe: a fast open framework for deep learning.
Stars: ✭ 4,618 (+6404.23%)
Mutual labels:  semantic-segmentation
awesome-computer-vision-models
A list of popular deep learning models related to classification, segmentation and detection problems
Stars: ✭ 419 (+490.14%)
Mutual labels:  semantic-segmentation
optaplanner-quickstarts
OptaPlanner quick starts for AI optimization: many use cases shown in many different technologies.
Stars: ✭ 226 (+218.31%)
Mutual labels:  optimization-algorithms
EDANet
Implementation details for EDANet
Stars: ✭ 34 (-52.11%)
Mutual labels:  semantic-segmentation
SyConn
Toolkit for the generation and analysis of volume electron microscopy based synaptic connectomes of brain tissue.
Stars: ✭ 31 (-56.34%)
Mutual labels:  semantic-segmentation

AuxiLearn - Auxiliary Learning by Implicit Differentiation

This repository contains the source code to support the paper Auxiliary Learning by Implicit Differentiation, by Aviv Navon*, Idan Achituve*, Haggai Maron, Gal Chechik and Ethan Fetaya, ICLR 2021.


Links

  1. Paper
  2. Project page

Installation

Please note: we encountered issues and drops in performance when working across different PyTorch versions. Please install AuxiLearn in a clean virtual environment!

python3 -m venv <venv>
source <venv>/bin/activate

Within the clean virtual environment, clone the repo and install:

git clone https://github.com/AvivNavon/AuxiLearn.git
cd AuxiLearn
pip install .
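
To sanity-check the installation, you can try importing the package's main entry point (a quick check we suggest here; it assumes only that the install succeeded):

python -c "from auxilearn.optim import MetaOptimizer"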

Usage

Given a bi-level optimization problem in which the upper-level parameters (i.e., the auxiliary parameters) affect the upper-level objective only implicitly, you can use auxilearn to compute the upper-level gradients through implicit differentiation.
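
In symbols, with primary (lower-level) parameters $\theta$ and auxiliary (upper-level) parameters $\phi$, the problem reads (a sketch in our own notation, not taken verbatim from the paper):

$$\phi^{*} = \arg\min_{\phi} \; \mathcal{L}_{\text{aux}}\big(\theta^{*}(\phi)\big) \quad \text{s.t.} \quad \theta^{*}(\phi) = \arg\min_{\theta} \; \mathcal{L}_{\text{train}}(\theta, \phi)$$

Because $\theta^{*}(\phi)$ is defined only implicitly by the lower-level problem, $\nabla_{\phi}\mathcal{L}_{\text{aux}}$ is obtained via the implicit function theorem rather than by unrolling the inner optimization.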

The main component you will need is auxilearn.optim.MetaOptimizer. It is a wrapper over a PyTorch optimizer that updates the auxiliary parameters through implicit differentiation.

Code example

We assume two models, primary_model and auxiliary_model, and two dataloaders: primary_model is optimized on the training data from train_loader, and auxiliary_model is optimized on the auxiliary set from aux_loader. We assume a loss_func that returns the training loss when train=True and the auxiliary set loss when train=False (a minimal sketch of such a function appears after the example below). We also assume the training loss is a function of both the primary and the auxiliary parameters, while the loss on the auxiliary (or validation) set is a function of the primary parameters only. In auxiliary learning, the auxiliary set loss is the loss on the main task (see the paper for details).

import torch

from auxilearn.optim import MetaOptimizer

primary_model = MyModel()
auxiliary_model = MyAuxiliaryModel()
# optimizers
primary_optimizer = torch.optim.Adam(primary_model.parameters())

aux_lr = 1e-4
aux_base_optimizer = torch.optim.Adam(auxiliary_model.parameters(), lr=aux_lr)
aux_optimizer = MetaOptimizer(aux_base_optimizer, hpo_lr=aux_lr)

# training hyperparameters (placeholder values)
epochs = 100
aux_params_update_every = 30  # update the auxiliary parameters every 30 steps

# training loop
step = 0
for epoch in range(epochs):
    for batch in train_loader:
        step += 1
        # calculate batch loss using 'primary_model' and 'auxiliary_model'
        primary_optimizer.zero_grad()
        loss = loss_func(train=True)
        # update primary parameters
        loss.backward()
        primary_optimizer.step()
        
        # condition for updating auxiliary parameters
        if step % aux_params_update_every == 0:
            # calc current train loss
            train_set_loss = loss_func(train=True)
            # calc current auxiliary set loss - this is the loss over the main task
            auxiliary_set_loss = loss_func(train=False) 
            
            # update auxiliary parameters - no need to call loss.backward() or aux_optimizer.zero_grad()
            aux_optimizer.step(
                val_loss=auxiliary_set_loss,
                train_loss=train_set_loss,
                aux_params=auxiliary_model.parameters(),
                parameters=primary_model.parameters(),
            )
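
For completeness, here is a minimal sketch of the loss_func assumed above. The batch format, the two task heads, and the use of auxiliary_model as a module that maps per-task losses to a single combined training loss are illustrative assumptions, not the repository's actual API:

import torch
import torch.nn.functional as F

def loss_func(train=True):
    if train:
        # hypothetical batch format: inputs, main-task labels, auxiliary-task labels
        x, y_main, y_aux = next(iter(train_loader))
        pred_main, pred_aux = primary_model(x)
        task_losses = torch.stack([
            F.cross_entropy(pred_main, y_main),
            F.cross_entropy(pred_aux, y_aux),
        ])
        # the training loss depends on both parameter sets: the auxiliary
        # model combines the per-task losses into a single scalar
        return auxiliary_model(task_losses)
    # auxiliary set loss: the main-task loss only, a function of the
    # primary parameters alone
    x, y_main = next(iter(aux_loader))
    pred_main, _ = primary_model(x)
    return F.cross_entropy(pred_main, y_main)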

Citation

If you find auxilearn to be useful in your own research, please consider citing the following paper:

@inproceedings{navon2021auxiliary,
  title={Auxiliary Learning by Implicit Differentiation},
  author={Aviv Navon and Idan Achituve and Haggai Maron and Gal Chechik and Ethan Fetaya},
  booktitle={International Conference on Learning Representations},
  year={2021},
  url={https://openreview.net/forum?id=n7wIfYPdVet}
}