
QinbinLi / MOON

License: MIT
Model-Contrastive Federated Learning (CVPR 2021)

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to MOON

CCL
PyTorch implementation of the CVPR 2021 paper "Distilling Audio-Visual Knowledge by Compositional Contrastive Learning"
Stars: ✭ 76 (-18.28%)
Mutual labels:  contrastive-learning, cvpr2021
FedReID
Implementation of Federated Learning for Person Re-identification (code for the ACM MM 2020 paper)
Stars: ✭ 68 (-26.88%)
Mutual labels:  federated-learning
GrouProx
FedGroup, a clustered federated learning framework based on TensorFlow
Stars: ✭ 20 (-78.49%)
Mutual labels:  federated-learning
MiVOS
[CVPR 2021] Modular Interactive Video Object Segmentation: Interaction-to-Mask, Propagation and Difference-Aware Fusion. Semi-supervised VOS as well!
Stars: ✭ 302 (+224.73%)
Mutual labels:  cvpr2021
VarCLR
VarCLR: Variable Semantic Representation Pre-training via Contrastive Learning
Stars: ✭ 30 (-67.74%)
Mutual labels:  contrastive-learning
G-SimCLR
This is the code base for paper "G-SimCLR : Self-Supervised Contrastive Learning with Guided Projection via Pseudo Labelling" by Souradip Chakraborty, Aritra Roy Gosthipaty and Sayak Paul.
Stars: ✭ 69 (-25.81%)
Mutual labels:  contrastive-learning
communication-in-cross-silo-fl
Official code for "Throughput-Optimal Topology Design for Cross-Silo Federated Learning" (NeurIPS'20)
Stars: ✭ 19 (-79.57%)
Mutual labels:  federated-learning
RSTNet
RSTNet: Captioning with Adaptive Attention on Visual and Non-Visual Words (CVPR 2021)
Stars: ✭ 71 (-23.66%)
Mutual labels:  cvpr2021
fedpa
Federated posterior averaging implemented in JAX
Stars: ✭ 38 (-59.14%)
Mutual labels:  federated-learning
SimCLR
PyTorch implementation of "A Simple Framework for Contrastive Learning of Visual Representations"
Stars: ✭ 65 (-30.11%)
Mutual labels:  contrastive-learning
LPDC-Net
CVPR 2021 paper "Learning Parallel Dense Correspondence from Spatio-Temporal Descriptors for Efficient and Robust 4D Reconstruction"
Stars: ✭ 27 (-70.97%)
Mutual labels:  cvpr2021
efficient-annotation-cookbook
Official implementation of "Towards Good Practices for Efficiently Annotating Large-Scale Image Classification Datasets" (CVPR2021)
Stars: ✭ 54 (-41.94%)
Mutual labels:  cvpr2021
substra
Substra is a framework for traceable ML orchestration on decentralized sensitive data.
Stars: ✭ 143 (+53.76%)
Mutual labels:  federated-learning
semantic-guidance
Code for our CVPR-2021 paper on Combining Semantic Guidance and Deep Reinforcement Learning For Generating Human Level Paintings.
Stars: ✭ 19 (-79.57%)
Mutual labels:  cvpr2021
contrastive loss
Experiments with supervised contrastive learning methods with different loss functions
Stars: ✭ 143 (+53.76%)
Mutual labels:  contrastive-learning
decentralized-ml
Full stack service enabling decentralized machine learning on private data
Stars: ✭ 50 (-46.24%)
Mutual labels:  federated-learning
Scon-ABSA
[CIKM 2021] Enhancing Aspect-Based Sentiment Analysis with Supervised Contrastive Learning
Stars: ✭ 17 (-81.72%)
Mutual labels:  contrastive-learning
pFedMe
Personalized Federated Learning with Moreau Envelopes (pFedMe) using Pytorch (NeurIPS 2020)
Stars: ✭ 196 (+110.75%)
Mutual labels:  federated-learning
cl-ica
Code for the paper "Contrastive Learning Inverts the Data Generating Process".
Stars: ✭ 65 (-30.11%)
Mutual labels:  contrastive-learning
AODA
Official implementation of "Adversarial Open Domain Adaptation for Sketch-to-Photo Synthesis" (WACV 2022 / CVPRW 2021)
Stars: ✭ 44 (-52.69%)
Mutual labels:  cvpr2021

Model-Contrastive Federated Learning

This is the code for the paper Model-Contrastive Federated Learning (CVPR 2021).

Abstract: Federated learning enables multiple parties to collaboratively train a machine learning model without communicating their local data. A key challenge in federated learning is to handle the heterogeneity of local data distributions across parties. Although many studies have been proposed to address this challenge, we find that they fail to achieve high performance on image datasets with deep learning models. In this paper, we propose MOON: model-contrastive federated learning. MOON is a simple and effective federated learning framework. The key idea of MOON is to utilize the similarity between model representations to correct the local training of individual parties, i.e., conducting contrastive learning at the model level. Our extensive experiments show that MOON significantly outperforms the other state-of-the-art federated learning algorithms on various image classification tasks.
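
As the abstract describes, each party's local objective combines the usual supervised loss with a model-contrastive term that pulls the current local representation toward the global model's representation (the positive pair) and pushes it away from the previous local model's representation (the negative pair). A minimal PyTorch sketch of this loss; the function and variable names are illustrative, not the repository's API:

import torch
import torch.nn.functional as F

def moon_local_loss(logits, targets, z_local, z_global, z_prev, mu=5.0, tau=0.5):
    # Standard supervised term on the local data.
    sup = F.cross_entropy(logits, targets)
    # Cosine similarities against the global (positive) and previous local (negative) models.
    pos = F.cosine_similarity(z_local, z_global, dim=-1) / tau
    neg = F.cosine_similarity(z_local, z_prev, dim=-1) / tau
    # Model-contrastive term: -log( exp(pos) / (exp(pos) + exp(neg)) ).
    con = -torch.log(torch.exp(pos) / (torch.exp(pos) + torch.exp(neg)))
    return sup + mu * con.mean()

Here z_global and z_prev would be computed with frozen copies of the respective models, so gradients flow only through z_local; mu and tau correspond to the mu and temperature parameters below.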

Dependencies

  • PyTorch >= 1.0.0
  • torchvision >= 0.2.1
  • scikit-learn >= 0.23.1

Parameters

Parameter          Description
---------          -----------
model              The model architecture. Options: simple-cnn, resnet50.
alg                The training algorithm. Options: moon, fedavg, fedprox, local_training.
dataset            The dataset to use. Options: cifar10, cifar100, tinyimagenet.
lr                 Learning rate.
batch-size         Batch size.
epochs             Number of local epochs.
n_parties          Number of parties.
sample_fraction    The fraction of parties sampled in each round.
comm_round         Number of communication rounds.
partition          The data partition approach. Options: noniid, iid.
beta               The concentration parameter of the Dirichlet distribution for the non-IID
                   partition (illustrated in the sketch after this table).
mu                 The coefficient of MOON's contrastive term and FedProx's proximal term.
temperature        The temperature parameter for MOON.
out_dim            The output dimension of the projection head.
datadir            The path of the dataset.
logdir             The path to store the logs.
device             The device to run the program on.
seed               The initial seed.
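
For the noniid partition, a common construction (which the beta parameter suggests) samples, for each class, a Dirichlet(beta) vector over parties and splits that class's indices proportionally; smaller beta yields more skewed local label distributions. A minimal NumPy sketch of this scheme, not necessarily the repository's exact implementation:

import numpy as np

def dirichlet_partition(labels, n_parties, beta, seed=0):
    # Split sample indices across parties; per class, proportions ~ Dir(beta).
    rng = np.random.default_rng(seed)
    party_idxs = [[] for _ in range(n_parties)]
    for c in np.unique(labels):
        idx_c = np.flatnonzero(labels == c)
        rng.shuffle(idx_c)
        proportions = rng.dirichlet(np.full(n_parties, beta))
        # Cumulative proportions -> split points within this class's indices.
        splits = (np.cumsum(proportions)[:-1] * len(idx_c)).astype(int)
        for party, part in enumerate(np.split(idx_c, splits)):
            party_idxs[party].extend(part.tolist())
    return party_idxs

With beta=0.5 and 10 parties, each party typically ends up with a few dominant classes; as beta grows, the partition approaches IID.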

Usage

Here is an example to run MOON on CIFAR-10 with a simple CNN:

python main.py --dataset=cifar10 \
    --model=simple-cnn \
    --alg=moon \
    --lr=0.01 \
    --mu=5 \
    --epochs=10 \
    --comm_round=100 \
    --n_parties=10 \
    --partition=noniid \
    --beta=0.5 \
    --logdir='./logs/' \
    --datadir='./data/'

Tiny-ImageNet

You can download Tiny-ImageNet here. Then, reformat the validation folder so that the images are grouped into one subfolder per class, as sketched below.
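
In the standard archive, all validation images sit in a single val/images folder with labels listed in val_annotations.txt, while torchvision-style loaders expect one subfolder per class. A hedged sketch of the usual reorganization; the dataset path is an assumption, adjust it to your datadir:

import os
import shutil

VAL_DIR = "./data/tiny-imagenet-200/val"  # assumed location; match your --datadir

# Each line of val_annotations.txt: <filename>\t<wnid>\t<bounding box fields>
with open(os.path.join(VAL_DIR, "val_annotations.txt")) as f:
    for line in f:
        if not line.strip():
            continue
        filename, wnid = line.split("\t")[:2]
        class_dir = os.path.join(VAL_DIR, wnid, "images")
        os.makedirs(class_dir, exist_ok=True)
        shutil.move(os.path.join(VAL_DIR, "images", filename),
                    os.path.join(class_dir, filename))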

Hyperparameters

If you use the same setting as our paper, you can simply adopt the hyperparameters reported there. If you try a setting different from the paper, please tune the hyperparameters of MOON. You may tune mu from {0.001, 0.01, 0.1, 1, 5, 10}. If you have sufficient computing resources, you may also tune temperature from {0.1, 0.5, 1.0} and the output dimension of the projection head from {64, 128, 256}.
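
To automate such a sweep, a small driver script can loop over the grid. Below is a sketch, assuming main.py accepts the flags documented in the Parameters section; the per-run log directories are illustrative:

import subprocess

for mu in [0.001, 0.01, 0.1, 1, 5, 10]:
    # One full MOON run per candidate value of mu.
    subprocess.run([
        "python", "main.py",
        "--dataset=cifar10", "--model=simple-cnn", "--alg=moon",
        f"--mu={mu}", "--lr=0.01", "--epochs=10", "--comm_round=100",
        "--n_parties=10", "--partition=noniid", "--beta=0.5",
        f"--logdir=./logs/mu_{mu}/", "--datadir=./data/",
    ], check=True)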

Citation

Please cite our paper if you find this code useful for your research.

@inproceedings{li2021model,
      title={Model-Contrastive Federated Learning}, 
      author={Qinbin Li and Bingsheng He and Dawn Song},
      booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
      year={2021},
}