AI-secure / CRFL

Licence: other
CRFL: Certifiably Robust Federated Learning against Backdoor Attacks (ICML 2021)

Programming Languages

python

Projects that are alternatives of or similar to CRFL

PFL-Non-IID
Non-IID data in federated learning originates from user personalization: each user generates data with a different distribution. A myriad of approaches have been proposed to address the Non-IID (Not Independent and Identically Distributed) issue; in contrast, personalized federated learning may take advantage…
Stars: ✭ 58 (+31.82%)
Mutual labels:  federated-learning
baai-federated-learning-crane-baseline
Electric Power AI Data Competition: hydraulic crane object detection track
Stars: ✭ 17 (-61.36%)
Mutual labels:  federated-learning
FedDANE
FedDANE: A Federated Newton-Type Method (Asilomar Conference on Signals, Systems, and Computers ‘19)
Stars: ✭ 25 (-43.18%)
Mutual labels:  federated-learning
easyFL
An experimental platform to quickly realize and compare with popular centralized federated learning algorithms. A realization of federated learning algorithm on fairness (FedFV, Federated Learning with Fair Averaging, https://fanxlxmu.github.io/publication/ijcai2021/) was accepted by IJCAI-21 (https://www.ijcai.org/proceedings/2021/223).
Stars: ✭ 104 (+136.36%)
Mutual labels:  federated-learning
federated-xgboost
Federated gradient boosted decision tree learning
Stars: ✭ 39 (-11.36%)
Mutual labels:  federated-learning
Awesome Mlops
A curated list of references for MLOps
Stars: ✭ 7,119 (+16079.55%)
Mutual labels:  federated-learning
backdoors101
Backdoors Framework for Deep Learning and Federated Learning. A light-weight tool to conduct your research on backdoors.
Stars: ✭ 181 (+311.36%)
Mutual labels:  federated-learning
FedScale
FedScale is a scalable and extensible open-source federated learning (FL) platform.
Stars: ✭ 274 (+522.73%)
Mutual labels:  federated-learning
Challenge
The repo for the FeTS Challenge
Stars: ✭ 21 (-52.27%)
Mutual labels:  federated-learning
DeML-Golem
Proof Of Concept of DEcentralised Machine Learning on top of the Golem (https://golem.network/) architecture
Stars: ✭ 35 (-20.45%)
Mutual labels:  federated-learning
FATE-Serving
A scalable, high-performance serving system for federated learning models
Stars: ✭ 107 (+143.18%)
Mutual labels:  federated-learning
Front-End
Federated Learning based Deep Learning. Docs: https://fets-ai.github.io/Front-End/
Stars: ✭ 35 (-20.45%)
Mutual labels:  federated-learning
Pysyft
A library for answering questions using data you cannot see
Stars: ✭ 7,811 (+17652.27%)
Mutual labels:  federated-learning
Federated-Learning-Mini-Framework
Federated Learning mini-framework with Keras
Stars: ✭ 38 (-13.64%)
Mutual labels:  federated-learning
NIID-Bench
Federated Learning on Non-IID Data Silos: An Experimental Study (ICDE 2022)
Stars: ✭ 304 (+590.91%)
Mutual labels:  federated-learning
FedLab-benchmarks
Standard federated learning implementations in FedLab and FL benchmarks.
Stars: ✭ 49 (+11.36%)
Mutual labels:  federated-learning
Fate
An Industrial Grade Federated Learning Framework
Stars: ✭ 3,775 (+8479.55%)
Mutual labels:  federated-learning
FedFusion
The implementation of "Towards Faster and Better Federated Learning: A Feature Fusion Approach" (ICIP 2019)
Stars: ✭ 30 (-31.82%)
Mutual labels:  federated-learning
Awesome-Federated-Learning-on-Graph-and-GNN-papers
Federated learning on graph, especially on graph neural networks (GNNs), knowledge graph, and private GNN.
Stars: ✭ 206 (+368.18%)
Mutual labels:  federated-learning
FedDA
Source code for 'Dual Attention Based FL for Wireless Traffic Prediction'
Stars: ✭ 41 (-6.82%)
Mutual labels:  federated-learning

CRFL

This repository contains the code for our ICML 2021 paper CRFL: Certifiably Robust Federated Learning against Backdoor Attacks.

Installation

  1. Create a virtual environment via conda.

    conda create -n crfl python=3.6
    source activate crfl
  2. Install torch and torchvision according to your CUDA version, following the instructions at PyTorch. For example,

    conda install pytorch cudatoolkit=10.1 torchvision -c pytorch
  3. Install requirements.

    pip install -r requirements.txt

Dataset

  1. MNIST and EMNIST: these datasets are downloaded automatically into the dir ./data during training or testing.

  2. LOAN: Download the raw dataset loan.csv from Google Drive into the dir ./data.
    Run

    python utils/loan_preprocess.py

    This produces 51 csv files in ./data/loan/.
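Each per-state csv file can then serve as one federated client. The following is a minimal sketch of that partitioning, using fabricated toy files and hypothetical column names rather than the repo's actual loader:

```python
import csv
import glob
import os
import tempfile

# Toy stand-in for ./data/loan/: fabricate two tiny per-state CSVs.
# Column names here are hypothetical, not the repo's actual schema.
data_dir = tempfile.mkdtemp()
for state in ("CA", "NY"):
    with open(os.path.join(data_dir, f"loan_{state}.csv"), "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["loan_amnt", "int_rate", "label"])
        writer.writerow([10000, 13.5, 0])

# One federated client per per-state file.
clients = {}
for path in sorted(glob.glob(os.path.join(data_dir, "loan_*.csv"))):
    with open(path) as f:
        clients[os.path.basename(path)] = list(csv.DictReader(f))

print(len(clients))  # 2 clients in this toy example
```

This per-state split is what makes LOAN a naturally non-IID federated benchmark.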

Get Started

  1. First, train the FL models on the three datasets:
python main.py --params configs/mnist_params.yaml
python main.py --params configs/emnist_params.yaml
python main.py --params configs/loan_params.yaml

Hyperparameters can be changed according to the comments in those yaml files (configs/mnist_params.yaml, configs/emnist_params.yaml, configs/loan_params.yaml) to reproduce our experiments.
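At a high level, CRFL's server-side defense clips the aggregated global model to a norm bound and perturbs it with Gaussian noise each round. A minimal sketch in plain Python, with illustrative names and toy parameter vectors rather than the repo's API:

```python
import math
import random

def aggregate_clip_noise(client_models, clip_norm, sigma, rng):
    """One server round, sketched: FedAvg the client parameter vectors,
    clip the result to an L2 ball of radius clip_norm, then add
    isotropic Gaussian noise. Names are illustrative, not the repo's API."""
    n = len(client_models)
    dim = len(client_models[0])
    # FedAvg: coordinate-wise mean of the client parameter vectors.
    avg = [sum(m[i] for m in client_models) / n for i in range(dim)]
    # Clip the global model to an L2 ball of radius clip_norm.
    norm = math.sqrt(sum(x * x for x in avg))
    if norm > clip_norm:
        avg = [x * clip_norm / norm for x in avg]
    # Perturb with Gaussian noise (the smoothing step).
    return [x + rng.gauss(0.0, sigma) for x in avg]

rng = random.Random(0)
clients = [[1.0, 2.0], [3.0, 4.0]]  # two toy client models
global_model = aggregate_clip_noise(clients, clip_norm=1.0, sigma=0.01, rng=rng)
print(len(global_model))  # 2
```

The clipping radius and noise level correspond to hyperparameters exposed in the yaml configs; their exact names there may differ.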

  2. Second, perform parameter smoothing for the global models on the three datasets:
python smooth_mnist.py
python smooth_emnist.py
python smooth_loan.py

The filepaths of models can be changed in those yaml files (configs/mnist_smooth_params.yaml, configs/emnist_smooth_params.yaml, configs/loan_smooth_params.yaml).
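Parameter smoothing at test time can be read as Monte-Carlo randomized smoothing over model parameters: classify with many noise-perturbed copies of the trained global model and take a majority vote. A toy sketch, where the `classify` callable is a hypothetical stand-in for the real forward pass:

```python
import random
from collections import Counter

def smoothed_predict(params, classify, sigma, n_samples, rng):
    """Monte-Carlo parameter smoothing, sketched: run the classifier with
    many noise-perturbed copies of the parameters and majority-vote."""
    votes = Counter()
    for _ in range(n_samples):
        noisy = [p + rng.gauss(0.0, sigma) for p in params]
        votes[classify(noisy)] += 1
    return votes.most_common(1)[0][0]

# Toy "model": predicts class 1 iff its first parameter is positive.
classify = lambda p: int(p[0] > 0)
rng = random.Random(0)
pred = smoothed_predict([0.5, -0.2], classify, sigma=0.1, n_samples=100, rng=rng)
print(pred)  # 1
```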

  3. Third, plot the certified accuracy and certified rate for the three datasets:
python certify_mnist.py
python certify_emnist.py
python certify_loan.py
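For intuition, a standard Gaussian randomized-smoothing certificate has the closed form R = sigma/2 * (Phi^-1(p_a) - Phi^-1(p_b)). CRFL's actual bound on backdoor magnitude differs in its details (see the paper), but the following stdlib-only sketch shows the shape of the computation behind a certified radius:

```python
from statistics import NormalDist

def certified_radius(p_a, p_b, sigma):
    """Illustrative Gaussian-smoothing radius
    R = sigma/2 * (Phi^-1(p_a) - Phi^-1(p_b)), where p_a and p_b are
    the estimated probabilities of the top two classes under noise.
    The bound actually used by certify_*.py may be paper-specific."""
    inv_cdf = NormalDist().inv_cdf
    return 0.5 * sigma * (inv_cdf(p_a) - inv_cdf(p_b))

r = certified_radius(0.9, 0.1, sigma=0.01)
print(round(r, 4))  # 0.0128
```

A larger margin between p_a and p_b, or a larger sigma, yields a larger certified radius, which is the trade-off the certified accuracy curves visualize.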

Citation

If you find our work useful in your research, please consider citing:

@InProceedings{pmlr-v139-xie21a,
  title = 	 {CRFL: Certifiably Robust Federated Learning against Backdoor Attacks},
  author =       {Xie, Chulin and Chen, Minghao and Chen, Pin-Yu and Li, Bo},
  booktitle = 	 {Proceedings of the 38th International Conference on Machine Learning},
  pages = 	 {11372--11382},
  year = 	 {2021},
  volume = 	 {139},
  series = 	 {Proceedings of Machine Learning Research},
  month = 	 {18--24 Jul},
  publisher =    {PMLR},
  pdf = 	 {http://proceedings.mlr.press/v139/xie21a/xie21a.pdf},
  url = 	 {http://proceedings.mlr.press/v139/xie21a.html},
}