
alshedivat / fedpa

License: Apache-2.0
Federated posterior averaging implemented in JAX

Programming Languages

  • Jupyter Notebook
  • Python

Projects that are alternatives of or similar to fedpa

GPJax
A didactic Gaussian process package for researchers in Jax.
Stars: ✭ 159 (+318.42%)
Mutual labels:  jax
Federated-Learning-and-Split-Learning-with-raspberry-pi
SRDS 2020: End-to-End Evaluation of Federated Learning and Split Learning for Internet of Things
Stars: ✭ 54 (+42.11%)
Mutual labels:  federated-learning
GrouProx
FedGroup, A Clustered Federated Learning framework based on Tensorflow
Stars: ✭ 20 (-47.37%)
Mutual labels:  federated-learning
flPapers
Paper collection of federated learning. Conferences and Journals Collection for Federated Learning from 2019 to 2021, Accepted Papers, Hot topics and good research groups. Paper summary
Stars: ✭ 76 (+100%)
Mutual labels:  federated-learning
srijan-gsoc-2020
Healthcare-Researcher-Connector Package: Federated Learning tool for bridging the gap between Healthcare providers and researchers
Stars: ✭ 17 (-55.26%)
Mutual labels:  federated-learning
jaxfg
Factor graphs and nonlinear optimization for JAX
Stars: ✭ 124 (+226.32%)
Mutual labels:  jax
uvadlc notebooks
Repository of Jupyter notebook tutorials for teaching the Deep Learning Course at the University of Amsterdam (MSc AI), Fall 2022/Spring 2022
Stars: ✭ 901 (+2271.05%)
Mutual labels:  jax
substra
Substra is a framework for traceable ML orchestration on decentralized sensitive data.
Stars: ✭ 143 (+276.32%)
Mutual labels:  federated-learning
dm pix
PIX is an image processing library in JAX, for JAX.
Stars: ✭ 271 (+613.16%)
Mutual labels:  jax
decentralized-ml
Full stack service enabling decentralized machine learning on private data
Stars: ✭ 50 (+31.58%)
Mutual labels:  federated-learning
PyVertical
Privacy Preserving Vertical Federated Learning
Stars: ✭ 133 (+250%)
Mutual labels:  federated-learning
efficientnet-jax
EfficientNet, MobileNetV3, MobileNetV2, MixNet, etc in JAX w/ Flax Linen and Objax
Stars: ✭ 114 (+200%)
Mutual labels:  jax
communication-in-cross-silo-fl
Official code for "Throughput-Optimal Topology Design for Cross-Silo Federated Learning" (NeurIPS'20)
Stars: ✭ 19 (-50%)
Mutual labels:  federated-learning
distributed-learning-contributivity
Simulate collaborative ML scenarios, experiment multi-partner learning approaches and measure respective contributions of different datasets to model performance.
Stars: ✭ 49 (+28.95%)
Mutual labels:  federated-learning
pFedMe
Personalized Federated Learning with Moreau Envelopes (pFedMe) using Pytorch (NeurIPS 2020)
Stars: ✭ 196 (+415.79%)
Mutual labels:  federated-learning
PyAriesFL
Federated Learning on HyperLedger Aries
Stars: ✭ 19 (-50%)
Mutual labels:  federated-learning
federated-learning-poc
Proof of Concept of a Federated Learning framework that maintains the privacy of the participants involved.
Stars: ✭ 13 (-65.79%)
Mutual labels:  federated-learning
robustness-vit
Contains code for the paper "Vision Transformers are Robust Learners" (AAAI 2022).
Stars: ✭ 78 (+105.26%)
Mutual labels:  jax
chef-transformer
Chef Transformer 🍲.
Stars: ✭ 29 (-23.68%)
Mutual labels:  jax
wax-ml
A Python library for machine-learning and feedback loops on streaming data
Stars: ✭ 36 (-5.26%)
Mutual labels:  jax

Federated Learning via Posterior Averaging

Open In Colab Code style: black

FedAvg vs. FedPA

This repository contains a minimalistic (yet general and modular) JAX implementation of the federated posterior averaging (FedPA) algorithm, along with a variety of simulation experiments on synthetically generated problems.

Usage

The easiest way to reproduce our synthetic experiments and/or compare FedAvg and FedPA is via the Colab notebook provided in this repository (simply click the Open in Colab button at the top of this README). If you would like to use our JAX implementation of FedAvg or FedPA elsewhere, the federated/ folder can be used as a standalone Python package.

Organization of the code

All the code is located under the federated folder and organized into multiple sub-modules:

  • objectives: Contains implementations of synthetic objective functions. We assume that each client is represented by a corresponding objective. The objective is based on the client's data and can either (1) be evaluated at a given point (i.e., a parameter vector) or (2) return a (stochastic) gradient at that point.
  • inference: Contains functions for computing client updates: running SGD (and its variations), performing posterior sampling, and estimating the moments of a distribution from samples.
  • learning: Contains implementations of the learning algorithms (FedAvg, FedPA, and their variations).
  • utils: Contains various utility functions, e.g., for timing code or plotting figures.
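For illustration, here is a minimal sketch of what a synthetic client objective might look like in JAX. The class name and method signatures below are assumptions made for this example, not the actual API of the `objectives` module:

```python
import jax
import jax.numpy as jnp

class QuadraticObjective:
    """A least-squares objective built from a client's synthetic data (X, y).

    Hypothetical sketch: the name and signatures here are illustrative,
    not the actual `federated.objectives` interface.
    """

    def __init__(self, X, y):
        self.X, self.y = X, y

    def __call__(self, params):
        # (1) Evaluate the objective at a given point (parameter vector).
        residuals = self.X @ params - self.y
        return 0.5 * jnp.mean(residuals ** 2)

    def grad(self, params, key=None, batch_size=None):
        # (2) Return a gradient at `params`; with a PRNG key and batch size,
        # the gradient is stochastic (computed on a random minibatch).
        if key is not None and batch_size is not None:
            idx = jax.random.choice(key, self.X.shape[0], (batch_size,), replace=False)
            return jax.grad(QuadraticObjective(self.X[idx], self.y[idx]))(params)
        return jax.grad(self)(params)

# Usage: at the exact solution, the full-batch gradient vanishes.
key = jax.random.PRNGKey(0)
X = jax.random.normal(key, (32, 4))
w_true = jnp.arange(4.0)
objective = QuadraticObjective(X, X @ w_true)
g = objective.grad(w_true)  # gradient of shape (4,), approximately zero
```

Keeping the objective callable (rather than exposing raw data) is what lets the learning loop stay agnostic to how each client's loss is defined.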

A note on implementation. Our minimalistic library is implemented in a functional style. The main learning function is fed_opt (defined in federated.learning.algorithms), which implements generalized federated optimization (corresponding to Algorithm 1 in our paper, originally proposed by Reddi*, Charles*, et al. (2020)). fed_opt takes client_update_fn and server_update_fn functions as arguments, which are used for computing client and server updates, respectively. FedAvg and FedPA are implemented by providing the corresponding client_update_fn and server_update_fn arguments to fed_opt.
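The pattern can be sketched as follows. This toy loop and the FedAvg-style plug-ins are illustrative assumptions only; the real fed_opt in federated.learning.algorithms has a different, richer signature:

```python
import jax.numpy as jnp

def fed_opt(client_grads, client_update_fn, server_update_fn, init_state, num_rounds):
    """Toy version of the generalized federated optimization loop (sketch)."""
    state = init_state
    for _ in range(num_rounds):
        # Each client computes an update ("delta") from the current state.
        deltas = [client_update_fn(grad_fn, state) for grad_fn in client_grads]
        # The server aggregates the deltas into the next global state.
        state = server_update_fn(state, deltas)
    return state

# FedAvg-style plug-ins: clients take local gradient steps, the server averages.
def client_update_fn(grad_fn, params, lr=0.1, num_steps=10):
    local = params
    for _ in range(num_steps):
        local = local - lr * grad_fn(local)
    return local - params  # the client's delta

def server_update_fn(params, deltas):
    return params + jnp.mean(jnp.stack(deltas), axis=0)

# Two toy quadratic clients with optima at `targets`; FedAvg's fixed point
# here is the mean of the targets, i.e. [2.0, 0.0].
targets = [jnp.array([1.0, -1.0]), jnp.array([3.0, 1.0])]
client_grads = [lambda p, t=t: p - t for t in targets]  # grad of 0.5*||p - t||^2
final = fed_opt(client_grads, client_update_fn, server_update_fn,
                init_state=jnp.zeros(2), num_rounds=50)
```

Swapping in a different pair of update functions (e.g., a posterior-sampling client update) changes the algorithm without touching the outer loop, which is the point of the functional design.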

Reproducing results on FL benchmark tasks

This mini-library does NOT support running experiments on FL benchmark tasks such as EMNIST, CIFAR100, etc. If you would like to run FedPA on these benchmarks, please use our TFF implementation.

Citation

@inproceedings{alshedivat2021federated,
  title={Federated Learning via Posterior Averaging: A New Perspective and Practical Algorithms},
  author={Al-Shedivat, Maruan and Gillenwater, Jennifer and Xing, Eric and Rostamizadeh, Afshin},
  booktitle={International Conference on Learning Representations (ICLR)},
  year={2021}
}

License

Apache 2.0

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].