
dilawarm / federated

License: CC BY 4.0
Bachelor's Thesis in Computer Science: Privacy-Preserving Federated Learning Applied to Decentralized Data

Programming Languages

Python
139335 projects - #7 most used programming language
Jupyter Notebook
11667 projects

Projects that are alternatives to or similar to federated

Awesome-Federated-Machine-Learning
Everything about federated learning, including research papers, books, codes, tutorials, videos and beyond
Stars: ✭ 190 (+660%)
Mutual labels:  differential-privacy, federated-learning, privacy-preserving-machine-learning
federated pca
Federated Principal Component Analysis Revisited!
Stars: ✭ 30 (+20%)
Mutual labels:  differential-privacy, federated-learning
srijan-gsoc-2020
Healthcare-Researcher-Connector Package: Federated Learning tool for bridging the gap between Healthcare providers and researchers
Stars: ✭ 17 (-32%)
Mutual labels:  differential-privacy, federated-learning
decentralized-ml
Full stack service enabling decentralized machine learning on private data
Stars: ✭ 50 (+100%)
Mutual labels:  federated-learning, privacy-preserving-machine-learning
PFL-Non-IID
The Non-IID phenomenon originates in the personalization of users, who generate Non-IID (Not Independent and Identically Distributed) data. A myriad of approaches have been proposed to tackle the Non-IID issues that arise in the federated learning setting. In contrast, personalized federated learning may take advantage…
Stars: ✭ 58 (+132%)
Mutual labels:  differential-privacy, federated-learning
awesome-secure-computation
Awesome list for cryptographic secure computation paper. This repo includes *Lattice*, *DifferentialPrivacy*, *MPC* and also a comprehensive summary for top conferences.
Stars: ✭ 125 (+400%)
Mutual labels:  homomorphic-encryption, differential-privacy
federated-learning-poc
Proof of Concept of a Federated Learning framework that maintains the privacy of the participants involved.
Stars: ✭ 13 (-48%)
Mutual labels:  homomorphic-encryption, federated-learning
ecelgamal
Additive homomorphic EC-ElGamal
Stars: ✭ 19 (-24%)
Mutual labels:  homomorphic-encryption
WeDPR-Lab-iOS-SDK
iOS SDK of WeDPR-Lab-Core; the iOS SDK for the core algorithm components of WeDPR, a ready-to-use, scenario-oriented, and efficient privacy-preserving solution
Stars: ✭ 13 (-48%)
Mutual labels:  homomorphic-encryption
encrypted-geofence
testing out homomorphic encryption
Stars: ✭ 33 (+32%)
Mutual labels:  homomorphic-encryption
gomorph
Implementing Homomorphic Encryption in Golang 🌱
Stars: ✭ 76 (+204%)
Mutual labels:  homomorphic-encryption
he-toolkit
The Intel Homomorphic Encryption (HE) toolkit is the primary vehicle for the continuous delivery of Intel's HE innovations to users. The toolkit is designed with usability in mind, making it easier to evaluate and deploy homomorphic encryption technology on Intel platforms.
Stars: ✭ 40 (+60%)
Mutual labels:  homomorphic-encryption
node-seal
Homomorphic Encryption for TypeScript or JavaScript - Microsoft SEAL
Stars: ✭ 139 (+456%)
Mutual labels:  homomorphic-encryption
threshold-signatures
Threshold Signature Scheme for ECDSA
Stars: ✭ 79 (+216%)
Mutual labels:  homomorphic-encryption
FedScale
FedScale is a scalable and extensible open-source federated learning (FL) platform.
Stars: ✭ 274 (+996%)
Mutual labels:  federated-learning
concrete-numpy
Concrete Numpy is a Python package that contains the tools data scientists need to compile various NumPy functions into their Fully Homomorphic Encryption (FHE) equivalents. Concrete Numpy builds on top of the Concrete Library and its Compiler.
Stars: ✭ 111 (+344%)
Mutual labels:  homomorphic-encryption
libshe
Symmetric somewhat homomorphic encryption library based on DGHV
Stars: ✭ 24 (-4%)
Mutual labels:  homomorphic-encryption
Awesome-Federated-Learning-on-Graph-and-GNN-papers
Federated learning on graphs, especially with graph neural networks (GNNs), knowledge graphs, and private GNNs.
Stars: ✭ 206 (+724%)
Mutual labels:  federated-learning
DeML-Golem
Proof Of Concept of DEcentralised Machine Learning on top of the Golem (https://golem.network/) architecture
Stars: ✭ 35 (+40%)
Mutual labels:  federated-learning
FedDA
Source code for 'Dual Attention Based FL for Wireless Traffic Prediction'
Stars: ✭ 41 (+64%)
Mutual labels:  federated-learning


federated is the source code for the Bachelor's Thesis

Privacy-Preserving Federated Learning Applied to Decentralized Data (Spring 2021, NTNU)

Federated learning (also known as collaborative learning) is a machine learning technique that trains an algorithm across multiple decentralized edge devices or servers holding local data samples, without exchanging them. In this project, the decentralized data is the MIT-BIH Arrhythmia Database.
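To make this concrete, here is a minimal, illustrative sketch of the weighted averaging at the heart of FedAvg (a conceptual example, not the project's actual training code): each client trains locally and sends only its model weights, which the server averages in proportion to each client's number of local samples.

import numpy as np

def federated_average(client_weights, client_sizes):
    # Combine per-client model weights (one list of layer arrays per client)
    # into a global model, weighting each client by its number of samples.
    # Raw data never leaves the clients; only the weights are shared.
    total = sum(client_sizes)
    return [
        sum(w[layer] * (n / total) for w, n in zip(client_weights, client_sizes))
        for layer in range(len(client_weights[0]))
    ]

# Two clients with a one-layer "model" and different amounts of local data.
client_a = [np.array([1.0, 2.0])]
client_b = [np.array([3.0, 4.0])]
print(federated_average([client_a, client_b], client_sizes=[30, 10]))  # [array([1.5, 2.5])]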

Table of Contents

  • Features
  • Installation
  • Running experiments with federated
  • Analyzing experiments with federated
  • Documentation
  • Tests
  • How to Contribute
  • Owners

Features

  • ML pipelines using centralized learning or federated learning.
  • Support for the following aggregation methods:
    • Federated Stochastic Gradient Descent (FedSGD)
    • Federated Averaging (FedAvg)
    • Differentially-Private Federated Averaging (DP-FedAvg); see the sketch after this list
    • Federated Averaging with Homomorphic Encryption
    • Robust Federated Aggregation (RFA)
  • Support for the following models:
    • A simple softmax regressor
    • A feed-forward neural network (ANN)
    • A convolutional neural network (CNN)
  • Model compression in federated learning.
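As a taste of how the privacy-preserving aggregation works, here is a minimal sketch of the clip-and-noise step underlying DP-FedAvg: each client update is clipped to a maximum L2 norm, and Gaussian noise calibrated to that norm is added to the aggregate. The function and hyperparameter names (clip_norm, noise_multiplier) are illustrative, not the project's actual API:

import numpy as np

def dp_fedavg_aggregate(client_updates, clip_norm=1.0, noise_multiplier=1.1, seed=0):
    # Clip every client update to at most `clip_norm` in L2 norm, so that no
    # single participant can dominate (or be re-identified from) the aggregate.
    rng = np.random.default_rng(seed)
    clipped = [u * min(1.0, clip_norm / max(np.linalg.norm(u), 1e-12)) for u in client_updates]
    # Sum the clipped updates, add Gaussian noise scaled by the clip norm, and average.
    noisy_sum = np.sum(clipped, axis=0) + rng.normal(0.0, noise_multiplier * clip_norm, size=clipped[0].shape)
    return noisy_sum / len(client_updates)

updates = [np.array([0.2, -0.4]), np.array([2.0, 1.0])]  # the second update gets clipped
print(dp_fedavg_aggregate(updates))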

Installation

Prerequisites

federated requires Python 3.8, pip, make, and virtualenv. The steps below show how to install them on Ubuntu and macOS.

Initial Setup

1. Cloning federated

$ git clone https://github.com/dilawarm/federated.git
$ cd federated

2. Getting the Dataset

To download the MIT-BIH Arrhythmia Database used in this project, go to https://www.kaggle.com/shayanfazeli/heartbeat and download the files

  • mitbih_train.csv
  • mitbih_test.csv

Then run:

$ mkdir -p data/mitbih

and move the downloaded files into the data/mitbih folder.
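Optionally, sanity-check the download with a few lines of Python (illustrative; this assumes the Kaggle CSVs have no header row and store 187 ECG samples per row, followed by a class label in the last column):

import pandas as pd

# Load the training split and separate signals from labels.
train = pd.read_csv("data/mitbih/mitbih_train.csv", header=None)
X, y = train.iloc[:, :-1].values, train.iloc[:, -1].values
print(X.shape, y.shape)  # e.g. (87554, 187) (87554,)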

Installing federated locally

1. Install the Python development environment

On Ubuntu:

$ sudo apt update
$ sudo apt install python3-dev python3-pip  # Python 3.8
$ sudo apt install build-essential          # make
$ pip3 install --user --upgrade virtualenv

On macOS:

$ /usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
$ export PATH="/usr/local/bin:/usr/local/sbin:$PATH"
$ brew update
$ brew install python  # Python 3.8
$ brew install make    # make
$ pip3 install --user --upgrade virtualenv

2. Create a virtual environment

$ virtualenv --python python3 "venv"
$ source "venv/bin/activate"
(venv) $ pip install --upgrade pip

3. Install the dependencies

(venv) $ make install

4. Test TensorFlow Federated

(venv) $ python -c "import tensorflow_federated as tff; print(tff.federated_computation(lambda: 'Hello World')())"
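If TensorFlow Federated has been installed correctly, this command prints b'Hello World'.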

Installing with Docker (optional)

Build and run image from Dockerfile

$ make docker

Running experiments with federated

federated has a client program, where one can initialize the different pipelines and train models with centralized or federated learning. To see the available options for this client program, run:

(venv) $ make help

This will display a list of options:

usage: python -m federated.main [-h] -l  -n  [-e] [-op] [-b] [-o] -m  [-lr]

Experimentation pipeline for federated 🚀

optional arguments:
  -b , --batch_size     The batch size. (default: 32)
  -e , --epochs         Number of global epochs. (default: 15)
  -h, --help            show this help message and exit
  -l , --learning_approach
                        Learning approach (centralized, federated). (default: None)
  -lr , --learning_rate
                        Learning rate for server optimizer. (default: 1.0)
  -m , --model          The model to be trained with the learning approach (ann, softmax_regression, cnn). (default: None)
  -n , --experiment_name
                        The name of the experiment. (default: None)
  -o , --output         Path to the output folder where the experiment is going to be saved. (default: history)
  -op , --optimizer     Server optimizer (adam, sgd). (default: sgd)

Here is an example of how to train a cnn model with federated learning for 10 global epochs, using the SGD server optimizer with a learning rate of 0.01:

(venv) $ python -m federated.main --learning_approach federated --model cnn --epochs 10 --optimizer sgd --learning_rate 0.01 --experiment_name experiment_name --output path/to/experiments

Running the command above will display a series of input fields where one can fill in more information about the training configuration, such as the aggregation method and whether differential privacy should be used. Once all training configurations have been set, the pipeline is initialized. All logs and training configurations are stored in the folder path/to/experiments/logdir/experiment_name.

Analyzing experiments with federated

TensorBoard

To analyze the results with TensorBoard:

(venv) $ tensorboard --logdir=path/to/experiments/logdir/experiment_name --port=6060
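TensorBoard will then serve the results at http://localhost:6060 (the port set with --port).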

Jupyter Notebook

To analyze the results in the ModelAnalysis notebook, open the notebook with your editor. For example:

(venv) $ code notebooks/ModelAnalysis.ipynb

Replace the first line in this notebook with the absolute path to your experiment folder, and run the notebook to see the results.
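For illustration, that first line could look something like this (the actual variable name depends on the notebook; the path below is a placeholder):

experiment_path = "/absolute/path/to/experiments/logdir/experiment_name"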

Documentation

The documentation can be found here.

To generate the documentation locally:

(venv) $ cd docs
(venv) $ make html
(venv) $ firefox _build/html/index.html

Tests

The unit tests included in federated are:

  • Tests for data preprocessing
  • Tests for different machine learning models
  • Tests for the training loops
  • Tests for the different privacy algorithms, such as RFA

To run all the tests:

(venv) $ make tests

To generate coverage after running the tests:

(venv) $ coverage html
(venv) $ firefox htmlcov/index.html

See the Makefile for more commands to test the modules in federated separately.

How to Contribute

  1. Clone the repo and create a new branch:

$ git clone https://github.com/dilawarm/federated.git
$ cd federated
$ git checkout -b name_for_new_branch

  2. Make changes and test.
  3. Submit a Pull Request with a comprehensive description of the changes.

Owners

Pernille Kopperud and Dilawar Mahmood

Enjoy! 🙂
