
Minki-Kim95 / Federated-Learning-and-Split-Learning-with-raspberry-pi

License: MIT
SRDS 2020: End-to-End Evaluation of Federated Learning and Split Learning for Internet of Things

Programming Languages

Jupyter Notebook
11667 projects
Python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Federated-Learning-and-Split-Learning-with-raspberry-pi

PFL-Non-IID
The Non-IID phenomenon originates in user personalization: each user generates their own Non-IID data. Since Non-IID (Not Independent and Identically Distributed) issues are inherent to the federated learning setting, a myriad of approaches have been proposed to crack this hard nut. In contrast, personalized federated learning may take advantage…
Stars: ✭ 58 (+7.41%)
Mutual labels:  distributed-computing, federated-learning
easyFL
An experimental platform for quickly implementing and comparing popular centralized federated learning algorithms. An implementation of a fairness-oriented federated learning algorithm (FedFV, Federated Learning with Fair Averaging, https://fanxlxmu.github.io/publication/ijcai2021/) was accepted by IJCAI-21 (https://www.ijcai.org/proceedings/2021/223).
Stars: ✭ 104 (+92.59%)
Mutual labels:  distributed-computing, federated-learning
Awesome-Federated-Machine-Learning
Everything about federated learning, including research papers, books, code, tutorials, videos, and beyond
Stars: ✭ 190 (+251.85%)
Mutual labels:  distributed-computing, federated-learning
gordo
An API-first distributed deployment system for deep learning models that use time-series data to predict the behaviour of systems
Stars: ✭ 25 (-53.7%)
Mutual labels:  distributed-computing
hydra-hpp
Hydra Hot Potato Player (game)
Stars: ✭ 12 (-77.78%)
Mutual labels:  distributed-computing
Archived-SANSA-Query
SANSA Query Layer
Stars: ✭ 31 (-42.59%)
Mutual labels:  distributed-computing
good-karma-kit
😇 A Docker Compose bundle to run on servers with spare CPU, RAM, disk, and bandwidth to help the world. Includes Tor, ArchiveWarrior, BOINC, and more...
Stars: ✭ 238 (+340.74%)
Mutual labels:  distributed-computing
PyAriesFL
Federated Learning on HyperLedger Aries
Stars: ✭ 19 (-64.81%)
Mutual labels:  federated-learning
marsjs
Label images from Unsplash in the browser, using MobileNet on TensorFlow.js
Stars: ✭ 53 (-1.85%)
Mutual labels:  distributed-computing
PyVertical
Privacy Preserving Vertical Federated Learning
Stars: ✭ 133 (+146.3%)
Mutual labels:  federated-learning
tutorial
Tutorials to help you build your first Swim app
Stars: ✭ 27 (-50%)
Mutual labels:  distributed-computing
distributed-learning-contributivity
Simulate collaborative ML scenarios, experiment with multi-partner learning approaches, and measure the respective contributions of different datasets to model performance.
Stars: ✭ 49 (-9.26%)
Mutual labels:  federated-learning
srijan-gsoc-2020
Healthcare-Researcher-Connector Package: a federated learning tool for bridging the gap between healthcare providers and researchers
Stars: ✭ 17 (-68.52%)
Mutual labels:  federated-learning
machinaris
An easy-to-use WebUI for crypto plotting and farming. Offers Plotman, MadMax, Chiadog, Bladebit, Farmr, and Forktools in a Docker container. Supports Chia, MMX, Chives, Flax, HDDCoin, and BPX among others.
Stars: ✭ 324 (+500%)
Mutual labels:  distributed-computing
dask-pytorch-ddp
dask-pytorch-ddp is a Python package that makes it easy to train PyTorch models on dask clusters using distributed data parallel.
Stars: ✭ 50 (-7.41%)
Mutual labels:  distributed-computing
ParallelUtilities.jl
Fast and easy parallel mapreduce on HPC clusters
Stars: ✭ 28 (-48.15%)
Mutual labels:  distributed-computing
high-assurance-legacy
Legacy code connected to the high-assurance implementation of the Ouroboros protocol family
Stars: ✭ 81 (+50%)
Mutual labels:  distributed-computing
dcf
Yet another distributed compute framework
Stars: ✭ 48 (-11.11%)
Mutual labels:  distributed-computing
flPapers
A collection of federated learning papers from conferences and journals (2019 to 2021), including accepted papers, hot topics, notable research groups, and paper summaries.
Stars: ✭ 76 (+40.74%)
Mutual labels:  federated-learning
plinycompute
A system for the development of high-performance, data-intensive distributed computing applications, tools, and libraries.
Stars: ✭ 27 (-50%)
Mutual labels:  distributed-computing

Federated Learning and Split Learning with Raspberry Pi

This repository releases the source code for the SRDS 2020 paper "End-to-End Evaluation of Federated Learning and Split Learning for Internet of Things".

If you find this code useful and use it in a publication, please kindly cite our work as:

@inproceedings{gao2020end,
  title={End-to-End Evaluation of Federated Learning and Split Learning for Internet of Things},
  author={Gao, Yansong and Kim, Minki and Abuadbba, Sharif and Kim, Yeonjae and Thapa, Chandra and Kim, Kyuyeon and Camtepe, Seyit A and Kim, Hyoungshick and Nepal, Surya},
  booktitle={The 39th International Symposium on Reliable Distributed Systems (SRDS)},
  year={2020}
}

Description

This repository contains implementations of several distributed machine learning approaches: federated learning, split learning, and ensemble learning.
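
For readers new to the approach, federated learning repeatedly merges locally trained client models into one global model. The sketch below shows a FedAvg-style averaging step in PyTorch; the function name and the equal-weight averaging are illustrative assumptions, not code taken from this repository.

import copy
import torch

def federated_average(client_state_dicts):
    # Average the parameters of several locally trained client models
    # (FedAvg with equal client weights). Returns the new global state_dict.
    global_state = copy.deepcopy(client_state_dicts[0])
    for key in global_state:
        # Stack the same tensor from every client and take the element-wise mean.
        stacked = torch.stack([sd[key].float() for sd in client_state_dicts])
        global_state[key] = stacked.mean(dim=0)
    return global_state

The server would apply such a step after each round and send the averaged weights back to every client.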

Requirements (Desktop)

  • Python==3.6
  • PyTorch==1.5.1

Requirements (Raspberry Pi 3)

  • Python==3.7
  • PyTorch==1.0.0

Repository summary

  • models directory: contains pre-processed training/testing data from the MIT-BIH arrhythmia ECG database in HDF5 format. You can also place other preprocessed train and test data here.
  • federated_learning directory: source code for federated learning in .ipynb and .py formats
  • split_learning directory: source code for split learning in .ipynb and .py formats (see the sketch after this list)
  • ensemble_learning directory: source code for ensemble learning in .ipynb and .py formats
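
As a rough illustration of what the split learning code does conceptually: the client runs the network up to a cut layer, sends the intermediate ("smashed") activations to the server, and the server finishes the forward and backward passes before returning the gradient at the cut. The layer sizes, cut point, and in-process hand-off below are illustrative assumptions, not the repository's actual architecture or transport:

import torch
import torch.nn as nn

# Hypothetical split of a small 1-D CNN at a cut layer.
client_model = nn.Sequential(nn.Conv1d(1, 16, kernel_size=7), nn.ReLU())  # on the device
server_model = nn.Sequential(nn.Flatten(), nn.Linear(16 * 122, 5))        # on the server

x = torch.randn(8, 1, 128)        # a batch of 1-D signals (e.g. ECG windows)
smashed = client_model(x)         # client forward pass up to the cut layer

# In the real system the activations travel over a socket; here we simply
# hand them to the server model in-process.
server_in = smashed.detach().requires_grad_()
logits = server_model(server_in)
loss = nn.CrossEntropyLoss()(logits, torch.randint(0, 5, (8,)))
loss.backward()                   # server backprop down to the cut layer
smashed.backward(server_in.grad)  # client finishes backprop with the returned gradient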

How to use

1. Run client on desktop

Use the ~client.ipynb file.

2. Run client on Raspberry Pi

Use the ~client_rasp.ipynb or ~client_rasp.py file. When you run these files, you can also see the Raspberry Pi's temperature and memory usage.
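
The repository's own monitoring code is not reproduced here, but on Raspberry Pi OS the temperature and memory usage can be read through standard kernel interfaces, roughly as in this sketch:

def read_cpu_temperature():
    # Standard sysfs thermal interface on the Raspberry Pi (millidegrees Celsius).
    with open("/sys/class/thermal/thermal_zone0/temp") as f:
        return int(f.read().strip()) / 1000.0

def read_memory_used_mb():
    # Parse /proc/meminfo for total and available memory (reported in kB).
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":", 1)
            info[key] = int(value.strip().split()[0])
    return (info["MemTotal"] - info["MemAvailable"]) / 1024.0

print(f"CPU temp: {read_cpu_temperature():.1f} C, memory used: {read_memory_used_mb():.0f} MB")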

Overall process

Set hyperparameters

  • set the users variable in both the server and client files
  • set the rounds, local_epoch, or epochs training variables (see the example after this list)
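
As a hedged example, the settings above might look like the following at the top of the server and client scripts; the variable names follow this README, while the values and the learning rate are placeholders:

# Hyperparameters; users must match between the server and every client.
users = 2        # number of participating clients
rounds = 100     # global communication rounds (federated learning)
local_epoch = 1  # local training epochs per round on each client
epochs = 100     # total training epochs (split/ensemble variants)
lr = 0.001       # hypothetical learning rate, not named in this README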

Running code

  • Run the server code first
  • After the server is running, run the clients

Input information

  • When you run the server, it prints the server's IP address
  • When you run a client, enter the client's order and the server's IP address
  • If there is no problem, training will start (a rough sketch of this handshake follows the list)
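
The exact networking code lives in the notebooks, but the handshake described above plausibly follows a standard socket pattern like the sketch below; the port number and the prompts are assumptions for illustration:

import socket

PORT = 10080  # hypothetical port; the repository may use a different one

def run_server():
    # Server side: bind, print the IP address that clients should enter, accept clients.
    host = socket.gethostbyname(socket.gethostname())
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind((host, PORT))
    server.listen(5)
    print(f"Server listening on {host}:{PORT}")
    conn, addr = server.accept()
    print(f"Client connected from {addr}")

def run_client():
    # Client side: enter this client's order (index) and the server's IP address.
    order = int(input("Enter client order: "))
    server_ip = input("Enter server IP address: ")
    client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    client.connect((server_ip, PORT))
    print(f"Client {order} connected to {server_ip}")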

Project members

Gao Yansong, Kim Minki, Abuadbba Sharif, Kim Yeonjae, Thapa Chandra, Kim Kyuyeon, Camtepe Seyit A, Kim Hyoungshick, Nepal Surya
