chuanting / FedDA

Licence: other
Source code for 'Dual Attention Based FL for Wireless Traffic Prediction'

Programming Languages

python
139335 projects - #7 most used programming language
shell
77523 projects

Projects that are alternatives of or similar to FedDA

substra
Substra is a framework for traceable ML orchestration on decentralized sensitive data.
Stars: ✭ 143 (+248.78%)
Mutual labels:  federated-learning
Federated-Learning-Mini-Framework
Federated Learning mini-framework with Keras
Stars: ✭ 38 (-7.32%)
Mutual labels:  federated-learning
Challenge
The repo for the FeTS Challenge
Stars: ✭ 21 (-48.78%)
Mutual labels:  federated-learning
FedReID
Implementation of Federated Learning to Person Re-identification (Code for ACMMM 2020 paper)
Stars: ✭ 68 (+65.85%)
Mutual labels:  federated-learning
FedLab-benchmarks
Standard federated learning implementations in FedLab and FL benchmarks.
Stars: ✭ 49 (+19.51%)
Mutual labels:  federated-learning
FATE-Serving
A scalable, high-performance serving system for federated learning models
Stars: ✭ 107 (+160.98%)
Mutual labels:  federated-learning
GrouProx
FedGroup, a clustered federated learning framework based on TensorFlow
Stars: ✭ 20 (-51.22%)
Mutual labels:  federated-learning
Awesome Mlops
A curated list of references for MLOps
Stars: ✭ 7,119 (+17263.41%)
Mutual labels:  federated-learning
PFL-Non-IID
The origin of the Non-IID phenomenon is the personalization of users, who generate the Non-IID data. With Non-IID (Not Independent and Identically Distributed) issues existing in the federated learning setting, a myriad of approaches has been proposed to crack this hard nut. In contrast, the personalized federated learning may take the advantage…
Stars: ✭ 58 (+41.46%)
Mutual labels:  federated-learning
federated-xgboost
Federated gradient boosted decision tree learning
Stars: ✭ 39 (-4.88%)
Mutual labels:  federated-learning
MOON
Model-Contrastive Federated Learning (CVPR 2021)
Stars: ✭ 93 (+126.83%)
Mutual labels:  federated-learning
backdoors101
Backdoors Framework for Deep Learning and Federated Learning. A light-weight tool to conduct your research on backdoors.
Stars: ✭ 181 (+341.46%)
Mutual labels:  federated-learning
Awesome-Federated-Machine-Learning
Everything about federated learning, including research papers, books, codes, tutorials, videos and beyond
Stars: ✭ 190 (+363.41%)
Mutual labels:  federated-learning
fedpa
Federated posterior averaging implemented in JAX
Stars: ✭ 38 (-7.32%)
Mutual labels:  federated-learning
baai-federated-learning-crane-baseline
Electric Power AI Data Competition: hydraulic crane object detection track
Stars: ✭ 17 (-58.54%)
Mutual labels:  federated-learning
pFedMe
Personalized Federated Learning with Moreau Envelopes (pFedMe) using Pytorch (NeurIPS 2020)
Stars: ✭ 196 (+378.05%)
Mutual labels:  federated-learning
easyFL
An experimental platform to quickly realize and compare with popular centralized federated learning algorithms. A realization of federated learning algorithm on fairness (FedFV, Federated Learning with Fair Averaging, https://fanxlxmu.github.io/publication/ijcai2021/) was accepted by IJCAI-21 (https://www.ijcai.org/proceedings/2021/223).
Stars: ✭ 104 (+153.66%)
Mutual labels:  federated-learning
Pysyft
A library for answering questions using data you cannot see
Stars: ✭ 7,811 (+18951.22%)
Mutual labels:  federated-learning
Fate
An Industrial Grade Federated Learning Framework
Stars: ✭ 3,775 (+9107.32%)
Mutual labels:  federated-learning
Front-End
Federated Learning based Deep Learning. Docs: https://fets-ai.github.io/Front-End/
Stars: ✭ 35 (-14.63%)
Mutual labels:  federated-learning

Introduction

This is the source code for our paper entitled 'Dual Attention-Based Federated Learning for Wireless Traffic Prediction'.

Datasets

Two real-world datasets are used: the first is Milano and the second is Trentino.
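
For illustration, the sketch below shows one plausible way to turn a per-cell traffic series from either dataset into (history window, next step) training pairs. The file name, column name, and window length are assumptions and do not reflect the repository's actual preprocessing.

```python
# Hypothetical preprocessing sketch (not the repository's pipeline):
# slice one cell's traffic series into sliding windows for prediction.
import numpy as np
import pandas as pd

def make_windows(series: np.ndarray, history: int = 24):
    """Build (history, target) pairs from a 1-D traffic series."""
    x, y = [], []
    for t in range(len(series) - history):
        x.append(series[t:t + history])
        y.append(series[t + history])
    return np.asarray(x), np.asarray(y)

# Assumed CSV layout with one 'traffic' column per time step.
df = pd.read_csv("milano_traffic.csv")
X, y = make_windows(df["traffic"].to_numpy(dtype=np.float32), history=24)
print(X.shape, y.shape)   # e.g. (N, 24) and (N,)
```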

Runnable scripts

You can also find a runnable version of the code for our paper on Codeocean.

Abstract

Wireless traffic prediction is essential for cellular networks to realize intelligent network operations, such as load-aware resource management and predictive control. Existing prediction approaches usually adopt centralized training architectures and require transferring huge amounts of traffic data, which may raise delay and privacy concerns in certain scenarios. In this work, we propose a novel wireless traffic prediction framework named Dual Attention-Based Federated Learning (FedDA), by which a high-quality prediction model is trained collaboratively by multiple edge clients. To simultaneously capture the various wireless traffic patterns and keep raw data locally, FedDA first groups the clients into different clusters using a small augmentation dataset. Then, a quasi-global model is trained and shared among clients as prior knowledge, aiming to solve the statistical heterogeneity challenge faced in federated learning. To construct the global model, a dual attention scheme is further proposed that aggregates the intra- and inter-cluster models, instead of simply averaging the weights of local models. We conduct extensive experiments on two real-world wireless traffic datasets, and the results show that FedDA outperforms state-of-the-art methods. The average mean squared error performance gains on the two datasets are up to 10% and 30%, respectively.
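
To make the aggregation idea concrete, here is a minimal sketch of attention-weighted model averaging against a quasi-global reference model. The scoring function, temperature, and variable names are assumptions for illustration and do not reproduce the paper's exact dual attention scheme.

```python
# Minimal sketch of attention-weighted aggregation toward a quasi-global model.
# The scoring function and names are illustrative assumptions.
import torch

def attentive_aggregate(local_states, quasi_global_state, temperature=1.0):
    """Aggregate client state_dicts, weighting each client by its similarity
    to the quasi-global reference model."""
    aggregated = {}
    for key, ref in quasi_global_state.items():
        locals_k = [s[key].float() for s in local_states]
        # Attention score: negative distance to the quasi-global parameters.
        dists = torch.stack([torch.norm(w - ref.float()) for w in locals_k])
        weights = torch.softmax(-dists / temperature, dim=0)
        # Attention-weighted average of the local parameters.
        aggregated[key] = sum(w * t for w, t in zip(weights, locals_k))
    return aggregated
```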

System Model

FedDA system model

Update Rules

Federated Optimization
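
As a rough sketch of the federated optimization loop, one communication round can be written as local SGD on each client followed by server-side aggregation. The model, data loaders, and hyper-parameters below are placeholders; in FedDA the plain averaging step would be replaced by the attention-based aggregation sketched earlier.

```python
# Simplified federated optimization round: local SGD on each client, then
# server-side aggregation. Model, data loaders, and hyper-parameters are
# placeholders, not the repository's actual training configuration.
import copy
import torch

def local_update(global_state, model, loader, lr=0.01, epochs=1):
    """One client's local training, starting from the shared global weights."""
    model.load_state_dict(global_state)
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = torch.nn.MSELoss()
    for _ in range(epochs):
        for x, y in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            optimizer.step()
    return copy.deepcopy(model.state_dict())

def fedavg(client_states):
    """Plain parameter averaging, the baseline that FedDA improves upon."""
    avg = copy.deepcopy(client_states[0])
    for key in avg:
        avg[key] = torch.stack([s[key].float() for s in client_states]).mean(dim=0)
    return avg
```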

Paper

If you find the code useful, please cite our paper in your work as follows:

C. Zhang, S. Dang, B. Shihada and M.-S. Alouini, "Dual Attention-Based Federated Learning for Wireless Traffic Prediction," IEEE INFOCOM 2021 - IEEE Conference on Computer Communications, 2021, pp. 1-10.