- the folder to store the data
- `data_preprocessing`: data preprocessing tools
- `model`: the folder to store the models (parent models) and training tools
- `standalone`: the folder to store the different algorithms
  - `client.py`: the functions executed on the client, such as training and pruning
  - `api.py`: the functions executed on the server, i.e. aggregation, together with the overall logic of the algorithm
  - `model_trainer.py`: the model class for the corresponding algorithm
- `utils`: the folder to store tools for logging, FLOPS computation, etc.
- the folder to store the core functions of FedML
- the main running interface for the different baselines, where the `.sh` files are stored
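To make the client/server split described above concrete, here is a minimal, self-contained sketch of what `client.py`-style masked local training and `api.py`-style aggregation do. This is illustrative only, not the repository's actual code; all function names, the magnitude-pruning criterion, and the plain averaging rule are simplifying assumptions.

```python
def prune_by_magnitude(weights, sparsity):
    """Binary mask that keeps the largest-magnitude entries of a flat
    weight list and zeroes out a `sparsity` fraction of them.
    (Hypothetical helper; the repo's pruning logic lives in client.py.)"""
    k = int(round(sparsity * len(weights)))  # number of entries to drop
    if k == 0:
        return [1.0] * len(weights)
    threshold = sorted(abs(w) for w in weights)[k - 1]
    return [1.0 if abs(w) > threshold else 0.0 for w in weights]

def client_update(weights, grads, mask, lr=0.1):
    """One masked SGD step: a stand-in for client-side local training."""
    return [(w - lr * g) * m for w, g, m in zip(weights, grads, mask)]

def aggregate(client_models):
    """Element-wise average of client models: a stand-in for the
    server-side aggregation performed in api.py."""
    n = len(client_models)
    return [sum(ws) / n for ws in zip(*client_models)]
```

For example, `prune_by_magnitude([0.5, -0.01, 2.0, 0.001], 0.5)` keeps only the two largest-magnitude weights, and `aggregate` averages whatever masked models the clients return.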
Replace the `work_dir` entry `/nfs/da-dpfl/` in `config.yaml` with the root of your project, `/your_path_to_project/`.
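After the replacement, the relevant line of `config.yaml` would look like the following (only the `work_dir` key is shown; the surrounding layout of the file is an assumption):

```yaml
# Hypothetical excerpt of config.yaml; all other keys omitted.
work_dir: /your_path_to_project/
```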
- `pip3 install -r requirements.txt`
- `sh setup_permission.sh`
- `sh /your_path_to_fedml/fedml_dadpfl/cifar10.sh`
- `sh /your_path_to_fedml/fedml_dispfl/cifar10.sh`
- https://github.com/diaoenmao/Pruning-Deep-Neural-Networks-from-a-Sparsity-Perspective
- https://github.com/rong-dai/DisPFL
- https://github.com/liboyue/beer
If you find this work useful, please consider citing:
@ARTICLE{11060892,
  author={Long, Qianyu and Wang, Qiyuan and Anagnostopoulos, Christos and Bi, Daning},
  journal={IEEE Transactions on Neural Networks and Learning Systems},
  title={Decentralized Personalized Federated Learning Based on a Conditional “Sparse-to-Sparser” Scheme},
  year={2025},
  volume={},
  number={},
  pages={1-15},
  keywords={Training;Costs;Computational modeling;Adaptation models;Convergence;Data models;Servers;Network topology;Topology;Federated learning;Decentralized federated learning (DFL);model pruning;personalized FL (PFL);sparsification},
  doi={10.1109/TNNLS.2025.3580277}}