Simulation framework for accelerating research in Private Federated Learning
License: Apache 2.0
# pfl: Python framework for Private Federated Learning simulations

Documentation website: https://apple.github.io/pfl-research
pfl is a Python framework developed at Apple to empower researchers to run efficient simulations with privacy-preserving federated learning (FL) and to disseminate their FL research results. Our team combines engineering and research expertise, and we encourage researchers to publish their papers, along with this code, with confidence.
The framework is *not* intended for third-party FL deployments, but the results of the simulations can be tremendously useful in actual FL deployments.
We hope that pfl will promote open research in FL and its effective dissemination.
pfl provides several useful features for FL research, along with flexible APIs to express these ideas. Results from the official benchmarks are maintained in this Weights & Biases report.
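To make the underlying computation concrete, here is a minimal, framework-free sketch of what a federated-averaging simulation computes: each client trains locally on its own data, and the server averages the resulting models. All names and constants here are illustrative, not part of pfl's API.

```python
# Minimal federated averaging on a 1-D linear model y = w * x.
# Illustrative only; pfl's real API handles models, datasets,
# privacy mechanisms, and distributed simulation for you.
import random

random.seed(0)

def local_update(weights, data, lr=0.1):
    """One epoch of SGD on a single client's data."""
    w = weights
    for x, y in data:
        grad = 2 * (w * x - y) * x  # gradient of (w*x - y)^2
        w -= lr * grad
    return w

def federated_averaging_round(global_w, client_datasets):
    """Each client starts from the global model; the server averages."""
    local_models = [local_update(global_w, d) for d in client_datasets]
    return sum(local_models) / len(local_models)

# Synthetic clients whose data follows y = 3*x plus a little noise.
clients = [[(x, 3 * x + random.gauss(0, 0.01)) for x in (0.1, 0.5, 1.0)]
           for _ in range(10)]

w = 0.0
for _ in range(20):
    w = federated_averaging_round(w, clients)
```

After 20 rounds, `w` converges close to the true slope of 3. Real simulations replace the toy model with neural networks and add privacy mechanisms on top of the aggregation step.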
Installation instructions can be found here.
pfl is available on PyPI, and a full installation can be done with pip:

```shell
pip install 'pfl[tf,pytorch,trees]'
```
To try out pfl immediately without installation, we provide several Colab notebooks for learning its different components hands-on. We also support MLX! But you have to run that notebook locally on your Apple silicon machine; see all available Jupyter notebooks here.
pfl aims to streamline the process of benchmarking and testing hypotheses in the federated learning paradigm. The official benchmarks are available in the benchmarks directory, using a variety of realistic dataset-model combinations, with and without differential privacy (yes, we do also have CIFAR10). Copying these examples is a great starting point for doing your own research. See the quickstart on how to start converging a model on the simplest benchmark (CIFAR10) in just a few minutes.
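The differentially private variants of these benchmarks rest on a standard aggregation step: clip each client's update to a fixed L2 norm, sum, and add calibrated Gaussian noise. A self-contained sketch of that step follows; the function names and constants are ours for illustration, not pfl defaults.

```python
# Illustrative central Gaussian mechanism for DP federated averaging:
# clip per-client updates, sum them, add Gaussian noise, then average.
import math
import random

random.seed(1)

def clip(update, max_norm):
    """Scale an update down so its L2 norm is at most max_norm."""
    norm = math.sqrt(sum(v * v for v in update))
    scale = min(1.0, max_norm / norm) if norm > 0 else 1.0
    return [v * scale for v in update]

def private_mean(updates, max_norm=1.0, noise_multiplier=1.0):
    """Average clipped updates, with Gaussian noise of standard
    deviation noise_multiplier * max_norm added to the sum."""
    clipped = [clip(u, max_norm) for u in updates]
    dim = len(updates[0])
    total = [sum(u[i] for u in clipped) for i in range(dim)]
    noisy = [t + random.gauss(0, noise_multiplier * max_norm) for t in total]
    return [x / len(updates) for x in noisy]

# 100 clients, each contributing a 4-dimensional update.
updates = [[random.uniform(-2, 2) for _ in range(4)] for _ in range(100)]
mean = private_mean(updates)
```

Clipping bounds each client's contribution (the sensitivity), which is what lets the added noise give a formal privacy guarantee; the noise's effect on the average shrinks as the cohort grows.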
Researchers are invited to contribute to the framework. Please see here for more details.
To cite pfl in your work:

```bibtex
@article{granqvist2024pfl,
  title={pfl-research: simulation framework for accelerating research in Private Federated Learning},
  author={Granqvist, Filip and Song, Congzheng and Cahill, {\'A}ine and van Dalen, Rogier and Pelikan, Martin and Chan, Yi Sheng and Feng, Xiaojun and Krishnaswami, Natarajan and Jina, Vojta and Chitnis, Mona},
  journal={arXiv preprint arXiv:2404.06430},
  year={2024},
}
```