# flower

**Repository Path**: frontxiang/flower

## Basic Information

- **Project Name**: flower
- **Description**: No description available
- **Primary Language**: Unknown
- **License**: Apache-2.0
- **Default Branch**: main
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2021-10-04
- **Last Updated**: 2022-04-03

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README

# Flower - A Friendly Federated Learning Framework

Flower Website

Website | Blog | Docs | Conference | Slack

[![GitHub license](https://img.shields.io/github/license/adap/flower)](https://github.com/adap/flower/blob/main/LICENSE) [![PRs Welcome](https://img.shields.io/badge/PRs-welcome-brightgreen.svg)](https://github.com/adap/flower/blob/main/CONTRIBUTING.md) ![Build](https://github.com/adap/flower/workflows/Build/badge.svg) ![Downloads](https://pepy.tech/badge/flwr) [![Slack](https://img.shields.io/badge/Chat-Slack-red)](https://flower.dev/join-slack)

Flower (`flwr`) is a framework for building federated learning systems. The design of Flower is based on a few guiding principles:

* **Customizable**: Federated learning systems vary wildly from one use case to another. Flower allows for a wide range of different configurations depending on the needs of each individual use case.
* **Extendable**: Flower originated from a research project at the University of Oxford, so it was built with AI research in mind. Many components can be extended and overridden to build new state-of-the-art systems.
* **Framework-agnostic**: Different machine learning frameworks have different strengths. Flower can be used with any machine learning framework, for example, [PyTorch](https://pytorch.org), [TensorFlow](https://tensorflow.org), [Hugging Face Transformers](https://huggingface.co/), [PyTorch Lightning](https://pytorchlightning.ai/), [MXNet](https://mxnet.apache.org/), [scikit-learn](https://scikit-learn.org/), [JAX](https://jax.readthedocs.io/), [TFLite](https://tensorflow.org/lite/), or even raw [NumPy](https://numpy.org/) for users who enjoy computing gradients by hand.
* **Understandable**: Flower is written with maintainability in mind. The community is encouraged to both read and contribute to the codebase.

Meet the Flower community on [flower.dev](https://flower.dev)!
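To make the federated setup concrete, here is a dependency-free sketch of the loop a Flower deployment orchestrates: each client trains on data that never leaves the device, and the server aggregates the resulting weights. This is an illustration only; the names `ToyClient` and `fedavg` are invented for this sketch, and in real Flower code a client subclasses `flwr.client.NumPyClient` while the `flwr` server drives the rounds.

```python
class ToyClient:
    """One simulated participant holding private (x, y) data."""

    def __init__(self, data):
        self.data = data  # local examples, never sent to the server

    def fit(self, w):
        """Receive the global weight, take one local gradient step on a
        1-D least-squares model y ~ w * x, and return (new_w, num_examples)."""
        grad = sum(2 * x * (w * x - y) for x, y in self.data) / len(self.data)
        return w - 0.05 * grad, len(self.data)


def fedavg(updates):
    """Server-side federated averaging: combine client weights,
    weighted by how many examples each client trained on."""
    total = sum(n for _, n in updates)
    return sum(w * n for w, n in updates) / total


# Two clients whose data both follow y = 2x; run 30 federated rounds.
clients = [ToyClient([(1.0, 2.0), (2.0, 4.0)]), ToyClient([(3.0, 6.0)])]
w = 0.0
for _ in range(30):
    w = fedavg([c.fit(w) for c in clients])
# w converges toward 2.0 without any client sharing raw data.
```

Only model weights and example counts cross the network; that separation is what the customizable strategies and client abstractions in Flower formalize.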
## Documentation

[Flower Docs](https://flower.dev/docs):

* [Installation](https://flower.dev/docs/installation.html)
* [Quickstart (TensorFlow)](https://flower.dev/docs/quickstart_tensorflow.html)
* [Quickstart (PyTorch)](https://flower.dev/docs/quickstart_pytorch.html)
* [Quickstart (Hugging Face [code example])](https://flower.dev/docs/quickstart_huggingface.html)
* [Quickstart (PyTorch Lightning [code example])](https://flower.dev/docs/quickstart_pytorch_lightning.html)
* [Quickstart (MXNet)](https://flower.dev/docs/example-mxnet-walk-through.html)
* [Quickstart (JAX [code example])](https://github.com/adap/flower/tree/main/examples/jax_from_centralized_to_federated)
* [Quickstart (scikit-learn)](https://github.com/adap/flower/tree/main/examples/sklearn-logreg-mnist)
* [Quickstart (TFLite on Android [code example])](https://github.com/adap/flower/tree/main/examples/android)

## Flower Baselines

Flower Baselines is a collection of community-contributed experiments that reproduce the experiments performed in popular federated learning publications. Researchers can build on Flower Baselines to quickly evaluate new ideas:

* [FedBN: Federated Learning on non-IID Features via Local Batch Normalization](https://arxiv.org/pdf/2102.07623.pdf):
  * [Convergence Rate](https://github.com/adap/flower/tree/main/baselines/flwr_baselines/publications/fedbn/convergence_rate)
* [Adaptive Federated Optimization](https://arxiv.org/pdf/2003.00295.pdf)
  * [CIFAR-10/100](https://github.com/adap/flower/tree/main/baselines/flwr_baselines/publications/adaptive_federated_optimization)

Check the Flower documentation to learn more: [Using Baselines](https://flower.dev/docs/using-baselines.html)

The Flower community loves contributions!
Make your work more visible and enable others to build on it by contributing it as a baseline: [Contributing Baselines](https://flower.dev/docs/contributing-baselines.html)

## Flower Usage Examples

A number of examples show different usage scenarios of Flower (in combination with popular machine learning frameworks such as PyTorch or TensorFlow). To run an example, first install the necessary extras: [Usage Examples Documentation](https://flower.dev/docs/examples.html)

Quickstart examples:

* [Quickstart (TensorFlow)](https://github.com/adap/flower/tree/main/examples/quickstart_tensorflow)
* [Quickstart (PyTorch)](https://github.com/adap/flower/tree/main/examples/quickstart_pytorch)
* [Quickstart (Hugging Face)](https://github.com/adap/flower/tree/main/examples/quickstart_huggingface)
* [Quickstart (PyTorch Lightning)](https://github.com/adap/flower/tree/main/examples/quickstart_pytorch_lightning)
* [Quickstart (MXNet)](https://github.com/adap/flower/tree/main/examples/quickstart_mxnet)
* [Quickstart (scikit-learn)](https://github.com/adap/flower/tree/main/examples/sklearn-logreg-mnist)
* [Quickstart (TFLite on Android)](https://github.com/adap/flower/tree/main/examples/android)

Other [examples](https://github.com/adap/flower/tree/main/examples):

* [Raspberry Pi & Nvidia Jetson Tutorial](https://github.com/adap/flower/tree/main/examples/embedded_devices)
* [Android & TFLite](https://github.com/adap/flower/tree/main/examples/android)
* [PyTorch: From Centralized to Federated](https://github.com/adap/flower/tree/main/examples/pytorch_from_centralized_to_federated)
* [MXNet: From Centralized to Federated](https://github.com/adap/flower/tree/main/examples/mxnet_from_centralized_to_federated)
* [JAX: From Centralized to Federated](https://github.com/adap/flower/tree/main/examples/jax_from_centralized_to_federated)
* [Advanced Flower with TensorFlow/Keras](https://github.com/adap/flower/tree/main/examples/advanced_tensorflow)
* [Single-Machine Simulation of Federated Learning Systems](https://github.com/adap/flower/tree/main/examples/simulation)

## Community

Flower is built by a wonderful community of researchers and engineers. [Join Slack](https://flower.dev/join-slack) to meet them; [contributions](#contributing-to-flower) are welcome.

## Citation

If you publish work that uses Flower, please cite Flower as follows:

```bibtex
@article{beutel2020flower,
  title={Flower: A Friendly Federated Learning Research Framework},
  author={Beutel, Daniel J and Topal, Taner and Mathur, Akhil and Qiu, Xinchi and Parcollet, Titouan and Lane, Nicholas D},
  journal={arXiv preprint arXiv:2007.14390},
  year={2020}
}
```

Please also consider adding your publication to the list of Flower-based publications in the docs; just open a Pull Request.

## Contributing to Flower

We welcome contributions. Please see [CONTRIBUTING.md](CONTRIBUTING.md) to get started!