Publication record · 18.cifr/2016.mcmahan.federated-averaging
Modern mobile devices have access to a wealth of data suitable for learning models, which in turn can greatly improve the user experience on the device. We advocate an alternative to centralized data-center training that leaves the training data distributed on the mobile devices and learns a shared model by aggregating locally-computed updates. We term this decentralized approach Federated Learning.
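The aggregation step described above can be sketched as a size-weighted average of client parameter vectors. This is a minimal illustration, not the paper's full algorithm (which also covers client sampling and multiple local SGD epochs); the function and variable names here are hypothetical.

```python
import numpy as np

def federated_averaging(client_weights, client_sizes):
    """Aggregate locally-computed model parameters into a shared model.

    client_weights: list of 1-D numpy arrays, one parameter vector per client.
    client_sizes:   local training-set size per client; larger clients
                    contribute proportionally more to the average.
    """
    total = sum(client_sizes)
    stacked = np.stack(client_weights)               # (n_clients, n_params)
    coeffs = np.array(client_sizes, dtype=float) / total
    return coeffs @ stacked                          # weighted sum over clients

# Hypothetical round: three clients with differing data volumes.
updates = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])]
sizes = [10, 10, 20]
global_model = federated_averaging(updates, sizes)   # -> [3.5, 4.5]
```

The raw data never leaves the clients; only the parameter vectors (or deltas) are communicated to the server, which is the core of the decentralized approach.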
Convergence guarantees under non-IID data remain informal and require rigorous theoretical treatment. Byzantine-robust aggregation and personalization to individual client distributions are natural extensions that the authors identify as open problems.
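One standard starting point for the Byzantine-robust extension mentioned above is to replace the weighted mean with a coordinate-wise median, which a minority of corrupted updates cannot drag arbitrarily far. This is a generic illustration of the idea, not a method from the paper:

```python
import numpy as np

def coordinate_median(client_weights):
    """Aggregate client updates by taking the median of each coordinate.

    Unlike a plain average, each output coordinate is bounded by honest
    clients' values as long as honest clients form a majority.
    """
    return np.median(np.stack(client_weights), axis=0)

# Two honest clients near [1, 1]; one Byzantine client sends garbage.
honest = [np.array([1.0, 1.1]), np.array([0.9, 1.0])]
byzantine = [np.array([1e6, -1e6])]
robust = coordinate_median(honest + byzantine)   # stays close to [1, 1]
```

A plain average over the same three updates would be dominated by the outlier, which is exactly the failure mode robust aggregation rules are designed to avoid.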