With the advent of connected devices that have their own computation and storage, it's now possible to run machine learning workflows entirely on-device. Federated learning is an approach for training ML models across a fleet of participating devices without collecting their data in a central location. Federated learning improves on traditional, fully centralized approaches by reducing the costs and risks of sensitive data handling, working better in bandwidth- and power-constrained environments, and providing a straightforward, effective mechanism for personalization at scale. It also puts users back in control of their data while still enabling developers to build intelligent applications that leverage insights from that data.
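The workflow described above can be sketched in a few lines. This is a hypothetical, framework-free toy (a one-parameter linear model and a FedAvg-style averaging step are my illustrative assumptions, not code from the talk): each device computes an update from its own private data, and only the updates, never the raw examples, are sent back and averaged on the server.

```python
# Toy sketch of one federated learning round (illustrative assumption,
# not the speaker's implementation). Each device holds private (x, y) pairs.

def local_update(global_model, local_data, lr=0.1):
    """One gradient step on a single device's private data (toy linear model y = w*x)."""
    w = global_model
    grad = sum((w * x - y) * x for x, y in local_data) / len(local_data)
    return w - lr * grad

def federated_round(global_model, device_datasets):
    """Server broadcasts the model, then averages the per-device updates."""
    updates = [local_update(global_model, data) for data in device_datasets]
    return sum(updates) / len(updates)

# Example: three devices, each holding one private point from y = 2x.
devices = [[(1.0, 2.0)], [(2.0, 4.0)], [(3.0, 6.0)]]
w = 0.0
for _ in range(200):
    w = federated_round(w, devices)
print(round(w, 3))  # converges toward 2.0
```

Only the scalar updates cross the network; each device's raw data stays local, which is the core privacy property the abstract describes.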
Alex Ingerman offers an overview of federated learning, compares traditional and federated ML workflows, and explores the current and upcoming use cases for decentralized machine learning, with examples from Google’s deployment of this technology. Along the way, you’ll also explore secure aggregation: using cryptography to simulate a trusted aggregator, adding another layer of security and privacy protections in the federated learning environment.
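The secure aggregation idea mentioned above can be illustrated with pairwise additive masking. This is a deliberately simplified sketch under my own assumptions (integer updates, a public modulus, no dropout handling); production protocols derive the pairwise masks via key agreement and tolerate devices going offline.

```python
# Toy sketch of secure aggregation via pairwise masking (illustrative only).
# Each pair of clients (i, j) shares a random mask: i adds it, j subtracts it.
# Individually masked values look random, but the masks cancel in the sum,
# so the server learns only the aggregate, not any single device's update.
import random

MODULUS = 1_000_003  # assumed public modulus for the toy example

def masked_updates(updates, modulus=MODULUS):
    masked = list(updates)
    n = len(updates)
    for i in range(n):
        for j in range(i + 1, n):
            mask = random.randrange(modulus)
            masked[i] = (masked[i] + mask) % modulus
            masked[j] = (masked[j] - mask) % modulus
    return masked

updates = [3, 5, 7]  # per-device integer updates (e.g., quantized gradients)
masked = masked_updates(updates)
total = sum(masked) % MODULUS
print(total)  # 15 — the sum is recovered, individual values stay hidden
```

The cryptography simulates the "trusted aggregator" role: no single party, including the server, ever sees an unmasked individual update.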
Alex Ingerman leads the product management team at Google Research, focusing on federated learning and other privacy-preserving technologies for machine learning. He joined Google in 2016 after working on products including an ML-as-a-service platform for developers, web-scale search, content recommendation systems, and immersive data-exploration environments. Alex holds a BS in computer science and an MS in medical engineering.
©2019, O'Reilly Media, Inc.