Machine learning applied to health care and other sensitive data holds huge potential but may be blocked if privacy is not adequately addressed. We review modern cryptographic techniques, such as homomorphic encryption and multi-party computation, and how they enable privacy-preserving machine learning. We then dive deeper with concrete examples in TensorFlow, using the open source library tf-encrypted, showing how predictions can be served without exposing the prediction input and how a model can be fitted without ever exposing the training data. Finally, we show how data scientists can get started with privacy-preserving techniques today, without needing to master cryptography.
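To give a flavor of the multi-party computation techniques the talk covers, the sketch below illustrates additive secret sharing, the core primitive behind many MPC protocols. This is a minimal toy in plain Python for intuition only; it is not tf-encrypted's actual API, and the modulus and share count are illustrative choices.

```python
import random

# Toy illustration of additive secret sharing (not tf-encrypted's API).
Q = 2**31 - 1  # public modulus; all arithmetic is done mod Q

def share(secret, num_shares=3):
    """Split a secret into random shares that sum to it mod Q."""
    shares = [random.randrange(Q) for _ in range(num_shares - 1)]
    shares.append((secret - sum(shares)) % Q)
    return shares

def reconstruct(shares):
    """Recover the secret by summing all shares mod Q."""
    return sum(shares) % Q

# Each party holds one share; no single share reveals the secret.
x_shares = share(25)
y_shares = share(17)

# Parties add their shares locally, computing x + y without ever
# seeing x or y in the clear.
z_shares = [(a + b) % Q for a, b in zip(x_shares, y_shares)]
print(reconstruct(z_shares))  # 42
```

Libraries like tf-encrypted build on this kind of primitive, distributing shares of model inputs and weights across servers so that predictions and training can run on data no single party ever sees in plaintext.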
Morten is co-founder and research scientist at Dropout Labs, a startup building a platform for secure, privacy-preserving machine learning to manage the sensitive, competitive, and regulated nature of data. He is also lead developer on tf-encrypted, an open source project for integrating and experimenting with privacy-preserving machine learning directly in TensorFlow. With a background in cryptography and privacy, Morten has spent recent years applying and adapting techniques from these fields to machine learning, focusing on practical tools and concrete applications in the hope of making the field more accessible to practitioners.
©2019, O'Reilly Media, Inc. • (800) 889-8969 or (707) 827-7019 • Monday-Friday 7:30am-5pm PT • All trademarks and registered trademarks appearing on oreilly.com are the property of their respective owners. • email@example.com