Engineer for the future of Cloud
June 10-13, 2019
San Jose, CA

Ghost in the machine: The unintended consequences of bias in machine learning

Nivia Henry (Spotify)
3:50pm–4:30pm Thursday, June 13, 2019
Emerging Tech
Location: LL21 E/F
Average rating: ***** (5.00, 6 ratings)



Prerequisite knowledge

  • A basic understanding of software development

What you'll learn

  • Learn how to reduce or eliminate biases in machine learning


From recommendation engines to deep learning algorithms that help detect cancer, machine learning is quickly becoming a normal part of our lives, which means more chances of encountering algorithms gone awry, such as Tay, the infamous racist Twitter bot. Every step of building an algorithm introduces a risk of injecting unintended bias, yet we rarely focus on ways to mitigate or eliminate these biases. Instead, we accept them as the cost of operating in this space.

Nivia Henry walks you through how an algorithm is built, the points at which errors such as algorithmic bias can creep in, and the steps you can take to catch, reduce, or eliminate those errors before they harm your users.
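As a hypothetical illustration of one such "catch" step (this sketch is not from the talk, and all names in it are invented): before shipping a model, you can compare its positive-prediction rate across demographic groups and flag large disparities, a common heuristic in fairness auditing.

```python
from collections import defaultdict

def selection_rates(predictions, groups):
    """Positive-prediction rate per group (a simple demographic parity check)."""
    pos = defaultdict(int)
    total = defaultdict(int)
    for pred, grp in zip(predictions, groups):
        total[grp] += 1
        pos[grp] += int(pred == 1)
    return {g: pos[g] / total[g] for g in total}

def flag_disparity(rates, threshold=0.8):
    """Flag any group whose rate falls below `threshold` times the highest
    group's rate (the 'four-fifths rule' heuristic from fairness auditing)."""
    top = max(rates.values())
    return {g: r / top < threshold for g, r in rates.items()}

# Toy data: the model approves group "a" far more often than group "b".
preds  = [1, 1, 1, 0, 1, 0, 0, 0, 1, 0]
groups = ["a", "a", "a", "a", "a", "b", "b", "b", "b", "b"]

rates = selection_rates(preds, groups)   # {'a': 0.8, 'b': 0.2}
flags = flag_disparity(rates)            # {'a': False, 'b': True}
```

Checks like this are cheap to run in a test suite, which is exactly where bias reviews tend to get skipped.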


Nivia Henry


Nivia S. Henry fundamentally believes that happy people, working in a healthy environment, will produce great outcomes. This is the philosophy behind her 15-plus-year career creating structures in which high-performing teams thrive. Today, Nivia plies her trade as a manager of engineering managers at Spotify. Her career path has included nearly every role in tech, but her true passion is inspiring people to do their best work. Nivia has cochaired one of the largest tracks for Agile Alliance, organized meetups, and spoken at conferences of all sizes. Her hobbies include being an overbearing mom to a gorgeous cat and traveling with her awesome husband, Andre. You can find her on Twitter and LinkedIn.