Presented By O’Reilly and Cloudera
Make Data Work
September 11, 2018: Training & Tutorials
September 12–13, 2018: Keynotes & Sessions
New York, NY

Perverse incentives in metrics: Inequality in the like economy

Bonnie Barrilleaux (LinkedIn)
2:55pm–3:35pm Wednesday, 09/12/2018
Data science and machine learning
Location: 1A 06/07 Level: Intermediate
Secondary topics: Ethics and Privacy, Media, Marketing, Advertising, Recommendation Systems
Average rating: 4.50 (4 ratings)

Who is this presentation for?

  • Analysts, data scientists, product managers, ranking and relevance engineers, and optimization enthusiasts

Prerequisite knowledge

  • A basic understanding of A/B testing and metrics (useful but not required)

What you'll learn

  • Understand the problem of perverse incentives in metrics
  • Learn how to identify and combat these negative side effects


Analytics data scientists at LinkedIn are responsible for creating metrics to evaluate product success. They’re very good at increasing any metric the company decides to optimize, but following a single metric blindly can lead to problems. For instance, as LinkedIn encouraged members to join conversations, it found itself in danger of creating a “rich get richer” economy in which a few creators got an increasing share of all feedback. Highly skewed distribution of feedback occurs naturally in any system that distributes content virally, but that doesn’t mean it’s good for creators. As a result, LinkedIn implemented changes to balance out this increasing skew and added creator-focused metrics to monitor how well the ecosystem is distributing feedback to creators, which gives the company a more balanced view of the ecosystem.
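The skew described above can be quantified. One common choice for measuring how unevenly feedback is distributed across creators is the Gini coefficient; this is an illustrative sketch, not necessarily the metric LinkedIn uses:

```python
def gini(counts):
    """Gini coefficient of non-negative per-creator feedback counts.

    0.0 means feedback is spread evenly across creators;
    values near 1.0 mean a few creators receive almost all of it.
    """
    xs = sorted(counts)
    n = len(xs)
    total = sum(xs)
    if n == 0 or total == 0:
        return 0.0
    # Standard rank-based formula over the sorted values (1-based ranks):
    # G = 2 * sum(i * x_i) / (n * total) - (n + 1) / n
    weighted = sum(rank * x for rank, x in enumerate(xs, start=1))
    return 2 * weighted / (n * total) - (n + 1) / n

# An even distribution gives a low Gini; winner-take-most gives a high one.
print(gini([10, 10, 10, 10]))  # 0.0
print(gini([0, 0, 0, 40]))     # 0.75
```

Tracking a statistic like this alongside total engagement is one way to notice a "rich get richer" dynamic before it becomes entrenched.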

Drawing on this example, Bonnie Barrilleaux explains why you must regularly reevaluate metrics to avoid perverse incentives—situations where efforts to increase the metric cause unintended negative side effects. Along the way, Bonnie explores a few illustrative historical examples to demonstrate that this problem has been common throughout human endeavors:

  • Rat control in Hanoi: Instead of reducing the rat population as intended, paying locals a bounty per rat tail led to rat farming and made the rat problem worse.
  • “Hoy No Circula” days in Mexico City: Instead of reducing air pollution and boosting public transit ridership as intended, banning certain cars from driving on certain days of the week actually increased air pollution, as families bought second cars and kept older, higher-polluting cars longer to get around the ban.

These examples remind us that metrics are always an imperfect proxy for reality. In the end, metrics are just a tool for humans to use. It’s important to ensure that the metrics incentivize teams to create real value for users, and it’s critical to regularly evaluate whether the metrics are still effectively driving you in the direction you really want to head. Bonnie shares key defensive strategies to help you do so:

  • Building a data-informed culture, in which data informs, but humans are always driving
  • Creating awareness of the problem so everyone is trained in how to spot it
  • Monitoring a balanced suite of metrics to detect and mitigate perverse incentives as they arise
  • Reevaluating metrics at regular intervals; if they’re not working, change them
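The "balanced suite" strategy can be sketched as a simple guardrail check on A/B test results. The metric names and tolerances below are hypothetical, purely for illustration: a change ships only if the target metric improves and no counter-metric regresses beyond its allowed tolerance.

```python
def should_ship(deltas, target, guardrails):
    """Decide whether an experiment variant is safe to ship.

    deltas:     relative metric changes measured in an A/B test,
                e.g. {"likes": 0.04, "new_creator_feedback": -0.002}
    target:     the metric the team is optimizing
    guardrails: maximum allowed drop per counter-metric,
                e.g. {"new_creator_feedback": 0.01}
    """
    if deltas.get(target, 0.0) <= 0:
        return False  # target metric did not improve
    for metric, tolerance in guardrails.items():
        if deltas.get(metric, 0.0) < -tolerance:
            return False  # a guardrail metric regressed too much
    return True

# Likes are up and the counter-metric barely moved: ship it.
print(should_ship({"likes": 0.04, "new_creator_feedback": -0.002},
                  target="likes",
                  guardrails={"new_creator_feedback": 0.01}))  # True

# Likes are up, but feedback to new creators dropped sharply: hold.
print(should_ship({"likes": 0.04, "new_creator_feedback": -0.05},
                  target="likes",
                  guardrails={"new_creator_feedback": 0.01}))  # False
```

The point of the pattern is that no single number gets to make the decision alone; the counter-metrics encode the side effects you have learned to watch for.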

Bonnie Barrilleaux


Bonnie Barrilleaux is a staff data scientist in analytics at LinkedIn, primarily focused on communities and the content ecosystem. She uses data to guide product strategy, performs experiments to understand the ecosystem, and creates metrics to evaluate product performance. Previously, she completed a postdoctoral fellowship in genomics at the University of California, Davis, studying the function of the MYC gene in cancer and stem cells. Bonnie has published peer-reviewed works including 11 journal articles, a book chapter, and a video article and has been awarded multiple grants to create interactive art. She holds a PhD in chemical engineering from Tulane University.