Prepare to Design the Future
March 19–20, 2017: Training
March 20–22, 2017: Tutorials & Conference
San Francisco, CA

Design patterns for data sharing

Sarah Gold (Projects by IF)
1:15pm–1:55pm Tuesday, March 21, 2017
Beyond the screen
Location: California West
Level: Beginner
Average rating: 5.00 (2 ratings)

Who is this presentation for?

  • Designers, developers, software engineers, and product owners

What you'll learn

  • Understand why seeking consent is a design problem and why the “terms and conditions” design pattern isn’t the only way to solve it
  • Learn to solve this problem by understanding the needs of your users and using different technological approaches

Description

Each time we use a connected product or service, we generate a tsunami of personal data. It’s difficult to grasp the scale of our individual data output because it’s so vast—the equivalent of around 27 million tweets every day.

Companies aggressively collect the data we generate because it fuels their businesses, whether by processing that information and selling it to advertisers or by tailoring their own products to our specific needs. Our data has become a valuable commodity, but while it is used in thousands of ways, we’re usually asked for permission only once and in only one way: by being shown an impenetrable list of terms and conditions and being asked to click “agree.” Once we’ve done that, we don’t have any say over what happens next.

That’s not good enough, and users know it. The Norwegian Consumer Council’s live reading of the terms and conditions of an average mobile app captured attention around the world, and the decision by WhatsApp to change its terms and conditions so it could share users’ data with Facebook drew collective criticism.

It’s clear this is a knotty ethical problem, but what’s less obvious is that it’s also a design challenge. As digital products reach more parts of our lives, we need new patterns for asking for consent that empower people and restore their trust.

IF has been compiling a catalogue of patterns for asking for people’s permission to access their data. Some of them occur frequently—bleak anti-patterns that harvest personal information—but others offer a brighter future, one in which people have a stake in how their data is used and a say in what happens next.

Sarah Gold outlines the kind of problems that come up when you take a new approach to data, discusses projects that offer new models of consent, and explores how people respond when you give them more agency over the data they generate.


Sarah Gold

Projects by IF

Sarah founded IF to work on ambitious projects involving technology, data, and networks in the public domain. A NESTA New Radical, Forbes 30 under 30 awardee, Tech For Good advisor, and member of the practitioner panel for the Research Institute in the Science of Cyber Security, she sees security and privacy not as matters of compliance but as design challenges.