Each time we use a connected product or service, we generate a tsunami of personal data. It’s difficult to grasp the scale of our individual data output because it’s so vast—the equivalent of around 27 million tweets every day.
Companies aggressively collect the data we generate because it fuels their businesses, whether by processing that information and selling it to advertisers or tailoring their own products to our specific needs. Our data has become a valuable commodity, but while it is used in thousands of ways, we’re usually only asked permission at one moment and in one way: by being shown an impenetrable list of terms and conditions and being asked to click “agree.” Once we’ve done that, we don’t have any say over what happens next.
That’s not good enough, and users know it. The Norwegian Consumer Council’s live reading of the terms and conditions of an average mobile app captured attention around the world, and the decision by WhatsApp to change its terms and conditions so it could share users’ data with Facebook drew collective criticism.
It’s clear this is a knotty ethical problem, but what’s less obvious is that it’s also a design challenge. As digital products reach into more parts of our lives, we need new patterns for asking consent—patterns that empower people and restore their trust.
IF has been compiling a catalogue of patterns for asking for people’s permission to access their data. Some of them occur frequently—bleak anti-patterns that harvest personal information—but others offer a brighter future, one in which people have a stake in how their data is used and a say in what happens next.
Sarah Gold outlines the kind of problems that come up when you take a new approach to data, discusses projects that offer new models of consent, and explores how people respond when you give them more agency over the data they generate.
Sarah founded IF to work on ambitious projects involving technology, data, and networks in the public domain. A NESTA New Radical, a Forbes 30 Under 30 awardee, a Tech For Good advisor, and a member of the practitioner panel for the Research Institute in the Science of Cyber Security, she sees security and privacy not as matters of compliance but as challenges for design.
©2017, O'Reilly Media, Inc.