Presented By O'Reilly and Cloudera
Make Data Work
December 1–3, 2015 • Singapore

How are your morals? Ethics in algorithms and IoT

Majken Sander, Joerg Blumtritt (Datarella)
4:50pm–5:30pm Thursday, 12/03/2015
IoT and Real-time
Location: 324 Level: Intermediate
Average rating: ★★★★ (4.00, 4 ratings)

Prerequisite Knowledge

An interest in how data is used, and a critical view of public data and/or privacy.


The code that turns things into smart things is not objective. Algorithms embed value judgments: decisions about methods, or pre-set program parameters. These are choices about how to handle a task according to social, cultural, or legal rules, or personal conviction.

Obvious examples of "ethics codes" are credit scoring or the differentiated pricing of a retail offer. But there is also a multitude of "hidden" ethics algorithms that are far more pervasive. When an ad network's targeting system selects which ads we see and which we don't, we might not find that too important. But when a search engine or news feed decides what it regards as relevant information to show us, and thereby shapes our view of the world without our knowing, it becomes far more important. And when we realize that self-driving cars will have to act according to some algorithm when a collision is unavoidable, possibly leading to injuries or even deaths, the question of ethics in algorithms becomes highly relevant.
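Even a tiny scoring function illustrates the point about parameter pre-sets. The following sketch is hypothetical and not from the talk; `credit_decision` and its default threshold are invented for illustration:

```python
# Hypothetical example: a default parameter as a hidden value judgment.

def credit_decision(income, debt, threshold=0.35):
    """Approve an applicant if their debt-to-income ratio is below `threshold`.

    The default of 0.35 is a value judgment baked into the code: whoever
    chose it decided how much risk applicants may carry, and most callers
    will never question it.
    """
    ratio = debt / income
    return ratio < threshold

# Two lenders running "the same" algorithm can reach opposite decisions
# purely through parameter pre-sets (ratio here is 20,000 / 50,000 = 0.40):
lenient = credit_decision(50_000, 20_000, threshold=0.45)  # approved
strict = credit_decision(50_000, 20_000)                   # rejected
```

Nothing in the function signature signals that the cutoff is a normative choice rather than a technical one, which is exactly why such defaults stay invisible to end users.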

This raises important questions about the transparency of these algorithms, including our ability, or just as importantly our inability, to change or influence the way an algorithm views us.

We need to address end users, who need greater awareness, education, and insight into the subjective algorithms that affect their lives. We also need to look at ourselves: data consumers, data analysts, and developers who, more or less knowingly, produce subjective answers through our choice of methods and parameters, often unaware of the bias we impose on a product, a company, and its users.

Sometimes the only way to uncover these presumptions is to "open the black box," that is, to hack. Alternatively, we can support the use of data with a "label of contents" and with design patterns as guidelines.

We will present some of these value judgments with examples and discuss their consequences. We will also present possible ways to resolve the problem: algorithm audits and standardized specifications, but also more visionary concepts like a “data ethics oath,” “algorithm angels,” and ethics design patterns that could guide developers in building their smart things.

Majken Sander

Majken Sander is a data nerd, business analyst, and solution architect. Majken has worked with IT, management information, analytics, BI, and DW for 20+ years. Armed with strong analytical expertise, she’s keen on “data driven” as a business principle, data science, the IoT, and all other things data.

Joerg Blumtritt
Joerg Blumtritt is the founder and CEO of Datarella, a computational social science startup delivering mobile analytics, self-tracking solutions, and data science consulting. After graduating from university with a thesis on machine learning, Joerg worked as a researcher in behavioral sciences, focused on nonverbal communication. His projects have been funded by the EU Commission, the German federal government, and the Max Planck Society. He subsequently ran marketing and research teams for the TV networks ProSiebenSat.1 and RTL II and the magazine publisher Hubert Burda Media. As European operations officer at Tremor Media, Joerg was in charge of building the New York-based video advertising network’s European enterprises. More recently, he was managing director of MediaCom Germany. Joerg is the founder and chairman of the German Social Media Association (AG Social Media) and the coauthor of the Slow Media Manifesto. Joerg blogs about big data and the future of social research, and about the Quantified Self.