Presented By O'Reilly and Cloudera
Make Data Work
December 1–3, 2015 • Singapore

When it absolutely, positively, has to be there: Reliability guarantees in Kafka

Gwen Shapira (Confluent)
2:20pm–3:00pm Wednesday, 12/02/2015
Hadoop & Beyond
Location: 328-329 Level: Intermediate
Average rating: 3.75 (4 ratings)

Prerequisite Knowledge

Basic knowledge of Hadoop.

Description

In the financial industry, losing data is unacceptable. Financial firms are adopting Kafka for their critical applications. Kafka provides the low latency, high throughput, high availability, and scale that these applications require. But can it also provide complete reliability? As a system architect, when asked, “Can you guarantee that we will always get every transaction?”, you want to be able to say “Yes” with total confidence.

In this session, we will go over everything that happens to a message – from producer to consumer – and pinpoint all the places where data can be lost if you are not careful. You will learn how developers and operations teams can work together to build a bulletproof data pipeline with Kafka. And if you need proof that you built a reliable system – we’ll show you how to build that verification into the system itself.
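As a rough illustration of the kind of producer and consumer settings this session covers, the sketch below collects commonly recommended Kafka reliability configurations (acks=all, bounded in-flight requests, and manual offset commits). The broker address and group id are placeholders, and the exact values to choose depend on your replication setup – treat this as a starting point, not the definitive recipe from the talk.

```java
import java.util.Properties;

public class ReliableKafkaConfig {

    // Producer settings aimed at not losing acknowledged messages.
    public static Properties producerProps() {
        Properties p = new Properties();
        p.put("bootstrap.servers", "broker1:9092");   // placeholder broker address
        p.put("acks", "all");                         // wait for all in-sync replicas
        p.put("retries", Integer.toString(Integer.MAX_VALUE)); // retry transient failures
        p.put("max.in.flight.requests.per.connection", "1");   // keep ordering across retries
        return p;
    }

    // Consumer settings: commit offsets only after a record is fully processed,
    // so a crash causes reprocessing rather than silent loss.
    public static Properties consumerProps() {
        Properties p = new Properties();
        p.put("bootstrap.servers", "broker1:9092");
        p.put("group.id", "reliable-pipeline");       // placeholder group id
        p.put("enable.auto.commit", "false");         // commit manually after processing
        p.put("auto.offset.reset", "earliest");       // don't skip data on a fresh group
        return p;
    }

    public static void main(String[] args) {
        System.out.println("producer acks = " + producerProps().getProperty("acks"));
        System.out.println("consumer auto-commit = " + consumerProps().getProperty("enable.auto.commit"));
    }
}
```

On the broker side, these client settings are usually paired with a replication factor of 3, `min.insync.replicas=2`, and unclean leader election disabled, so that `acks=all` actually guarantees multiple durable copies.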


Gwen Shapira

Confluent

Gwen Shapira is a system architect at Confluent, where she helps customers achieve success with their Apache Kafka implementations. She has 15 years of experience working with code and customers to build scalable data architectures, integrating relational and big data technologies. Gwen currently specializes in building real-time, reliable data-processing pipelines using Apache Kafka. Gwen is an Oracle ACE Director, the coauthor of Hadoop Application Architectures, and a frequent presenter at industry conferences. She is also a committer on Apache Kafka and Apache Sqoop. When Gwen isn’t coding or building data pipelines, you can find her pedaling her bike, exploring the roads and trails of California and beyond.