11:00am–11:40am Wednesday, 03/30/2016
Yvonne Quacken and Allen Hoem explore the business and technical challenges Siemens faced in capturing continuous data from millions of sensors across different areas, and explain how Teradata Listener helped Siemens simplify data capture with a single, central service that reliably ingests multiple real-time data streams simultaneously.
2:40pm–3:20pm Thursday, 03/31/2016
Oil and gas organizations are at the forefront of big data, adopting technologies such as Hadoop and Spark to develop next-generation fusion systems. Brian Clark and Marco Ippolito present a case study from CGG on building common data models to drive analytics of sensor data and associated metadata from fast-changing big data streams, showing how to derive richer value from big data assets.
2:40pm–3:20pm Wednesday, 03/30/2016
Modern houses and robots have a lot in common: both are full of sensors and must make many decisions. Unlike houses, however, robots adapt and perform helpful tasks. Brandon Rohrer details an algorithm designed to help houses, buildings, roads, and stores learn to actively help the people who use them.
5:10pm–5:50pm Wednesday, 03/30/2016
In the current explosion of the Internet of Things, big data, and mobile, compliance often takes a back seat. But failure to address legal privacy and consumer-protection considerations has landed many companies in hot water, exposing them to legal settlements and business failures. Alysa Hutnik and Kristi Wolff discuss flash points and proactive strategies to avoid becoming a target.
11:50am–12:30pm Thursday, 03/31/2016
Over the past decade, machine learning has become intertwined with newer, Internet-born businesses, despite the fact that the vast majority of global GDP turns on larger, less visible industries like energy and construction. David Beyer explores the ways these backbone industries are adopting machine-intelligent applications and the trends underlying this shift.
11:50am–12:30pm Thursday, 03/31/2016
High-velocity, high-volume, and high-variety data streams challenge analytics organizations because the ability to get critical insights often decays rapidly. Pat McGarry explains how organizations that embrace heterogeneous computing techniques can overcome hurdles to real-time insights, thereby gaining significant competitive advantages.
11:50am–12:30pm Wednesday, 03/30/2016
The Internet of Things (IoT) continues to provide value and hold promise for consumers and enterprises alike. To succeed, an IoT project must concern itself with how to ingest data, build actionable models, and react in real time. Chris Rawles describes approaches to addressing these concerns through a deep dive into an interactive demo centered on classification of human activities.
9:40am–10:00am Tuesday, 03/29/2016
The Quantified Self movement continues to grow apace. Trina Chiasson explains what we can learn from these hobbyist data hackers about how to make data fun, personally relevant, and actionable.
1:50pm–2:30pm Wednesday, 03/30/2016
In the race to pair streaming systems with stateful systems, the winners will be stateful systems that process streams natively. These systems remove the burden on application developers to be distributed systems experts and enable new applications to be both powerful and robust. John Hugg describes what’s possible when integrated systems apply a transactional approach to event processing.
1:50pm–2:30pm Thursday, 03/31/2016
Heron, Twitter's streaming system, has been in production for nearly two years and is widely used by several teams for diverse use cases. Karthik Ramasamy discusses Twitter's operating experiences and shares the challenges of running Heron at scale, as well as the approaches Twitter took to solve them.