February 23–26, 2020

Using actors for data-driven stream processing

Amanda Kabak (CleanSpark)
4:50pm–5:40pm Wednesday, February 26, 2020
Location: Sutton South
Secondary topics:  Case Study
Average rating: 4.25 (4 ratings)

Who is this presentation for?

  • Architects and senior software engineers

Level

Intermediate

Description

There are many methods available for processing streaming data, specifically for the IoT space, but every site or project you need to support may require a processing pipeline unique to its configuration. Yes, Amanda Kabak explains, serverless functions are cool, but wiring up sequences of them by hand or even coding multiple steps into a single function’s execution isn’t a sustainable answer to this problem, and processing thousands of incoming data points a minute can lead to large operating costs.

Enter actors. Actors are small bits of code that execute in a single-threaded way and own internal state that only they can mutate. Within an actor platform, actors provide highly scalable processing power, but the true value of actors in stream processing comes not from each actor individually but from how they can be networked together. Small actors that each execute a simple, testable calculation can be combined into a data-driven calculation mesh that provides high-level derivations in a thread-safe way for huge amounts of streaming data.
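To make the concept concrete, here is a minimal, standalone sketch of an actor in Python: a mailbox drained by a single thread, so the actor's state is mutated by exactly one thread and needs no locks. This is an illustration of the pattern only, not the talk's Service Fabric implementation, and the running-total state is a hypothetical example.

```python
import queue
import threading

class Actor:
    """Minimal actor: a mailbox drained by one dedicated thread,
    so the internal state is only ever mutated single-threaded."""

    def __init__(self):
        self._mailbox = queue.Queue()
        self._state = {"count": 0, "total": 0.0}  # private to the actor
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def send(self, message):
        """Any thread may send; only the actor's thread processes."""
        self._mailbox.put(message)

    def _run(self):
        while True:
            msg = self._mailbox.get()
            if msg is None:  # poison pill: shut down
                break
            # Single-threaded state mutation: no locks required.
            self._state["count"] += 1
            self._state["total"] += msg

    def stop(self):
        self._mailbox.put(None)
        self._thread.join()

actor = Actor()
for reading in [1.5, 2.5, 4.0]:  # e.g. incoming sensor readings
    actor.send(reading)
actor.stop()
print(actor._state["count"], actor._state["total"])  # 3 8.0
```

Production actor frameworks (Service Fabric Reliable Actors, Akka, and others) supply this mailbox-and-dispatch machinery for you, along with placement, persistence, and scale-out.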

Imagine, for example, an industrial IoT monitoring solution that can be deployed across sites with different equipment configurations. The only things executives want to know are total downtime in their plant and total cost savings, but those are calculated and aggregated differently based on the type and number of equipment monitored at each site. With a calculation mesh running across networked actors, small, exhaustively tested calculation steps can be combined in unique ways using metadata-based configuration rather than code customization to provide valuable insights from streaming data.
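The metadata-driven composition idea can be sketched as follows. The step names, parameters, and site configurations below are hypothetical; the point is that each calculation step is tiny and exhaustively testable on its own, and a site's pipeline is assembled from configuration records rather than custom code.

```python
# Registry of small, individually testable calculation steps.
CALC_STEPS = {
    "scale":       lambda v, params: v * params.get("factor", 1.0),
    "offset":      lambda v, params: v + params.get("amount", 0.0),
    "kwh_to_cost": lambda v, params: v * params.get("rate", 0.5),
}

def build_pipeline(site_config):
    """Turn a list of {step, params} records into one callable."""
    steps = [(CALC_STEPS[record["step"]], record.get("params", {}))
             for record in site_config]

    def run(value):
        for fn, params in steps:
            value = fn(value, params)
        return value

    return run

# Two sites derive the same metric differently, driven purely by metadata:
site_a = build_pipeline([
    {"step": "scale", "params": {"factor": 4.0}},
    {"step": "kwh_to_cost", "params": {"rate": 0.25}},
])
site_b = build_pipeline([{"step": "kwh_to_cost"}])

print(site_a(100.0))  # 100.0  (100 * 4.0 * 0.25)
print(site_b(100.0))  # 50.0   (100 * default rate 0.5)
```

In the actor-based version each step would run as its own actor with messages flowing between them, so the mesh stays thread-safe and scales out; the configuration records play the same role either way.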

You’ll learn about actors conceptually and explore a case study of this approach running in Microsoft’s Service Fabric, on a platform that monitors and controls renewable energy assets.

Prerequisite knowledge

  • A basic understanding of stream processing

What you'll learn

  • Discover actors and the actor pattern as a concept
  • Identify a different approach to stream processing

Amanda Kabak

CleanSpark

Amanda Kabak is the CTO and principal architect at CleanSpark, a startup focused on monitoring, control, and optimization of electric microgrids. She’s been architecting and developing cloud-native industrial IoT solutions for the last six years. For a decade and a half before that, she developed enterprise-level line-of-business applications.

