Build resilient systems at scale
October 12–14, 2015 • New York, NY

Is the internet reliable yet?

Robert Peters (Verizon Digital Media Services)
10:10am–10:20am Tuesday, 10/13/2015
Keynote
Location: Grand Ballroom
Average rating: 3.87 (38 ratings)
Slides: 1-PPTX

The internet and the web are well on their way to becoming foundational and ubiquitous technologies throughout the world. People want to put them in the same category as other technologies we take for granted and expect to “just work”: electricity in our homes, clean running water, our cars. But it’s hard to say that the internet “just works” – for instance, unbiased third-party measurements of real-user data show that even the highest-performing global content delivery networks (CDNs) and cloud-delivery services achieve at best ~99% availability for simple single-object downloads. That puts the internet’s reliability more on par with spaceflight than with, say, automobiles. We need to do better if we want the web to serve as a basic building block of larger systems.
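As a rough back-of-the-envelope illustration of what that figure implies (the arithmetic below is our own, not a measurement from the talk), ~99% availability means roughly one failed request per hundred, or on the order of ninety hours of accumulated unavailability per year:

```python
# Back-of-the-envelope: what does ~99% availability imply?
# Illustrative arithmetic only; 0.99 is the rough figure cited above.

availability = 0.99                      # fraction of requests that succeed
failure_rate = 1 - availability          # fraction of requests that fail

hours_per_year = 365.25 * 24
downtime_hours = failure_rate * hours_per_year

print(f"Failed requests: about {failure_rate * 100:.0f} in every 100")
print(f"Equivalent downtime: about {downtime_hours:.0f} hours per year")
# Failed requests: about 1 in every 100
# Equivalent downtime: about 88 hours per year
```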

In this talk, we’ll look at internet reliability in terms of the failures that can occur across a wide range of timescales and impact levels – from the once-a-millisecond scale of the barely noticed microfailures that happen when a packet is lost, to the once-a-year scale of the major outages of a cloud provider, infrastructure service, or internet backbone that cause online panic and become trending topics. We’ll use examples to review and illustrate two general approaches to improving reliability:

  1. Reducing failure rates across all time/impact scales, with very different techniques being appropriate at each scale
  2. Reducing failure impact: de-escalating high-impact failures into lower-impact ones
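As a minimal sketch of the second approach (the URLs and timeout below are hypothetical placeholders, not code from the talk), a request that would otherwise fail outright can be de-escalated into a slower but successful fetch from a fallback origin:

```python
import urllib.request
import urllib.error

# Hypothetical endpoints; a real multi-origin setup would substitute its own.
PRIMARY = "https://cdn-primary.example.com/object"
FALLBACK = "https://cdn-fallback.example.com/object"

def fetch_with_fallback(primary: str, fallback: str, timeout: float = 2.0) -> bytes:
    """De-escalate a high-impact failure (no content served at all)
    into a lower-impact one (content served from a slower fallback)."""
    try:
        with urllib.request.urlopen(primary, timeout=timeout) as resp:
            return resp.read()
    except (urllib.error.URLError, TimeoutError):
        # The primary failed or timed out; the user still gets the object,
        # just with degraded latency, instead of seeing an error.
        with urllib.request.urlopen(fallback, timeout=timeout) as resp:
            return resp.read()

data = fetch_with_fallback(PRIMARY, FALLBACK)
```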

Robert Peters

Verizon Digital Media Services

Rob Peters is CTO for Verizon Digital Media Services. He was one of the early members of the engineering team at VDMS (then EdgeCast), and spent many years working with the company’s core HTTP cache/proxy engine and its surrounding infrastructure ecosystem of software and hardware. He enjoys deliberating about deployment methodologies, looking at graphs and grids full of colorful numbers, and finding analogies between biological systems and technology.

Rob completed a Ph.D. in Computation and Neural Systems at the California Institute of Technology and spent several years in postdoctoral work at the University of Southern California building biologically inspired computer vision software. You can find Rob online at @rjpcal.
