Sep 9–12, 2019

Distributed deep learning via containerized decoupled neural interfaces

Michael Bauer (Sylabs, Inc.)
11:55am–12:35pm Wednesday, September 11, 2019
Location: LL21 C/D

Who is this presentation for?

  • Data scientists, machine learning engineers, HPC programmers, and distributed systems engineers

Level

Intermediate

Description

The traditional approach to training neural networks necessitates both forward and backward locking during the feed-forward and backpropagation stages, respectively. A side effect of this locking is a network that cannot be trained in a distributed manner, limiting the training phase to batch-based data parallelism. In their groundbreaking 2016 paper “Decoupled Neural Interfaces Using Synthetic Gradients,” Jaderberg et al. propose an architecture in which each layer (or set of layers) in a neural network can be trained asynchronously as a decoupled, individual module, eliminating both forward and backward locking from the training stage. In other words, their decoupled neural interface (DNI) scheme results in a network architecture that can exploit distributed parallelism at scale, and such scaling becomes increasingly important as the complexity of the network increases.
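
To make the idea concrete, here is a minimal sketch of synthetic-gradient training in PyTorch. (This is an illustration under assumptions of my own, not code from the talk or the paper; the layer sizes and the simple linear synthetic-gradient model sg1 are arbitrary choices.) Module f1 updates its weights using a learned prediction of the gradient of the loss with respect to its own activations, so it never waits for the true backward pass:

    import torch
    import torch.nn as nn

    torch.manual_seed(0)

    # Two modules: f1 feeds f2. sg1 is f1's synthetic-gradient model; it
    # predicts dLoss/dh from the activation h alone, so f1 can update
    # without backward locking.
    f1 = nn.Sequential(nn.Linear(10, 32), nn.ReLU())
    sg1 = nn.Linear(32, 32)
    f2 = nn.Linear(32, 1)
    opt1 = torch.optim.SGD(f1.parameters(), lr=0.01)
    opt2 = torch.optim.SGD(f2.parameters(), lr=0.01)
    opt_sg = torch.optim.SGD(sg1.parameters(), lr=0.01)
    loss_fn = nn.MSELoss()
    x, y = torch.randn(16, 10), torch.randn(16, 1)

    for step in range(200):
        # Module 1 updates immediately with the *predicted* gradient.
        h = f1(x)
        synth_grad = sg1(h.detach())          # predicted dLoss/dh
        opt1.zero_grad()
        h.backward(synth_grad.detach())       # local backward; no waiting on f2
        opt1.step()

        # Module 2 trains normally against the task loss.
        h2 = h.detach().requires_grad_(True)
        loss = loss_fn(f2(h2), y)
        opt2.zero_grad()
        loss.backward()
        opt2.step()

        # The true gradient dLoss/dh is used only to improve the
        # synthetic-gradient model.
        sg_loss = loss_fn(synth_grad, h2.grad.detach())
        opt_sg.zero_grad()
        sg_loss.backward()
        opt_sg.step()

Because the true gradient is used only to train the synthetic-gradient model, nothing couples the two modules' weight updates, and they can in principle run asynchronously on separate machines.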

Michael Bauer explores a novel application of DNIs to the construction of highly distributed and scalable neural networks using the compute-optimized container runtime Singularity. By taking advantage of features built into Singularity (such as native GPU, InfiniBand, and MPI support), it’s possible to build a container that functions as a DNI module, abstracting away the module’s implementation details. The resulting container can then be replicated and scheduled across any compute resource to create a distributed, decoupled, and asynchronously trained neural network of arbitrarily large scale and complexity.
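
As a rough illustration of the deployment side (an assumption on my part; the image name dni_module.sif and the entrypoint script are hypothetical, not artifacts from the talk), each module replica can be launched from a single Singularity image, with Singularity’s --nv flag binding the host’s NVIDIA GPU stack into the container:

    import subprocess

    # Launch N replicas of a hypothetical containerized DNI module.
    # "singularity exec --nv" runs a command inside the image with the
    # host's NVIDIA GPU drivers mapped in; the image name and script
    # path below are placeholders.
    N_MODULES = 4
    procs = [
        subprocess.Popen([
            "singularity", "exec", "--nv", "dni_module.sif",
            "python", "/opt/dni/module.py",
            "--rank", str(rank), "--world-size", str(N_MODULES),
        ])
        for rank in range(N_MODULES)
    ]
    for p in procs:
        p.wait()

On a real cluster the same image would more likely be scheduled through MPI (e.g., mpirun -np 4 singularity exec --nv dni_module.sif ...) or a batch scheduler, which is where Singularity’s native MPI and InfiniBand support comes in.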

Prerequisite knowledge

  • A basic understanding of the fundamentals of neural networks (feed-forward, layers, backpropagation, activations, error functions, gradient descent, etc.)

What you'll learn

  • Understand how containerization technology and DNIs can abstract away neural network implementation details to build complex networks on large compute resources

Michael Bauer

Sylabs, Inc.

Michael Bauer is a senior software engineer at Sylabs and an expert in Linux container technologies. At Sylabs, he’s the lead engineer of the core services team, providing technical oversight and direction for products such as Singularity, SingularityPRO, and various Kubernetes integrations. Michael has been involved with the Singularity open source project for almost three years, first as a contributor and now as a project lead and maintainer. He’s given talks about Singularity and Linux containers at conferences around the world, including ISC, SC, and FOSDEM. Recently, he’s been exploring novel approaches to machine learning via container technology.
