Presented By O'Reilly and Cloudera
Make Data Work
22–23 May 2017: Training
23–25 May 2017: Tutorials & Conference
London, UK

Distributed deep learning on AWS using Apache MXNet

Anima Anandkumar (UC Irvine)
13:30–17:00 Tuesday, 23 May 2017
Data science and advanced analytics
Location: Capital Suite 13
Secondary topics: Cloud, Deep learning
Level: Advanced
Average rating: 3.67 (3 ratings)

Who is this presentation for?

  • Data scientists, software engineers, and machine-learning researchers

Prerequisite knowledge

  • Familiarity with Python and the Jupyter Notebook
  • Experience with machine learning (useful but not required)

Materials or downloads needed in advance

  • A laptop
  • An AWS account (useful but not required)

What you'll learn

  • Explore deep learning fundamentals
  • Learn how to run deep learning models, develop new code with MXNet, and deploy it on AWS

Description

Apache MXNet is a next-generation deep learning framework designed for both efficiency and flexibility: it lets you mix symbolic and imperative programming to maximize efficiency and productivity. At its core is a dynamic dependency scheduler that automatically parallelizes both symbolic and imperative operations on the fly, with a graph optimization layer on top that makes symbolic execution fast and memory efficient. The library is portable and lightweight, and it scales from multiple GPUs across multiple machines down to embedded systems such as smartphones and embedded GPUs.
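
A minimal sketch (not from the session materials) of what mixing the two styles looks like in MXNet's Python API, where `mx.nd` is the imperative NDArray interface and `mx.sym` the symbolic one; the layer name `fc` and all shapes are illustrative:

```python
import mxnet as mx

# Imperative style: NDArray operations execute immediately, NumPy-like
a = mx.nd.ones((2, 3))
b = a * 2 + 1                      # computed eagerly on the chosen device
print(b.asnumpy())

# Symbolic style: declare a graph first, then bind shapes/devices and run it
x = mx.sym.Variable('x')
fc = mx.sym.FullyConnected(data=x, num_hidden=4, name='fc')
exe = fc.simple_bind(ctx=mx.cpu(), x=(2, 3))   # allocates arrays for args and outputs

# simple_bind leaves parameters uninitialized; set them before running
exe.arg_dict['fc_weight'][:] = 0.1
exe.arg_dict['fc_bias'][:] = 0.0
exe.forward(is_train=False, x=mx.nd.ones((2, 3)))
print(exe.outputs[0].asnumpy())
```

Behind the scenes, the dependency scheduler is free to reorder and parallelize both kinds of operations, which is why the two styles compose cleanly in one program.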

Anima Anandkumar provides hands-on experience using Apache MXNet with preconfigured Deep Learning AMIs and CloudFormation templates to speed up your development.

Topics include:

  • Background on deep learning
  • A walkthrough of setting up AMIs, CloudFormation templates, and other deep learning frameworks on AWS
  • A peek under the hood at Apache MXNet internals and a comparison with other deep learning frameworks
  • Hands on with Apache MXNet: NDArrays, symbols, and the mechanics of training deep neural networks (see the sketch after this list)
  • Hands on with Apache MXNet: application examples targeting computer vision and recommendation engines
  • A live demo that trains ImageNet on multiple AWS GPU instances and reaches state-of-the-art accuracy within a single hour
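
To make the hands-on portion concrete, here is a small, hypothetical sketch of training a network with MXNet's Module API; the toy data, layer sizes, and hyperparameters are made up, and on an AWS GPU instance `context` could instead be a list such as `[mx.gpu(0), mx.gpu(1)]` for data-parallel training:

```python
import numpy as np
import mxnet as mx

# Toy data (hypothetical): 1,000 samples, 20 features, 2 classes
X = np.random.rand(1000, 20).astype('float32')
y = np.random.randint(0, 2, (1000,))
train_iter = mx.io.NDArrayIter(X, y, batch_size=32, shuffle=True)

# Symbolic network definition
data = mx.sym.Variable('data')
net = mx.sym.FullyConnected(data=data, num_hidden=64, name='fc1')
net = mx.sym.Activation(data=net, act_type='relu', name='relu1')
net = mx.sym.FullyConnected(data=net, num_hidden=2, name='fc2')
net = mx.sym.SoftmaxOutput(data=net, name='softmax')

# Module handles parameter allocation, optimization, and device placement;
# pass a list of GPU contexts to split each batch across devices
mod = mx.mod.Module(symbol=net, context=mx.cpu())
mod.fit(train_iter,
        num_epoch=5,
        optimizer='sgd',
        optimizer_params={'learning_rate': 0.1},
        eval_metric='acc')
```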

Anima Anandkumar

UC Irvine

Anima Anandkumar is a principal scientist at Amazon Web Services. Anima is currently on leave from UC Irvine, where she is an associate professor. Her research interests are in the areas of large-scale machine learning, nonconvex optimization, and high-dimensional statistics. In particular, she has been spearheading the development and analysis of tensor algorithms. Previously, she was a postdoctoral researcher at MIT and a visiting researcher at Microsoft Research New England. Anima is the recipient of several awards, including the Alfred P. Sloan Fellowship, the Microsoft Faculty Fellowship, a Google Research Award, the ARO and AFOSR Young Investigator Awards, the NSF CAREER Award, the Early Career Excellence in Research Award at UCI, the Best Thesis Award from the ACM SIGMETRICS society, the IBM Fran Allen PhD Fellowship, and several best paper awards. She has been featured in a number of forums, such as the Quora ML session, Huffington Post, Forbes, and O'Reilly Media. Anima holds a BTech in electrical engineering from IIT Madras and a PhD from Cornell University.
