O'Reilly Artificial Intelligence Conference, presented by O'Reilly and Intel AI
Training: September 4-5, 2018; Tutorials & Conference: September 5-7, 2018
San Francisco, CA

Vikram Saletore
Principal Engineer and Performance Architect, Intel

Vikram Saletore is a principal engineer and a performance architect in the Customer Solutions, Artificial Intelligence Products, and Data Center Groups at Intel. Vikram leads performance optimizations for distributed machine learning (ML) and deep learning (DL) workloads and collaborates with industry enterprise and government partners, OEMs, HPC, and CSP customers on deep learning scale-out training and inference and machine learning analytics on Intel architectures. Vikram is also a technical coprincipal investigator for distributed deep learning research with European members of Intel’s Parallel Computing Center. Vikram has 25+ years of experience and has led many data center initiatives. As a research scientist with Intel Labs, he led research collaboration with HP Labs. Prior to Intel, Vikram was a tenure-track faculty member in the Computer Science Department at Oregon State University and led NSF-funded research in parallel programming and distributed computing, supervising eight graduate students. He also worked at DEC and AMD. He has many patents and has authored ~45 peer-reviewed research publications. Vikram holds a PhD in EE with a focus on parallel programming and distributed computing from the University of Illinois Urbana-Champaign and an MS from UC Berkeley.

Sessions

11:05am-11:45am Friday, September 7, 2018
Location: Continental 7-9
Tags: intel
Vikram Saletore (Intel), Lucas Wilson (Dell EMC)
Vikram Saletore and Lucas Wilson discuss a collaboration between SURFsara and Intel to advance the state of large-scale neural network training on Intel Xeon CPU-based servers, highlighting improved time to solution on extended training of pretrained models and exploring how various storage and interconnect options lead to more efficient scaling.