Presented by O’Reilly and Intel Nervana
Put AI to work
September 17-18, 2017: Training
September 18-20, 2017: Tutorials & Conference
San Francisco, CA

How to escape saddle points efficiently

Michael Jordan (UC Berkeley)
8:55am–9:10am Wednesday, September 20, 2017
Location: Grand Ballroom B
Average rating: 4.60 (10 ratings)

Many new theoretical challenges have arisen in the area of gradient-based optimization for large-scale data analysis, driven by the needs of applications and the opportunities provided by new hardware and software platforms. Drawing on work undertaken with Chi Jin, Rong Ge, Praneeth Netrapalli, and Sham Kakade, Michael Jordan shares recent research on the avoidance of saddle points in high-dimensional nonconvex optimization.
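The core idea behind this line of work, perturbed gradient descent, can be illustrated with a toy sketch: when the gradient becomes very small (a candidate saddle point or minimum), add a small random perturbation so that the iterate can slide off a strict saddle along a descent direction. The objective, step sizes, and thresholds below are hypothetical choices for illustration, not the algorithm or constants from the talk:

```python
import numpy as np

def f(x):
    # Toy nonconvex objective with a strict saddle at the origin
    # and minima at (0, +/- 1/sqrt(2)).
    return x[0]**2 - x[1]**2 + x[1]**4

def grad(x):
    return np.array([2 * x[0], -2 * x[1] + 4 * x[1]**3])

def perturbed_gd(x0, eta=0.05, radius=1e-3, g_thresh=1e-4,
                 steps=500, seed=0):
    """Gradient descent with random perturbations at small-gradient points.

    eta, radius, g_thresh, and steps are illustrative parameters."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        g = grad(x)
        if np.linalg.norm(g) < g_thresh:
            # Near a stationary point: perturb to escape a possible saddle.
            x = x + rng.uniform(-radius, radius, size=x.shape)
        else:
            x = x - eta * g
    return x

# Started exactly at the saddle, plain gradient descent would never move;
# the perturbation lets the iterate escape toward a local minimum.
x_final = perturbed_gd([0.0, 0.0])
```

Because the gradient vanishes at the origin, an unperturbed iteration would stay there forever; the random kick gives the iterate a component along the negative-curvature direction, which gradient descent then amplifies until it reaches one of the minima.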


Michael Jordan

UC Berkeley

Michael I. Jordan is the Pehong Chen Distinguished Professor in the Department of Electrical Engineering and Computer Sciences and the Department of Statistics at the University of California, Berkeley. His research interests bridge the computational, statistical, cognitive, and biological sciences; in recent years, he has focused on Bayesian nonparametric analysis, probabilistic graphical models, spectral methods, kernel machines, and applications to problems in distributed computing systems, natural language processing, signal processing, and statistical genetics. Previously, he was a professor at MIT. Michael is a member of the National Academy of Sciences, the National Academy of Engineering, and the American Academy of Arts and Sciences, and a fellow of the American Association for the Advancement of Science, the AAAI, ACM, ASA, CSS, IEEE, IMS, ISBA, and SIAM. He has been named a Neyman Lecturer and a Medallion Lecturer by the Institute of Mathematical Statistics. He received the David E. Rumelhart Prize in 2015 and the ACM/AAAI Allen Newell Award in 2009. Michael holds a master’s degree in mathematics from Arizona State University and a PhD in cognitive science from the University of California, San Diego.