Online learning techniques, such as Stochastic Gradient Descent (SGD) and Contrastive Divergence, have proven quite useful in applied machine learning. However, their sequential design prevents them from taking advantage of distributed frameworks such as Hadoop MapReduce. In this session, we will look at how to parallelize Deep Belief Networks for Deep Learning on Hadoop's next-generation YARN framework using IterativeReduce and the parallel machine learning library Metronome. We'll also look at some real-world applications of Deep Learning on Hadoop, such as image classification and NLP.
Josh Patterson currently runs a consultancy in the Big Data machine learning space. Previously, Josh worked as a Principal Solutions Architect at Cloudera and as an engineer at the Tennessee Valley Authority, where he was responsible for bringing Hadoop into the smart grid through his involvement in the openPDC project. Josh holds a Master of Science in Computer Science from the University of Tennessee at Chattanooga, where he did research in mesh networks and social insect swarm algorithms. Josh has over 15 years of experience in software development and continues to contribute to projects such as Apache Mahout, Metronome, IterativeReduce, openPDC, and JMotif.
Adam Gibson is a deep learning specialist based in San Francisco who assists Fortune 500 companies, hedge funds, PR firms, and startup accelerators with their machine learning projects. Adam has a strong track record of helping companies handle and interpret large volumes of real-time data. Adam has been a computer nerd since he was 13 and actively contributes to the open source community.