Presented at the recent O’Reilly OSCON – Open Source Convention 2014 by Josh Patterson (Patterson Consulting) and Adam Gibson (Skymind.io) is “Introduction to Parallel Iterative Deep Learning on Hadoop’s Next-Generation YARN Framework.”
Online learning techniques such as Stochastic Gradient Descent (SGD) and Contrastive Divergence have proven quite useful in applied machine learning. However, their sequential design prevents them from taking advantage of newer distributed frameworks such as Hadoop/MapReduce. This talk reviews how to parallelize the training of Deep Belief Networks using the IterativeReduce pattern on Hadoop’s next-generation YARN framework together with the parallel machine learning library Metronome. You’ll also take a look at some real-world applications of Deep Learning on Hadoop, such as image classification and NLP.
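To see why sequential SGD can still be parallelized in an iterative map/reduce style, here is a minimal illustrative sketch in Python/NumPy of parameter-averaged SGD — each "worker" runs plain SGD on its own data shard (the map step), and a master averages the resulting weight vectors and rebroadcasts them (the reduce step). This is a simplified stand-in for the IterativeReduce pattern described in the talk, not Metronome's actual API; all function names and the toy data are invented for illustration.

```python
import numpy as np

def local_sgd(w, X, y, lr=0.01, epochs=1):
    """One worker's pass of plain sequential SGD over its data shard
    (squared-error loss on a linear model)."""
    w = w.copy()
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            grad = (xi @ w - yi) * xi  # gradient of 0.5*(x·w - y)^2
            w -= lr * grad
    return w

def iterative_reduce_sgd(shards, dim, rounds=20):
    """Master loop: broadcast current weights, run local SGD on each
    shard in parallel (map), then average the results (reduce)."""
    w = np.zeros(dim)
    for _ in range(rounds):
        local_ws = [local_sgd(w, X, y) for X, y in shards]  # map step
        w = np.mean(local_ws, axis=0)                       # reduce step
    return w

# Toy data: y = 2*x0 + 3*x1, split across two hypothetical "workers".
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = X @ np.array([2.0, 3.0])
shards = [(X[:100], y[:100]), (X[100:], y[100:])]
w = iterative_reduce_sgd(shards, dim=2)
print(w)  # converges toward [2., 3.]
```

In a real YARN deployment the map step would run as containers over HDFS blocks rather than a Python list comprehension, but the averaging structure is the same.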
Slides for the presentation are available HERE.
Download the latest YARN white papers from the insideBIGDATA White Paper Library.