Neural networks have driven breakthrough results in computer vision, speech processing, machine translation, and reinforcement learning. They are powerful function approximators and are an elegant and fascinating family of algorithms.
Laura Graesser offers a hands-on introduction to neural networks using the popular Python library Keras, focusing on what neural networks are, why they are powerful algorithms, and why they have their particular structure. Laura begins by introducing the core components of a neural network: nodes, weights, biases, activation functions, and layers. She then walks you through building a deep feed-forward network step by step. Along the way, Laura explains how a neural network learns and covers the backpropagation algorithm.
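To make those components concrete, here is a minimal NumPy sketch (not the talk's actual Keras code) of a tiny feed-forward network with weights, biases, and a sigmoid activation, plus one backpropagation step. The layer sizes, learning rate, and sample input are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative layer shapes: 3 inputs -> 4 hidden nodes -> 1 output node.
W1, b1 = rng.normal(size=(4, 3)) * 0.5, np.zeros(4)
W2, b2 = rng.normal(size=(1, 4)) * 0.5, np.zeros(1)

def forward(x):
    """Forward pass: each layer computes activation(W @ x + b)."""
    h = sigmoid(W1 @ x + b1)   # hidden layer activations
    y = sigmoid(W2 @ h + b2)   # output layer activation
    return h, y

x = np.array([0.2, -0.1, 0.4])   # one made-up input example
target = np.array([1.0])
lr = 0.5                         # learning rate (assumed)

h, y = forward(x)
loss_before = float(np.sum((y - target) ** 2))

# Backpropagation: apply the chain rule layer by layer.
# The derivative of sigmoid(z) is s * (1 - s) where s = sigmoid(z).
dy = 2 * (y - target) * y * (1 - y)   # gradient at the output node
dW2, db2 = np.outer(dy, h), dy
dh = (W2.T @ dy) * h * (1 - h)        # gradient pushed back to the hidden layer
dW1, db1 = np.outer(dh, x), dh

# Gradient-descent update of every weight and bias.
W1 -= lr * dW1; b1 -= lr * db1
W2 -= lr * dW2; b2 -= lr * db2

_, y_after = forward(x)
loss_after = float(np.sum((y_after - target) ** 2))
```

After this single update, the loss on the example shrinks, which is the whole mechanism of learning the talk describes: compute gradients backward through the layers, then nudge every weight and bias downhill.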
You’ll then build a neural network in Keras to solve a classic classification problem: identifying handwritten digits from grayscale images. Laura concludes by exploring the challenges that arise when training neural networks, with a focus on overfitting, as well as a potential remedy: regularization.
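A minimal Keras sketch of such a model, assuming an MNIST-style input of 784 grayscale pixel values; the layer sizes and dropout rate here are illustrative choices, not the talk's actual code. The Dropout layer is one common form of regularization: it randomly zeroes out a fraction of node activations during training, which discourages overfitting.

```python
from tensorflow import keras
from tensorflow.keras import layers

# A small feed-forward classifier for 28x28 grayscale digits,
# flattened to 784 input values (sizes assumed for illustration).
model = keras.Sequential([
    keras.Input(shape=(784,)),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.2),   # regularization: randomly drop 20% of activations
    layers.Dense(10, activation="softmax"),   # one output per digit class
])
model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
```

Training would then be a call to `model.fit(...)` on the MNIST images and labels; it is omitted here to keep the sketch self-contained.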
Laura Graesser is a graduate student at New York University, where she is working toward a master’s degree in computer science with a focus on machine learning. In her spare time, Laura enjoys experimenting with and writing about machine learning techniques. Laura is particularly interested in neural networks and their application to computer vision problems, cross-fertilization between computer vision and NLP, and the representations perspective.
©2017, O'Reilly Media, Inc.