Using topological data analysis to understand, build, and improve neural networks

Gunnar Carlsson (Ayasdi)
9:00am–12:30pm Tuesday, April 16, 2019
Models and Methods
Location: Sutton South
Secondary topics: Deep Learning and Machine Learning tools, Models and Methods

Who is this presentation for?

  • CDOs, data scientists, and data engineers

Level

Advanced

Prerequisite knowledge

  • A solid understanding of neural networks
  • Familiarity with MNIST and SVHN (useful but not required)

What you'll learn

  • Explore methods for determining how and what neural nets learn and for dramatically improving their performance and accuracy
  • Learn a framework for generalizing and adapting efficient architectures for neural nets based on the underlying data

Description

Neural networks have demonstrated a great deal of success on many kinds of data, including images, text, and time series. One issue that restricts their applicability, however, is that we don't understand in any detail how they work. A related problem is their tendency to overfit to particular datasets, which opens the door to adversarial behavior. For these reasons, methods for understanding the internal states of neural networks are desirable.

Gunnar Carlsson explains how to use topological data analysis (TDA) to describe the functioning and learning of a neural network in a compact and understandable way. The approach yields material performance improvements (in both training time and accuracy) and enables data-type customization of neural network architectures, further boosting performance and widening the method's applicability across datasets. Gunnar demonstrates how to adapt the architecture to much more general data types, showing you how to construct an architecture for a single dataset: a custom neural network.
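
One concrete way to inspect what a network has learned along these lines is to run the Mapper construction from TDA on a trained network's first-layer convolutional filters, treating each filter as a point in a high-dimensional space. The sketch below is a minimal illustration under stated assumptions, not the talk's exact pipeline: it assumes PyTorch plus the open-source kepler-mapper and scikit-learn packages, and it uses a freshly initialized conv layer as a stand-in where a trained model's layer would go.

    import numpy as np
    import torch.nn as nn
    import kmapper as km
    from sklearn.cluster import DBSCAN
    from sklearn.decomposition import PCA

    # Stand-in for a trained network's first conv layer; in practice, load a
    # trained model and take its first-layer weights instead.
    conv1 = nn.Conv2d(1, 64, kernel_size=5)
    weights = conv1.weight.detach().numpy()    # shape (n_filters, in_ch, k, k)

    # Flatten each filter to a vector, then mean-center and L2-normalize so
    # that only the "shape" of the filter matters, not its brightness or scale.
    X = weights.reshape(weights.shape[0], -1)
    X = X - X.mean(axis=1, keepdims=True)
    X = X / np.linalg.norm(X, axis=1, keepdims=True)

    # Build the Mapper graph: a 2-D PCA lens, an overlapping cover of the lens
    # space, and DBSCAN clustering within each cover element.
    mapper = km.KeplerMapper(verbose=0)
    lens = mapper.fit_transform(X, projection=PCA(n_components=2))
    graph = mapper.map(lens, X,
                       cover=km.Cover(n_cubes=10, perc_overlap=0.5),
                       clusterer=DBSCAN(eps=0.5, min_samples=3))
    mapper.visualize(graph, path_html="filter_topology.html",
                     title="Topology of learned first-layer filters")

The resulting graph gives a compact summary of the space of learned filters; for networks trained on natural images, such summaries tend to concentrate around simple shapes such as a circle of oriented edge detectors.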

Gunnar concludes with a discussion of how TDA can be used to persist derived or learned features. Where applied, this approach delivers a roughly 2x speedup on the MNIST dataset. On the more complicated SVHN dataset, the gain is even larger: roughly 3.5x until validation accuracy reaches the 0.8 threshold, dropping to 2.5–3x thereafter. An examination of the curves of validation accuracy versus the number of batch iterations, plotted out to 30,000 iterations, suggests that at the higher accuracy ranges, the standard method may never attain the results of the TDA-boosted method.
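
As a rough illustration of the kind of topologically informed initialization behind such comparisons (the talk's exact construction is not reproduced here), one could seed a network's first convolutional layer with filters sampled from the circle of oriented edge detectors that TDA finds in natural image patches, then train as usual and log validation accuracy against a randomly initialized baseline. A minimal PyTorch sketch, with the filter count and size chosen arbitrarily:

    import numpy as np
    import torch
    import torch.nn as nn

    def circle_filters(n_filters=32, size=5):
        # Oriented linear-gradient patches parameterized by an angle t on the
        # circle: f(x, y) = cos(t)*x + sin(t)*y, mean-centered and normalized.
        ys, xs = np.mgrid[-1.0:1.0:size * 1j, -1.0:1.0:size * 1j]
        filters = []
        for t in np.linspace(0.0, 2.0 * np.pi, n_filters, endpoint=False):
            f = np.cos(t) * xs + np.sin(t) * ys
            f -= f.mean()
            f /= np.linalg.norm(f)
            filters.append(f)
        # Shape (n_filters, 1, size, size) for single-channel input such as
        # MNIST; for SVHN, replicate the filters across three input channels.
        return torch.tensor(np.stack(filters), dtype=torch.float32).unsqueeze(1)

    conv1 = nn.Conv2d(1, 32, kernel_size=5, padding=2)
    with torch.no_grad():
        conv1.weight.copy_(circle_filters(32, 5))

    # Drop conv1 into an otherwise standard classifier, train it alongside a
    # randomly initialized twin, and record validation accuracy per batch
    # iteration to produce the comparison curves described above.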

The implications of this work are broad and considerable: it illuminates the black box of neural nets, improves their performance, and lays out a roadmap for further performance gains. The first finding alone (how and what neural nets learn) will have broad operational implications for working with data.

Gunnar Carlsson

Ayasdi

Gunnar Carlsson is a professor of mathematics (emeritus) at Stanford University and cofounder and president at Ayasdi, which is commercializing products based on machine intelligence and topological data analysis. Gunnar has spent his career devoted to the study of topology, the mathematical study of shape. Originally, his work focused on the pure aspects of the field, but in 2000 he began work on the applications of topology to the analysis of large and complex datasets, which led to a number of projects, notably a multi-university initiative funded by the Defense Advanced Research Projects Agency. He has taught at the University of Chicago, the University of California, San Diego, Princeton University, and, since 1991, Stanford University, where he has served as the chair of the Mathematics Department. He is also a founder of the ATMCS series of conferences focusing on the applications of topology and a founding editor of the Journal for Applied and Computational Topology. Gunnar is the author of over 100 academic papers and has given numerous addresses to scholarly meetings. He holds a BA in mathematics from Harvard and a PhD in mathematics from Stanford. He is married with three grown children.