Neural networks have been highly successful across many kinds of data, including images, text, and time series. One issue that restricts their applicability, however, is that we do not understand in any detail how they work. A related problem is that they often overfit to particular datasets, which opens the door to adversarial behavior. For these reasons, methods for understanding the internal states of neural networks are desirable.
Gunnar Carlsson explains how to use topological data analysis (TDA) to describe the functioning and learning of a neural network in a compact and understandable way. The approach yields material speedups in performance (both training time and accuracy) and enables data-type-specific customization of neural network architectures, further boosting performance and widening the method's applicability. Gunnar demonstrates how to handle much more general data types by adapting the architecture to them, showing you how to construct a custom neural network architecture for a single dataset.
Gunnar concludes with a discussion of how TDA can be used to persist derived or learned features. Where applied, this approach delivers a roughly 2x speedup on the MNIST dataset. On the more complicated SVHN dataset, the speedup is even more substantial: roughly 3.5x until validation accuracy reaches the 0.8 threshold, and 2.5–3x thereafter. An examination of the graphs of validation accuracy versus number of batch iterations, plotted out to 30,000 iterations, suggests that at the higher accuracy ranges the standard method may never attain the results of the TDA-boosted method.
The implications of this work are broad and considerable: it illuminates the black box of neural nets while simultaneously improving their performance and creates a roadmap for further performance enhancements. The first finding alone (how and what neural networks learn) will have broad operational implications for working with data.
Gunnar Carlsson is a professor of mathematics (emeritus) at Stanford University and cofounder and president at Ayasdi, which is commercializing products based on machine intelligence and topological data analysis. Gunnar has spent his career devoted to the study of topology, the mathematical study of shape. Originally, his work focused on the pure aspects of the field, but in 2000 he began work on the applications of topology to the analysis of large and complex datasets, which led to a number of projects, notably a multi-university initiative funded by the Defense Advanced Research Projects Agency. He has taught at the University of Chicago, the University of California, San Diego, Princeton University, and, since 1991, Stanford University, where he has served as the chair of the Mathematics Department. He is also a founder of the ATMCS series of conferences focusing on the applications of topology and a founding editor of the Journal for Applied and Computational Topology. Gunnar is the author of over 100 academic papers and has given numerous addresses to scholarly meetings. He holds a BA in mathematics from Harvard and a PhD in mathematics from Stanford. He is married with three grown children.
©2019, O'Reilly Media, Inc.