TensorFlow Probability (TFP)
TensorFlow Probability (TFP) is a TensorFlow/Python library offering a modern take on both emerging and traditional probabilistic and statistical tools. Statisticians and data scientists will find R-like capabilities that naturally leverage modern hardware, while ML researchers and practitioners will find powerful building blocks for specifying and learning deep probabilistic models. Josh Dillon introduces core TFP abstractions and demonstrates some of the library's modeling power and convenience.
Deep learning for fundamental sciences using high-performance computing
The fundamental sciences (including particle physics and cosmology) generate exabytes of data from complex instruments and analyze it to uncover the secrets of the universe. Deep learning is enabling the direct exploitation of higher-dimensional instrument data than was previously possible, improving the sensitivity of searches for new discoveries. Wahid Bhimji describes recent activity in this field, particularly work undertaken at NERSC, the mission supercomputing center for US fundamental science, based at Berkeley National Laboratory. This work builds on TensorFlow to explore novel methods and applications, scale to high-performance computing resources, and provide productive deep learning environments for fundamental scientists.
This session is sponsored by Google.
Joshua V. Dillon is a software engineer for research and machine intelligence at Google. His research interests include approximate inference techniques for probabilistic models, uncertainty in machine learning, and designing probabilistic programming tools and languages. He holds a PhD in machine learning from the Georgia Institute of Technology. In his free time, Josh enjoys spending time with his family, cycling, and woodworking.
Wahid Bhimji is a big data architect at the NERSC supercomputing center, based at Berkeley National Laboratory. His interests in AI include generative models and deep learning applied to fundamental sciences such as high-energy physics. He is also involved in optimizing other aspects of scientific big data workflows running on high-performance computing resources. Previously, Wahid was heavily involved in data management and analysis for the Large Hadron Collider at CERN and for the UK government. He holds a PhD in high-energy particle physics.
©2018, O'Reilly Media, Inc.