Presented By
O’Reilly + Cloudera
Make Data Work
March 25-28, 2019
San Francisco, CA

Deploying data science for national economic statistics

Jeff Chen (US Bureau of Economic Analysis)
11:50am–12:30pm Thursday, March 28, 2019
Average rating: 4.50 (2 ratings)

Who is this presentation for?

  • Data scientists, social scientists, and data strategists

Level

Intermediate

Prerequisite knowledge

  • Familiarity with time series

What you'll learn

  • Understand which algorithms work well when the economy is in a steady state versus periods of greater volatility
  • Learn why model averaging is often your best bet and why you should beware of spurious correlations (see the sketch after this list)
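
The spurious-correlation warning is easy to demonstrate. As a minimal sketch (my own illustration, not material from the talk), two completely independent random walks routinely show large sample correlations in short samples of the kind macroeconomists work with:

```python
# Hypothetical sketch: independent random walks often look strongly
# correlated in short samples, which is why naive feature screening
# on trending time series is dangerous.
import numpy as np

rng = np.random.default_rng(0)
T, trials = 60, 1000  # ~15 years of quarterly observations per trial

corrs = []
for _ in range(trials):
    x = np.cumsum(rng.normal(size=T))  # random walk 1
    y = np.cumsum(rng.normal(size=T))  # random walk 2, independent of x
    corrs.append(np.corrcoef(x, y)[0, 1])

corrs = np.abs(corrs)
print(f"median |corr|: {np.median(corrs):.2f}")
print(f"share with |corr| > 0.5: {(corrs > 0.5).mean():.0%}")
```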

Description

Time series analysis sits at the heart of macroeconomics. It provides the trends and context of where the economy is moving, but the nature of the game is changing. Some of the data that feeds indicators only becomes available after the preliminary release of macroeconomic numbers, necessitating short-term prediction to plug the gaps. In recent memory, modelers have turned to more timely alternative data and other sources that sit outside of macroeconomic theory to help boost the signal of the economy. This, in turn, raises both technical and theoretical challenges. Bringing in modern data sources increases dimensionality while the time series itself stays short, so the usable sample shrinks relative to the number of features. Furthermore, input features may not map directly onto established macroeconomic frameworks.
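
To make the dimensionality problem concrete, here is a hedged sketch (my own illustration, not BEA's method): with far more candidate features than quarterly observations, ordinary least squares is underdetermined, while a regularized model such as the lasso can still produce a sparse fit. The data here is simulated, and scikit-learn's `LassoCV` and `TimeSeriesSplit` are one reasonable toolchain among several:

```python
# Hypothetical sketch: many alternative-data features, few quarterly
# observations (p >> n). A cross-validated lasso selects a sparse
# subset of features where OLS would be underdetermined.
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.model_selection import TimeSeriesSplit

rng = np.random.default_rng(42)
n_quarters, n_features = 40, 200          # 10 years of quarters, 200 indicators
X = rng.normal(size=(n_quarters, n_features))
beta = np.zeros(n_features)
beta[:5] = [0.8, -0.5, 0.4, 0.3, -0.2]    # only a handful of features matter
y = X @ beta + rng.normal(scale=0.5, size=n_quarters)  # stand-in "GDP growth"

# TimeSeriesSplit keeps each validation fold after its training window,
# respecting temporal order instead of shuffling observations.
model = LassoCV(cv=TimeSeriesSplit(n_splits=5)).fit(X, y)
print(f"nonzero coefficients: {(model.coef_ != 0).sum()} of {n_features}")
```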

Jeff Chen shares strategies for overcoming these time series challenges at the intersection of macroeconomics and data science, drawing on machine learning research conducted at the Bureau of Economic Analysis aimed at improving its flagship product, the gross domestic product (GDP). Jeff presents this work in the context of a comprehensive horse race of algorithms, data sources, and feature selection methods that has yielded best practices for the field. He offers a number of practical tips for how social scientists can incorporate time series prediction into their work, covering the core issues that underlie successes and failures in this space: how the basic assumptions of algorithms affect their ability to detect regime change, and the need to distinguish accuracy gains attributable to the data from those attributable to the models.
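
As a rough sketch of what such a horse race and model average can look like (an assumption-laden illustration, not the bureau's actual pipeline), the snippet below evaluates two learners out-of-sample with an expanding window and compares each against a simple equal-weight average of their forecasts, a common and robust baseline:

```python
# Hypothetical sketch of a small "horse race": evaluate several learners
# out-of-sample with an expanding window (rolling origin), then compare
# each against an equal-weight model average of their forecasts.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import TimeSeriesSplit
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)
T = 80
X = rng.normal(size=(T, 10))
y = 0.7 * X[:, 0] + rng.normal(scale=0.5, size=T)  # stand-in target series

models = {"ridge": Ridge(alpha=1.0),
          "forest": RandomForestRegressor(n_estimators=200, random_state=1)}
errors = {name: [] for name in [*models, "average"]}

for train_idx, test_idx in TimeSeriesSplit(n_splits=5).split(X):
    fold_preds = []
    for name, m in models.items():
        m.fit(X[train_idx], y[train_idx])
        p = m.predict(X[test_idx])
        fold_preds.append(p)
        errors[name].append(mean_squared_error(y[test_idx], p))
    # equal-weight model average across the learners
    avg = np.mean(fold_preds, axis=0)
    errors["average"].append(mean_squared_error(y[test_idx], avg))

for name, errs in errors.items():
    print(f"{name:8s} mean MSE: {np.mean(errs):.3f}")
```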


Jeff Chen

US Bureau of Economic Analysis

Jeff Chen is the chief innovation officer at the US Bureau of Economic Analysis, where he’s responsible for integrating advances in data science and machine learning to expand the bureau’s capabilities. A statistician and data scientist, Jeff has extensive experience launching and leading data science initiatives in over 40 domains, working with diverse stakeholders such as firefighters, climatologists, and technologists, among others, to introduce data science and new technologies that advance their missions. Previously, he was the chief scientist at the US Department of Commerce; a White House Presidential Innovation Fellow with NASA and the White House Office of Science and Technology Policy, focused on data science for the environment; the first director of analytics at the NYC Fire Department, where he engineered pioneering algorithms for fire prediction; and one of the first data scientists at the NYC Mayor’s Office under then-Mayor Mike Bloomberg. Jeff started his career as an econometrician at an international engineering consultancy, where he developed forecasting and prediction models supporting large-scale infrastructure investment projects. In the evenings, he’s an adjunct professor of data science at Georgetown University. He holds a bachelor’s degree in economics from Tufts University and a master’s degree in applied statistics from Columbia University.