July 20–24, 2015
Portland, OR

Applications of hierarchical temporal memory algorithm tools to energy demand analysis

Patrick Barton (O'Reilly School of Technology)
4:10pm–4:50pm Thursday, 07/23/2015
Solve E147/148

Prerequisite Knowledge

General computer programming, algorithm development, and data analysis skills would be helpful. A background in AI and electric demand forecasting would be a nice-to-have.

Description

Numenta’s decision to open source their hierarchical temporal memory (HTM) application, NuPIC, has placed a powerful analytic and predictive tool in the hands of researchers and analysts worldwide. This talk explores the application of NuPIC in the domain of electricity demand forecasting, blending cortical learning algorithm (CLA) tools and techniques with more traditional approaches taken by electric system analysts, to leverage some of the advantages of both.

HTMs attempt to mimic the structure and behavior of the human neocortex, by dynamically identifying patterns and meta-patterns in data streams, evaluating the fit between these patterns and observed reality, and adjusting the basis for predicting future events. In their pure form, they are agnostic about the data source, the information the data purport to represent, and the domain space to which they are applied. As demonstrated by Matt Taylor and Scott Purdy at OSCON 2013, NuPIC can do a credible job of electricity demand forecasting “right out of the box,” at a level of accuracy likely sufficient for many purposes.

Electricity is a very strange beast in that it is the only commodity that must be produced and consumed simultaneously. Further complicating things, retail utilities have an obligation to serve whatever demand may materialize, generally without any advance notice – in other words, they have to dance to the beat of the consumers’ drum, arranging for whatever reserves might be required to meet whatever load consumers put on the system. Forecasting errors can lead to substantially increased costs, especially if utilities are forced into spot purchases at market price when supplies are constrained. Consequently, anything that can be done to increase forecast accuracy is potentially extremely valuable for both utilities and their customers.

This talk addresses improving HTMs’ ability to predict electricity demand by providing the algorithm with some of the complementary data streams currently applied to demand analysis, and by including some goodness-of-fit metrics that address known characteristics of electric load. On the data side, we experiment with meteorological and astronomical data and analytic derivatives thereof. For instance, the load that comes from cooling demand is known to be some function not only of the current temperature, humidity, and solar radiation, but also of residual heat buildup. The effect of residual heat buildup, in turn, depends on myriad factors such as building utilization, thermal efficiency of building shells, etc., all of which can be analyzed offline and provided as an algorithmically produced supplementary data stream. The goodness-of-fit metrics can be anything the analyst wants them to be.
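To make these two ideas concrete, here is a minimal sketch (not taken from the talk itself) of what an algorithmically produced supplementary stream and a load-aware goodness-of-fit metric might look like: an exponentially decaying residual-heat index derived from a temperature series, and a percentage error that weights peak-load hours more heavily. The function names, the decay constant, the 65 °F base, and the 3× peak weighting are all illustrative assumptions, not calibrated values.

```python
def residual_heat_index(temps_f, decay=0.9, base_f=65.0):
    """Exponentially weighted accumulation of degrees above a base
    temperature -- a crude stand-in for residual heat buildup.
    `decay` and `base_f` are illustrative, uncalibrated values."""
    index = 0.0
    series = []
    for t in temps_f:
        # Each hour retains a fraction of prior buildup, plus any new
        # heating above the base temperature.
        index = decay * index + max(t - base_f, 0.0)
        series.append(index)
    return series


def peak_weighted_mape(actual, predicted, weight=3.0):
    """Mean absolute percentage error that penalizes errors during
    top-decile load hours `weight` times more heavily -- one example
    of a metric tailored to known characteristics of electric load."""
    threshold = sorted(actual)[int(0.9 * len(actual))]
    num = den = 0.0
    for a, p in zip(actual, predicted):
        w = weight if a >= threshold else 1.0
        num += w * abs(a - p) / a
        den += w
    return num / den
```

The derived series could then be fed to the forecaster as an additional input field alongside the raw load data, while the weighted metric replaces a plain MAPE when comparing candidate models, so that accuracy during costly peak hours dominates the evaluation.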

Though focused on electricity demand, the successes and failures presented here will be of potential value to anyone applying cortical learning algorithms to virtually any predictive demand analysis.


Patrick Barton

O'Reilly School of Technology

Patrick Barton is an energy scientist and policy analyst with over 20 years’ experience developing and applying mathematical models to electrical systems, strategic demand-side management, and psychometric analysis. He is currently lead instructor at the O’Reilly School of Technology and a full-course instructor for the Python certificate series.