The adoption of machine learning to solve real-world problems has increased exponentially, but users still struggle to derive the full potential of their predictive models. It is no longer sufficient to judge a model by its error metrics on a validation set alone. Yet there is still a perceived trade-off between explainability and performance when choosing an algorithm: linear models and simple decision trees are often preferred over more complex models, such as ensembles or deep learning models, for ease of interpretation, even though this often comes at a cost in accuracy. But is it actually necessary to accept a trade-off between model complexity and interpretability?
Pramit Choudhary explores the usefulness of a generative approach that applies Bayesian inference to produce human-interpretable decision sets in the form of "if...and else" statements. These human-interpretable decision lists with high posterior probabilities may be the right way to balance model interpretability, performance, and computation. This is an extension of DataScience.com's ongoing effort to enable trust in predictive algorithms and to drive better collaboration and communication among peers. Pramit also outlines DataScience.com's open source model interpretation framework, Skater, and explains how it helps practitioners better understand model behavior without compromising on their choice of algorithm.
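The "if...and else" decision lists described above can be sketched in plain Python. This is a toy illustration of the rule-list form only, not Skater's actual API or the Bayesian inference that selects the rules; all rule conditions, thresholds, and records below are invented for the example:

```python
# Toy sketch of a human-interpretable decision list: an ordered set of
# "if ... and ... else" rules, the output form produced by Bayesian
# rule-list methods. All rules and data here are invented for illustration.

def make_rule(condition, prediction, description):
    """Bundle a predicate with its prediction and a readable description."""
    return {"condition": condition, "prediction": prediction,
            "description": description}

# Ordered rules: the first matching rule fires; the last is the default.
RULES = [
    make_rule(lambda r: r["age"] < 25 and r["income"] < 30_000,
              "high_risk", "if age < 25 and income < 30k"),
    make_rule(lambda r: r["income"] >= 80_000,
              "low_risk", "elif income >= 80k"),
    make_rule(lambda r: True,
              "medium_risk", "else (default rule)"),
]

def predict(record):
    """Walk the list top to bottom; return the first matching rule's output."""
    for rule in RULES:
        if rule["condition"](record):
            return rule["prediction"], rule["description"]

label, why = predict({"age": 22, "income": 20_000})
print(label, "<-", why)  # high_risk <- if age < 25 and income < 30k
```

Because the model is just an ordered list of readable conditions, every prediction comes with the exact rule that produced it, which is what makes this representation attractive for communicating model behavior to non-experts.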
Pramit Choudhary is a lead data scientist/ML scientist at h2o.ai, where he focuses on optimizing and applying classical machine learning and Bayesian design strategy to solve large-scale real-world problems. Currently, he is leading initiatives to find better ways of translating a predictive model's learned decision policies into meaningful insights, for both supervised and unsupervised problems.
©2018, O'Reilly Media, Inc.