The adoption of machine learning and statistical models to solve real-world problems has increased exponentially, but users still struggle to realize the full potential of their predictive models. Choosing an algorithm often involves a trade-off between explainability and model performance. When operationalizing models, linear models and simple decision trees are frequently preferred over more complex models such as ensembles or deep learning for ease of interpretation, which often results in a loss of accuracy. But is it necessary to accept a trade-off between model complexity and interpretability?
Being able to faithfully interpret a model globally, using partial dependence plots (PDPs) and relative feature importance, and locally, using local interpretable model-agnostic explanations (LIME), helps in understanding how features contribute to predictions and how a model varies in a nonstationary environment. This enables trust in the algorithm, which drives better collaboration and communication among peers. The need to understand the variability in the predictive power of a model in a human-interpretable way is even more important for complex models (e.g., those for text, images, and machine translation).
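To make the model-agnostic idea concrete, the sketch below computes permutation feature importance by hand: shuffle one feature at a time and measure the drop in accuracy. This is a generic illustration using scikit-learn, not Skater's actual API; all names here are illustrative assumptions.

```python
# Minimal sketch of model-agnostic interpretation via permutation
# feature importance: shuffling a feature breaks its link to the
# target, and the resulting accuracy drop estimates its importance.
# (Generic scikit-learn code; not Skater's interface.)
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

rng = np.random.RandomState(0)
X, y = make_classification(n_samples=500, n_features=4,
                           n_informative=2, random_state=0)
model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
baseline = model.score(X, y)

importances = []
for j in range(X.shape[1]):
    X_perm = X.copy()
    X_perm[:, j] = rng.permutation(X_perm[:, j])  # shuffle one column
    importances.append(baseline - model.score(X_perm, y))  # accuracy drop

print(importances)
```

Because the measurement only needs predictions, the same procedure works for any black-box model, which is the core appeal of model-agnostic interpretation.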
Pramit Choudhary offers an overview of Datascience.com’s model interpretation library Skater, explains how to use it to evaluate models using the Jupyter environment, and shares how it could help analysts, data scientists, and statisticians better understand their model behavior—without compromising on the choice of algorithm.
This session is sponsored by DataScience.com.
Pramit Choudhary is a lead data scientist at DataScience.com, where he focuses on optimizing and applying classical machine learning and Bayesian design strategy to solve real-world problems. Currently, he is leading initiatives to find better ways to explain a model's learned decision policies, reduce the chaos in building effective models, and close the gap between a prototype and an operationalized model.
©2017, O'Reilly Media, Inc.