StrategyAtlas: Strategy Analysis for Machine Learning Interpretability

Dennis Collaris (Corresponding author), Jarke J. van Wijk

Research output: Contribution to journal › Article › Academic › peer-review

9 Citations (Scopus)
92 Downloads (Pure)

Abstract

Businesses in high-risk environments have been reluctant to adopt modern machine learning approaches due to their complex and uninterpretable nature. Most current solutions provide local, instance-level explanations, but this is insufficient for understanding the model as a whole. In this work, we show that strategy clusters (i.e., groups of data instances that are treated distinctly by the model) can be used to understand the global behavior of a complex ML model. To support effective exploration and understanding of these clusters, we introduce StrategyAtlas, a system designed to analyze and explain model strategies. Furthermore, it supports multiple ways to utilize these strategies for simplifying and improving the reference model. In collaboration with a large insurance company, we present a use case in automatic insurance acceptance, and show how professional data scientists were able to understand a complex model and improve the production model based on these insights.
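
To make the idea of strategy clusters concrete, the sketch below groups instances by their feature-contribution vectors rather than by their raw feature values; instances that land in the same cluster are treated by the model in a similar way. This is only an illustrative stand-in, not the paper's implementation: the approach in the article is model-agnostic (e.g., SHAP-style contribution values), whereas here a logistic regression is assumed so that per-instance contributions reduce to coefficient times feature value, and the dataset, cluster count, and variable names are hypothetical.

```python
# Minimal sketch of the "strategy cluster" idea: cluster instances by their
# feature-contribution vectors instead of their raw features.
# Assumptions: a linear model stands in for the paper's model-agnostic
# contribution method; dataset and cluster count are purely illustrative.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

# Toy stand-in for an insurance acceptance dataset.
X, y = make_classification(n_samples=1000, n_features=8, n_informative=5,
                           random_state=0)

model = LogisticRegression(max_iter=1000).fit(X, y)

# Per-instance feature contributions: for a linear model, the contribution
# of feature j to instance i is simply coef_j * x_ij.
contributions = X * model.coef_[0]          # shape: (n_samples, n_features)

# Cluster instances in contribution space; each cluster is a candidate
# "strategy": a group of instances the model handles in a similar way.
n_strategies = 4                            # illustrative choice
strategies = KMeans(n_clusters=n_strategies, n_init=10,
                    random_state=0).fit_predict(contributions)

# Summarize each strategy by its mean contribution profile, which reveals
# which features drive the model's decisions for that group.
for k in range(n_strategies):
    profile = contributions[strategies == k].mean(axis=0)
    top = np.argsort(np.abs(profile))[::-1][:3]
    print(f"strategy {k}: dominant features {top.tolist()}")

# A 2D projection of contribution space can then be inspected visually to
# judge how well-separated the candidate strategies are.
proj = PCA(n_components=2).fit_transform(contributions)
```

Each cluster's mean contribution profile then serves as a compact description of one model strategy; this kind of global summary is what StrategyAtlas lets analysts explore and validate visually.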
Original language: English
Pages (from-to): 2996-3008
Number of pages: 13
Journal: IEEE Transactions on Visualization and Computer Graphics
Volume: 29
Issue number: 6
DOIs
Publication status: Published - 1 Jun 2023

Keywords

  • Visualization
  • Visual analytics
  • Machine learning
  • Explainable AI
  • Analytical models
  • Computational modeling
  • Data visualization
  • Insurance
  • Predictive models
  • Data models
