Supporting the sensemaking process in visual analytics

Y.B. Shrinivasan

Research output: Thesis › PhD Thesis 1 (Research TU/e / Graduation TU/e)



Visual analytics is the science of analytical reasoning facilitated by interactive visual interfaces. It involves interactive exploration of data using visualizations and automated data analysis to gain insight and, ultimately, to make better decisions. It aims to support the sensemaking process, in which information is collected, organized, and analyzed to form new knowledge and inform further action. Interactive visual exploration of data can lead to many discoveries in terms of relations, patterns, outliers, and so on. It is difficult for human working memory to keep track of all findings during a visual analysis. Moreover, synthesizing many different findings and the relations between them increases information overload and thereby hinders the sensemaking process further.

The central theme of this dissertation is: how can users be supported in their sensemaking process during interactive exploration of data? To support sensemaking in visual analytics, we focus mainly on helping users capture, reuse, review, share, and present the key aspects of interest concerning the analysis process and the findings during interactive data exploration. For this, we have developed generic models and tools that enable users to capture findings with provenance and construct arguments, and to review, revise, and share their visual analysis.

First, we present a sensemaking framework for visual analytics that contains three linked views: a data view, a navigation view, and a knowledge view. The data view offers interactive data visualization tools. The navigation view automatically captures the interaction history using a semantically rich action model and provides an overview of the analysis structure. The knowledge view is a basic graphics editor that helps users record findings with provenance and organize findings into claims using diagramming techniques.
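The navigation view's idea of capturing interactions as a semantically labelled history tree can be sketched as follows. This is a minimal illustration, not the dissertation's implementation: the class and field names (`Action`, `NavigationView`, `capture`, `revisit`) are hypothetical, and the action kinds are invented examples.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class Action:
    """One semantically labelled interaction step (names are illustrative)."""
    kind: str                            # e.g. "filter", "zoom", "change-encoding"
    params: dict                         # parameters of the interaction
    parent: Optional["Action"] = None    # previous step; branches form a tree
    timestamp: datetime = field(default_factory=datetime.now)

class NavigationView:
    """Records actions as a tree, so the analysis structure can be reviewed."""

    def __init__(self) -> None:
        self.root = Action("start", {})
        self.current = self.root
        self._children: dict[int, list[Action]] = {id(self.root): []}

    def capture(self, kind: str, params: dict) -> Action:
        """Append a new action as a child of the current step."""
        action = Action(kind, params, parent=self.current)
        self._children.setdefault(id(self.current), []).append(action)
        self._children[id(action)] = []
        self.current = action
        return action

    def revisit(self, action: Action) -> None:
        """Jump back to an earlier step; the next capture branches the tree."""
        self.current = action

    def branches(self, action: Action) -> list[Action]:
        """Children of a step, i.e. the alternative continuations tried."""
        return self._children.get(id(action), [])
```

Storing the history as a tree rather than a flat log is what lets an overview show where the analyst backtracked and explored alternatives.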
Users can exploit the automatically captured interaction history and manually recorded findings to review and revise their visual analysis. The analysis process can thus be archived and shared with others for collaborative visual analysis.

Second, we enable analysts to capture data selections as semantic zones during an analysis and to reuse these zones on different subsets of data. We present a Select & Slice table that helps analysts capture, manipulate, and reuse these zones more explicitly during exploratory data analysis. Users can reuse zones, combine zones, and compare and trace items of interest across different semantic zones and data slices.

Finally, exploration overviews and search techniques based on keywords, content similarity, and context help analysts develop awareness of the key aspects of the exploration concerning the analysis process and findings. On the one hand, analysts can proactively search analysis processes and findings for reviewing purposes. On the other hand, the system can discover implicit connections between findings and the current line of inquiry, and recommend these related findings during an interactive data exploration.

We implemented the models and tools described in this dissertation in Aruvi and HARVEST, and used them to study the implications of these models on a user's sensemaking process. We adopted a short-term and long-term case study approach to study the support these tools offer for the sensemaking process. The observations of the case studies were used to evaluate the models.
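The Select & Slice idea of capturing a selection as a reusable semantic zone, rather than as a fixed set of items, can be sketched as follows. This is an assumed minimal model, not the dissertation's implementation: the `Zone` class, its predicate representation, and the sample records are all hypothetical.

```python
from typing import Callable

class Zone:
    """A named data selection stored as a predicate (illustrative sketch)."""

    def __init__(self, name: str, predicate: Callable[[dict], bool]) -> None:
        self.name = name
        self.predicate = predicate

    def apply(self, data_slice: list[dict]) -> set:
        """Reusing the zone on another slice re-evaluates the predicate."""
        return {item["id"] for item in data_slice if self.predicate(item)}

    def __and__(self, other: "Zone") -> "Zone":
        """Combine two zones into a new zone (intersection of selections)."""
        return Zone(f"{self.name} & {other.name}",
                    lambda item: self.predicate(item) and other.predicate(item))

# Two data slices (e.g. different years) with invented sample records.
slice_2009 = [{"id": 1, "rev": 150, "region": "west"},
              {"id": 2, "rev": 50,  "region": "west"}]
slice_2010 = [{"id": 3, "rev": 200, "region": "east"}]

high_revenue = Zone("high revenue", lambda item: item["rev"] > 100)
west_region = Zone("west region", lambda item: item["region"] == "west")

# The same zone traces items of interest across different slices.
matches_2009 = high_revenue.apply(slice_2009)
matches_2010 = high_revenue.apply(slice_2010)
combined = (high_revenue & west_region).apply(slice_2009)
```

Because a zone is a predicate rather than a stored item list, applying it to a new data slice automatically yields the corresponding selection there, which is what makes comparison and tracing across slices cheap.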
Original language: English
Qualification: Doctor of Philosophy
Awarding institution:
  • Mathematics and Computer Science
Supervisors/Advisors:
  • van Wijk, Jack J., Promotor
Award date: 21 Jun 2010
Place of publication: Eindhoven
Print ISBNs: 978-90-386-2229-3
Publication status: Published - 2010

