Computing alignments of event data and process models

Sebastiaan J. van Zelst, Alfredo Bolt, Boudewijn F. van Dongen

Research output: Chapter in Book/Report/Conference proceeding › Chapter › Academic › peer-review

3 Citations (Scopus)
1 Download (Pure)

Abstract

The aim of conformance checking is to assess whether a process model and event data, recorded in an event log, conform to each other. In recent years, alignments have proven extremely useful for calculating conformance statistics. Computing optimal alignments is equivalent to solving a shortest path problem on the state space of the synchronous product net of a process model and event data. State-of-the-art alignment-based conformance checking implementations exploit the A*-algorithm, a heuristic search method for shortest path problems, and include a wide range of parameters that likely influence their performance. In previous work, we presented a preliminary and exploratory analysis of the effect of these parameters. This paper extends that work by means of large-scale, statistically sound experiments that describe the effects and trends of these parameters for different populations of process models. Our results show that, indeed, there exist parameter configurations that have a significant positive impact on alignment computation efficiency.
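To make the shortest-path formulation concrete, the sketch below runs A* over a deliberately simplified state space: a trace aligned against a purely sequential model, where synchronous moves cost 0 and log-only or model-only moves cost 1. This is a minimal illustrative toy under those assumptions, not the implementation evaluated in the chapter, which searches the state space of the synchronous product Petri net and uses more informed, marking-based heuristics; all names below are hypothetical.

    import heapq

    def align(trace, model):
        """A* search for a cheapest alignment of a trace against a purely
        sequential model (a list of activity labels). Didactic simplification
        of the synchronous-product formulation: states are index pairs
        (position in trace, position in model)."""
        goal = (len(trace), len(model))

        def h(i, j):
            # Admissible (and consistent) heuristic: at least
            # |remaining trace - remaining model| non-synchronous moves remain.
            return abs((len(trace) - i) - (len(model) - j))

        # Priority queue of (f = g + h, g, state, alignment so far).
        open_set = [(h(0, 0), 0, (0, 0), [])]
        closed = set()
        while open_set:
            f, g, (i, j), moves = heapq.heappop(open_set)
            if (i, j) == goal:
                return g, moves
            if (i, j) in closed:
                continue
            closed.add((i, j))
            # Synchronous move: labels agree, cost 0.
            if i < len(trace) and j < len(model) and trace[i] == model[j]:
                heapq.heappush(open_set, (g + h(i + 1, j + 1), g, (i + 1, j + 1),
                                          moves + [('sync', trace[i])]))
            # Log move: advance only in the trace, cost 1.
            if i < len(trace):
                heapq.heappush(open_set, (g + 1 + h(i + 1, j), g + 1, (i + 1, j),
                                          moves + [('log', trace[i])]))
            # Model move: advance only in the model, cost 1.
            if j < len(model):
                heapq.heappush(open_set, (g + 1 + h(i, j + 1), g + 1, (i, j + 1),
                                          moves + [('model', model[j])]))
        return None

    cost, alignment = align(list("acd"), list("abcd"))
    print(cost, alignment)  # cost 1: sync a, model move b, sync c, sync d

The parameters studied in the paper (e.g., heuristic choice and search configuration) correspond, in this simplified picture, to how h is computed and how the queue is managed; stronger heuristics prune more of the state space at the price of more work per state.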

Original language: English
Title of host publication: Transactions on Petri Nets and Other Models of Concurrency XIII
Editors: Maciej Koutny, Lars Michael Kristensen, Wojciech Penczek
Place of Publication: Berlin
Publisher: Springer
Pages: 1-26
Number of pages: 26
ISBN (Print): 978-3-662-58380-7, 978-3-662-58381-4
DOIs
Publication status: Published - 1 Jan 2018

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11090 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Keywords

  • Alignments
  • Conformance checking
  • Process mining
