Process mining is a research area focusing on the design of algorithms that can automatically provide insights into business processes by analysing historic process execution data, known as event logs. Among the most popular algorithms are those for automated process discovery, whose ultimate goal is to generate the best process model that summarizes the behaviour recorded in the input event log. Over the past decade, several process discovery algorithms have been proposed but, until now, this research has been driven by the implicit assumption that a better algorithm will discover better process models, no matter the characteristics of the input event log. In this paper, we take a step back and question that assumption. Specifically, we investigate the relations between measures capturing characteristics of the input event log and the quality of the discovered process models. To this end, we review the state-of-the-art process complexity measures, propose a new process complexity measure based on graph entropy, and analyse this set of complexity measures on an extensive collection of event logs and corresponding automatically discovered process models. Our analysis shows that many process complexity measures correlate with the quality of the discovered process models, demonstrating the potential of using complexity measures as predictors of the quality of process models discovered with state-of-the-art process discovery algorithms. This finding is important for process mining research, as it highlights that not only algorithms, but also the connections between input data and output quality, should be studied.
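The abstract does not specify the paper's graph-entropy measure, but the general idea of entropy-based log complexity can be illustrated with a minimal, hypothetical sketch: derive the directly-follows relation from an event log (each trace is a sequence of activity labels) and compute the Shannon entropy of the resulting transition frequencies. The function name and the measure itself are assumptions for illustration, not the authors' definition.

```python
import math
from collections import Counter

def dfg_entropy(traces):
    """Shannon entropy over the directly-follows transitions of an event log.

    NOTE: a hypothetical complexity measure for illustration only; the
    paper's actual graph-entropy measure is not reproduced here.
    """
    # Count how often activity `a` is directly followed by activity `b`.
    transitions = Counter()
    for trace in traces:
        for a, b in zip(trace, trace[1:]):
            transitions[(a, b)] += 1
    total = sum(transitions.values())
    if total == 0:
        return 0.0
    # Shannon entropy of the empirical transition distribution (in bits).
    return -sum((c / total) * math.log2(c / total)
                for c in transitions.values())

# A perfectly sequential log yields low entropy...
simple = [["a", "b", "c"]] * 10
# ...while varied behaviour spreads probability mass over more edges.
varied = [["a", "b", "c"], ["a", "c", "b"], ["b", "a", "c"]]
print(dfg_entropy(simple) < dfg_entropy(varied))  # → True
```

Under this sketch, a log whose traces all follow one sequence scores low, while a log with many distinct directly-follows edges scores high, matching the intuition that more varied input behaviour is harder for discovery algorithms to summarize well.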
|Publication status||Published - 2021|
Austrian Fields of Science Classification (ÖFOS)
- 502050 Business Informatics
- 1 Completed
Erasmus+ BPM Org Theory
Mendling, J., Bala, S. & Wurm, B.
1/09/19 → 31/08/21