Understanding complex process models by abstracting infrequent behavior

2020 ◽  
Vol 113 ◽  
pp. 428-440
Author(s):  
David Chapela-Campa ◽  
Manuel Mucientes ◽  
Manuel Lama
Agronomy ◽  
2021 ◽  
Vol 11 (2) ◽  
pp. 314
Author(s):  
Andrew Revill ◽  
Vasileios Myrgiotis ◽  
Anna Florence ◽  
Stephen Hoad ◽  
Robert Rees ◽  
...  

Climate, nitrogen (N) and leaf area index (LAI) are key determinants of crop yield. N additions can enhance yield but must be managed efficiently to reduce pollution. Complex process models estimate N status by simulating soil-crop N interactions, but such models require extensive inputs that are seldom available. Through model-data fusion (MDF), we combine climate and LAI time-series with an intermediate-complexity model to infer leaf N and yield. The DALEC-Crop model was calibrated for wheat leaf N and yields across field experiments covering N applications ranging from 0 to 200 kg N ha−1 in Scotland, UK. Requiring daily meteorological inputs, this model simulates crop C cycle responses to LAI, N and climate. The model, which includes a leaf N-dilution function, was calibrated across N treatments based on LAI observations, and tested at validation plots. We showed that a single parameterization varying only in leaf N could simulate LAI development and yield across all treatments—the mean normalized root-mean-square error (NRMSE) for yield was 10%. Leaf N was accurately retrieved by the model (NRMSE = 6%). Yield could also be reasonably estimated (NRMSE = 14%) if LAI data are available for assimilation during periods of typical N application (April and May). Our MDF approach generated robust leaf N content estimates and timely yield predictions that could complement existing agricultural technologies. Moreover, Earth observation (EO)-derived LAI products at high spatial and temporal resolutions provide a means to apply our approach regionally. Testing yield predictions from this approach over agricultural fields is a critical next step to determine broader utility.
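The abstract reports model skill as NRMSE. As a minimal sketch (not the authors' code), the metric can be computed as the root-mean-square error normalized by the observed mean; note that NRMSE is sometimes normalized by the observed range instead, so the mean-normalized convention here is an assumption, and the yield values are purely illustrative:

```python
import numpy as np

def nrmse(observed, simulated):
    """Root-mean-square error normalized by the mean of the observations."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    rmse = np.sqrt(np.mean((simulated - observed) ** 2))
    return rmse / np.mean(observed)

# Hypothetical wheat yields (t ha-1) across four N treatments
obs = [4.2, 6.1, 7.8, 8.0]
sim = [4.6, 5.7, 7.3, 8.4]
print(round(nrmse(obs, sim) * 100, 1))  # NRMSE as a percentage
```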


2014 ◽  
Vol 556-562 ◽  
pp. 4124-4127
Author(s):  
Zhao Juan ◽  
Xian Wen Fang ◽  
Xiang Wei Liu

Analysis of suspected change regions has become a key problem in business process management, but existing methods have obvious limitations when the input models are large and complex. This paper presents an analysis method for change regions based on a merged Petri net model. First, several complex models are merged into a single model with identifiers. Then, an algorithm for extracting a digest from the merged model is given, and the changes are optimized based on module behavior profiles. Finally, the suspected changes are analyzed through the traceability of the merged model. Theoretical analysis and a specific example show that the method is effective.
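The merge-with-identifiers step can be illustrated with a deliberately simplified representation (an assumption, not the paper's algorithm): each model is reduced to a set of flow arcs between activity labels, the merge unions the arcs, and every arc is tagged with the identifiers of the models it came from so that differences can be traced back to their source:

```python
def merge_models(models):
    """models: dict mapping model id -> set of (source, target) arcs.
    Returns a merged model: each arc mapped to the set of model ids containing it."""
    merged = {}
    for model_id, arcs in models.items():
        for arc in arcs:
            merged.setdefault(arc, set()).add(model_id)
    return merged

# Two hypothetical variants of an approval process
m1 = {("receive", "check"), ("check", "approve")}
m2 = {("receive", "check"), ("check", "reject")}
merged = merge_models({"M1": m1, "M2": m2})

# Arcs present in only one model are candidate change regions
suspects = {arc for arc, ids in merged.items() if len(ids) == 1}
print(sorted(suspects))
```

A real implementation would operate on full Petri nets (places, transitions, tokens) rather than bare arcs, but the traceability idea is the same: keep provenance identifiers on every merged element.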


2018 ◽  
Vol 13 (1) ◽  
pp. 3222-3234
Author(s):  
Melanie Drägestein ◽  
Jürgen Jung ◽  
Andreas Gadatsch ◽  
Ilja Kogan

There is often no common understanding of operational processes in logistics companies because they are not properly documented. Hence, people execute the same process differently, and training is conducted by experienced operators on an ad-hoc basis. Furthermore, continuous process improvement is hampered, as neither the ideal process nor current issues in as-is processes are visible. A major reason for the missing documentation is the complexity of existing business process modelling languages. Modelling experts are required both for initially describing the processes and for updating the models after process changes. Furthermore, operations people are usually not used to reading complex process models in EPC or BPMN diagrams. In order to overcome these limitations, a domain-specific modelling language which facilitates maintaining up-to-date process models has been designed with a large logistics company in Germany. The paper at hand briefly describes this language and illustrates the method for applying it in operations environments.


2014 ◽  
Vol 23 (03) ◽  
pp. 1430001 ◽  
Author(s):  
Aitor Murguzur ◽  
Karmele Intxausti ◽  
Aitor Urbieta ◽  
Salvador Trujillo ◽  
Goiuria Sagardui

In dynamic environments, changes are often unpredictable and complex. Process models cannot be fully specified up-front, and process flexibility becomes a key issue. Enterprise applications and systems supporting such processes are increasingly being architected in a service-oriented style. In this light, our goal is to analyze service orchestration approaches from a process flexibility perspective. Through a systematic literature review, we evaluate 17 service orchestration approaches and analyze their support for: (i) variability (support for large collections of process variants), (ii) adaptation (instance changes during runtime), (iii) evolution (schema changes during runtime), and (iv) looseness (loosely specified models). The review findings provide a clearer understanding of process flexibility requirements and the service orchestration mechanisms that support them, helping us to understand the limitations and shed light on future research areas.


Process models are the analytical illustration of an organization’s activity. They are essential for mapping out the current business process of an organization, establishing a baseline for process enhancement, and constructing future processes that incorporate those enhancements. To achieve this, in the field of process mining, algorithms have been proposed that build process models from the information recorded in event logs. However, for complex process configurations, these algorithms cannot correctly build complex process structures, namely invisible tasks, non-free-choice constructs, and short loops. The ability of each discovery algorithm to discover these process constructs differs. In this work, we propose a framework responsible for detecting, from event logs, the complex constructs present in the data. By identifying the existing constructs, one can choose the process discovery techniques suitable for the event data in question. The proposed framework has been implemented in ProM as a plugin. The evaluation results demonstrate that the constructs can be correctly identified.
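One of the constructs mentioned, the short loop, can be detected directly from traces. As a minimal sketch (not the paper's framework), a length-two loop between activities a and b shows up as the pattern "a b a" in a trace; scanning triples of consecutive events surfaces candidate loop pairs:

```python
def short_loops(traces):
    """Detect candidate length-two loops: distinct activities a, b such that
    the pattern a, b, a occurs in some trace."""
    loops = set()
    for trace in traces:
        for i in range(len(trace) - 2):
            a, b, c = trace[i], trace[i + 1], trace[i + 2]
            if a == c and a != b:
                loops.add((a, b))
    return loops

# Hypothetical event log: each trace is a list of activity labels
log = [["a", "b", "c", "b", "c", "d"], ["a", "c", "d"]]
print(sorted(short_loops(log)))  # both directions of the b/c loop appear
```

Invisible tasks and non-free-choice constructs need more than local patterns (they involve dependencies not directly visible in the log), which is why a dedicated detection framework is useful.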


10.2196/15374 ◽  
2020 ◽  
Vol 8 (1) ◽  
pp. e15374 ◽  
Author(s):  
Michael Winter ◽  
Rüdiger Pryss ◽  
Thomas Probst ◽  
Manfred Reichert

Background The management and comprehension of business process models are of utmost importance for almost any enterprise. To foster the comprehension of such models, this paper incorporates the idea of a serious game called Tales of Knightly Process. Objective This study aimed to investigate whether the serious game has a positive, immediate, and follow-up impact on process model comprehension. Methods Two studies, with 81 and 64 participants respectively, were conducted. Within the two studies, participants were assigned to a game group and a control group (ie, study 1), and a follow-up game group and a follow-up control group (ie, study 2). Four weeks separated study 1 and study 2. In both studies, participants had to answer ten comprehension questions on five different process models. Note that, in study 1, participants in the game group played the serious game before they answered the comprehension questions, to evaluate the impact of the game on process model comprehension. Results In study 1, inferential statistics (analysis of variance) revealed that participants in the game group showed better immediate performance compared to control group participants (P<.001). A Hedges g of 0.77 also indicated a medium to large effect size. In study 2, follow-up game group participants showed better performance compared to participants from the follow-up control group (P=.01); here, a Hedges g of 0.82 implied a large effect size. Finally, in both studies, analyses indicated that complex process models are more difficult to comprehend (study 1: P<.001; study 2: P<.001). Conclusions Participants who played the serious game showed better performance in the comprehension of process models when comparing both studies.
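The effect sizes reported above are Hedges' g, which is Cohen's d (the standardized mean difference using a pooled standard deviation) multiplied by a small-sample bias correction. A minimal sketch of the standard formula follows; the numeric inputs are illustrative only, not the study's data:

```python
import math

def hedges_g(mean1, mean2, sd1, sd2, n1, n2):
    """Hedges' g: Cohen's d with the small-sample correction 1 - 3/(4N - 9)."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / pooled_sd
    correction = 1 - 3 / (4 * (n1 + n2) - 9)  # N = n1 + n2
    return d * correction

# Illustrative group scores (not the paper's values)
print(round(hedges_g(7.5, 6.0, 1.8, 2.0, 40, 41), 2))
```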

