Engineering a seven enzyme biotransformation using mathematical modelling and characterized enzyme parts

2019 ◽  
Author(s):  
William Finnigan ◽  
Rhys Cutlan ◽  
Radka Snajdrova ◽  
Joseph P. Adams ◽  
Jennifer A. Littlechild ◽  
...  

Abstract Multi-step enzyme reactions offer considerable cost and productivity benefits. Process models offer a route to understanding the complexity of these reactions and allow for their optimization. Despite the increasing prevalence of multi-step biotransformations, there are few examples of process models for enzyme reactions. From a toolbox of characterized enzyme parts, we demonstrate the construction of a process model for a seven-enzyme, three-step biotransformation using isolated enzymes. Enzymes for cofactor regeneration were employed to make this in vitro reaction economical. Good modelling practice was critical in evaluating the impact of approximations and experimental error. We show that the use and validation of process models was instrumental in identifying and removing process bottlenecks, in revealing divergent behavior, and in optimizing the entire reaction using a genetic algorithm. We validated the optimized reaction to demonstrate that complex multi-step reactions with cofactor recycling, involving at least seven enzymes, can be reliably modelled and optimized.

Significance statement: This study examines the challenge of modelling and optimizing multi-enzyme cascades. We detail the development, testing, and optimization of a deterministic model of a three-enzyme cascade with four cofactor regeneration enzymes. Significantly, the model could easily be used to predict the optimal concentration of each enzyme for maximum flux through the cascade. This prediction was strongly validated experimentally. The success of our model demonstrates that robust models of systems of at least seven enzymes are readily achievable. We highlight the importance of following good modelling practice to evaluate model quality and limitations. Examining deviations from expected behavior provided additional insight into the model and the enzymes. This work provides a template for developing larger deterministic models of enzyme cascades.
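For readers unfamiliar with this kind of model, the sketch below shows the general shape of such an approach: a deterministic cascade built from Michaelis-Menten rate laws, integrated with scipy, and enzyme loadings tuned by a small genetic algorithm. The rate laws, parameter values, species, and objective are illustrative placeholders, not the paper's fitted model.

```python
# Minimal sketch: a deterministic two-step cascade with cofactor recycling,
# with enzyme loadings optimized by a simple genetic algorithm.
# All kinetic parameters and concentrations are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

def michaelis_menten(kcat, E, S, Km):
    """Irreversible Michaelis-Menten rate: v = kcat * E * S / (Km + S)."""
    return kcat * E * S / (Km + S)

def cascade_rhs(t, y, E1, E2, E_rec):
    """A -> B -> C; step 2 consumes NADH, which E_rec regenerates from NAD."""
    A, B, C, NADH, NAD = y
    v1 = michaelis_menten(50.0, E1, A, 0.5)                        # A -> B
    v2 = michaelis_menten(30.0, E2, B, 1.0) * NADH / (0.1 + NADH)  # B -> C
    v3 = michaelis_menten(80.0, E_rec, NAD, 0.2)                   # NAD -> NADH
    return [-v1, v1 - v2, v2, v3 - v2, v2 - v3]

def final_product(enzymes, t_end=3600.0):
    """Integrate the cascade and return the product concentration at t_end."""
    y0 = [10.0, 0.0, 0.0, 0.5, 0.0]  # mM initial conditions
    sol = solve_ivp(cascade_rhs, (0.0, t_end), y0,
                    args=tuple(enzymes), method="LSODA")
    return sol.y[2, -1]

def genetic_algorithm(n_gen=25, pop_size=20, total_enzyme=0.01):
    """Maximize product while keeping the summed enzyme loading fixed."""
    rng = np.random.default_rng(0)
    pop = rng.dirichlet(np.ones(3), size=pop_size) * total_enzyme
    for _ in range(n_gen):
        fitness = np.array([final_product(ind) for ind in pop])
        parents = pop[np.argsort(fitness)[-pop_size // 2:]]  # truncation selection
        children = parents[rng.integers(len(parents), size=pop_size - len(parents))]
        children = children * rng.lognormal(0.0, 0.1, children.shape)  # mutate
        children = children / children.sum(axis=1, keepdims=True) * total_enzyme
        pop = np.vstack([parents, children])
    return pop[np.argmax([final_product(ind) for ind in pop])]

print("optimal enzyme split (mM):", genetic_algorithm())
```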

Computing ◽  
2021 ◽  
Author(s):  
Mohammadreza Fani Sani ◽  
Sebastiaan J. van Zelst ◽  
Wil M. P. van der Aalst

Abstract Process discovery algorithms discover process models from event data captured during the execution of business processes. These algorithms tend to use the whole event log; when dealing with large event data, this is no longer feasible on standard hardware in a limited time. A straightforward way to overcome this problem is to down-size the data using random sampling. However, little research has been conducted on selecting the right sample given the available time and the characteristics of the event data. This paper systematically evaluates various biased sampling methods and measures their performance on different datasets using four discovery techniques. Our experiments show that biased sampling can considerably speed up discovery techniques without losing process model quality. Furthermore, because the sampling implicitly filters out outliers, model quality may even improve.
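As a concrete illustration of biased sampling, the sketch below thins an event log by sampling trace variants with probability proportional to their frequency, so rare (possibly noisy) variants are filtered implicitly. The log format and this particular bias scheme are assumptions for illustration; the paper evaluates several biased strategies.

```python
# Minimal sketch of frequency-biased trace sampling for process discovery.
# An event log is represented as a list of traces (lists of activity names).
import random
from collections import Counter

def biased_sample(log, sample_ratio=0.1, seed=42):
    """Sample traces with probability proportional to variant frequency,
    so common behaviour is kept and rare traces are thinned out."""
    rng = random.Random(seed)
    variants = Counter(tuple(trace) for trace in log)
    population = list(variants.keys())
    weights = [variants[v] for v in population]  # bias: variant frequency
    k = max(1, int(sample_ratio * len(log)))
    sampled = rng.choices(population, weights=weights, k=k)
    return [list(v) for v in sampled]

event_log = ([["register", "check", "decide", "notify"]] * 90
             + [["register", "decide", "check", "notify"]] * 9
             + [["register", "notify"]] * 1)        # rare outlier variant
sub_log = biased_sample(event_log, sample_ratio=0.1)
print(Counter(tuple(t) for t in sub_log))           # outliers rarely survive
```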


Author(s):  
Jan Claes ◽  
Irene Vanderfeesten ◽  
Hajo A. Reijers ◽  
Jakob Pinggera ◽  
Matthias Weidlich ◽  
...  

2016 ◽  
Vol 6 (1) ◽  
Author(s):  
Markus Martini ◽  
Jakob Pinggera ◽  
Manuel Neurauter ◽  
Pierre Sachse ◽  
Marco R. Furtner ◽  
...  

Abstract A process model (PM) is a graphical depiction of a business process, for instance, the entire process from ordering a book online until the parcel is delivered to the customer. Knowledge about the factors relevant to creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM, in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information, and relational integration) and three phases of the process of process modelling (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers, and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and for teaching the craft of process modelling.
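The phase-ratio predictors can be made concrete with a small sketch: given a modelling session coded as a sequence of process-of-process-modelling phase labels, compute the fraction of the session spent in each phase. The coding scheme and data below are hypothetical.

```python
# Illustrative sketch: deriving the phase ratios used as predictors from a
# phase-coded modelling session. Labels and data are hypothetical.
from collections import Counter

def phase_ratios(session):
    """Fraction of the session spent in each phase of process modelling."""
    counts = Counter(session)
    total = sum(counts.values())
    return {phase: counts[phase] / total
            for phase in ("comprehension", "modelling", "reconciliation")}

# A session coded as one phase label per observed interaction burst.
session = ["comprehension"] * 12 + ["modelling"] * 25 + ["reconciliation"] * 8
print(phase_ratios(session))
# {'comprehension': 0.267, 'modelling': 0.556, 'reconciliation': 0.178} (approx.)
```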


Author(s):  
Stefan Zugal ◽  
Cornelia Haisjackl ◽  
Jakob Pinggera ◽  
Barbara Weber

Declarative approaches to process modeling are regarded as well suited for highly volatile environments, as they provide a high degree of flexibility. However, problems in understanding and maintaining declarative process models impede their usage. To compensate for these shortcomings, Test Driven Modeling (TDM) has been proposed. This paper reports on an empirical investigation in which TDM is viewed from two angles. First, the impact of TDM on communication is explored in a case study. Results indicate that domain experts are inclined to use test cases for communicating with the model builder (system analyst) and prefer them over the process model. The second part of the investigation, a controlled experiment, examines the impact of TDM on process model maintenance. Data gathered in this experiment indicate that the adoption of test cases significantly lowers cognitive load and increases the perceived quality of changes.
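To make the idea of test cases for declarative models concrete, the sketch below encodes two Declare-style constraints (response and precedence) as trace checks and asserts expected outcomes for concrete traces, in the spirit of TDM. The constraint set and traces are invented for illustration and are not from the paper's case study.

```python
# Minimal sketch of a TDM-style test case: assert that a concrete trace is
# (or is not) allowed by a declarative model given as constraint checks.

def response(a, b):
    """Every occurrence of a must eventually be followed by b."""
    def check(trace):
        pending = False
        for event in trace:
            if event == a:
                pending = True
            elif event == b:
                pending = False
        return not pending
    return check

def precedence(a, b):
    """b may only occur after a has occurred at least once."""
    def check(trace):
        seen_a = False
        for event in trace:
            if event == a:
                seen_a = True
            elif event == b and not seen_a:
                return False
        return True
    return check

model = [response("ship goods", "send invoice"),
         precedence("receive order", "ship goods")]

def satisfies(trace):
    return all(constraint(trace) for constraint in model)

# Test cases a domain expert might specify:
assert satisfies(["receive order", "ship goods", "send invoice"])
assert not satisfies(["ship goods", "send invoice"])  # shipping before any order
```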


2021 ◽  
Author(s):  
Tom Mooney ◽  
Kelda Bratley ◽  
Amin Amin ◽  
Timothy Jadot

Abstract The use of conventional process simulators is commonplace for system design and is growing in online monitoring and optimization applications. While these simulators are extremely useful, additional value can be extracted by combining simulator predictions with field inputs from measurement devices such as flowmeters and pressure and temperature sensors. The statistical nature of inputs (e.g., measurement uncertainty) is typically not considered in the forward calculations performed by the simulators, which may lead to erroneous results if the raw measurement is in error or biased. A complementary modeling methodology is proposed to identify and correct measurement and process errors as an integral part of a robust simulation practice. The studied approach ensures best-quality data for direct use in process models and simulators for operations and process surveillance. From a design perspective, this approach also makes it possible to evaluate the impact of the uncertainty of measured and unmeasured variables on CAPEX spend and to optimize instrument and meter design. In this work, an extended statistical approach to process simulation is examined using Data Validation and Reconciliation (DVR). The DVR methodology is compared to conventional non-statistical, deterministic process simulators. A key difference is that DVR uses any measured variable (inlet, outlet, or intermediate measurements), including its uncertainty, in the modelled process as an input, whereas traditional simulators use only inlet measurement values to estimate the values of all other measured and unmeasured variables. The DVR calculations and applications are walked through in several comparative case studies of a typical surface process facility: the simulation of a commingled multistage oil and gas separation process, the validation of separator flowmeters and fluid samples, and the quantification of unmeasured variables along with their uncertainties. The studies demonstrate the added value of using redundancy from all available measurements in a DVR-based process model. Single-point and data-streaming field cases highlight the complementary roles of traditional simulators and the data validation provided by the DVR methodology; it is shown how robust measurement management strategies can be developed based on DVR's effective surveillance capabilities. Moreover, the cases demonstrate how DVR-based CAPEX and OPEX improvements derive from effective hardware selection using cost-versus-measurement-precision trade-offs, from soft-measurement substitutes, and from condition-based maintenance strategies.
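The core DVR step can be illustrated with linear data reconciliation: adjust redundant measurements so they satisfy a balance constraint exactly, weighting each adjustment by its measurement uncertainty. The sketch below uses the closed-form weighted least-squares solution for a single separator balance; the flow rates and uncertainties are invented, and a full DVR tool adds nonlinear models and gross-error detection.

```python
# Minimal sketch of linear data reconciliation: reconcile redundant flow
# measurements to satisfy a mass balance, weighting by uncertainty.
import numpy as np

# Separator balance: feed = gas + oil  ->  A @ x = 0 with x = [feed, gas, oil]
A = np.array([[1.0, -1.0, -1.0]])

m = np.array([1000.0, 420.0, 540.0])  # measured rates (imbalance: 40)
sigma = np.array([20.0, 5.0, 10.0])   # 1-sigma measurement uncertainties
Sigma = np.diag(sigma**2)             # measurement covariance

# Weighted least squares with linear constraint A x = 0 (Lagrange solution):
#   x_hat = m - Sigma A^T (A Sigma A^T)^-1 A m
correction = Sigma @ A.T @ np.linalg.solve(A @ Sigma @ A.T, A @ m)
x_hat = m - correction

print("reconciled rates:", x_hat)        # satisfies feed = gas + oil exactly
print("residual imbalance:", A @ x_hat)  # ~0
# The least precise meter (feed, sigma=20) absorbs most of the adjustment.
```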


1993 ◽  
Vol 18 (1) ◽  
pp. 5-21 ◽  
Author(s):  
Norris Krueger

Shapero (1975, 1982) proposed an Intentionality-based process model of the entrepreneurial event. Entrepreneurial intentions should derive from feasibility and desirability perceptions plus a propensity to act on opportunities. Prior entrepreneurship-related experiences should influence entrepreneurial intentions indirectly through these perceptions. Path analyses found that feasibility and desirability perceptions and propensity to act each proved significant antecedents of entrepreneurial intentions. Perceived feasibility was significantly associated with the breadth of prior exposure. Perceived desirability was significantly associated with the positiveness of that prior exposure. Strong support was found for Shapero's model, arguing for further application of intentions-based process models of entrepreneurial activity.
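For readers who want the model's structure rather than the findings, the sketch below simulates the hypothesized path structure (prior exposure shapes perceptions, which, together with propensity to act, predict intentions) and recovers the paths with two stages of ordinary least squares. All data and coefficients are simulated, not Krueger's.

```python
# Illustrative two-stage OLS estimation of the intentions path model.
# Simulated data only; variable names mirror the constructs.
import numpy as np

rng = np.random.default_rng(1)
n = 200
breadth = rng.normal(size=n)           # breadth of prior exposure
positiveness = rng.normal(size=n)      # positiveness of prior exposure

# Stage 1: exposure shapes the perceptions.
feasibility = 0.4 * breadth + rng.normal(scale=0.9, size=n)
desirability = 0.5 * positiveness + rng.normal(scale=0.9, size=n)
propensity = rng.normal(size=n)        # propensity to act

# Stage 2: perceptions plus propensity to act predict intentions.
intentions = (0.5 * feasibility + 0.4 * desirability
              + 0.3 * propensity + rng.normal(scale=0.8, size=n))

def ols(y, *xs):
    """OLS coefficients (intercept first) via least squares."""
    X = np.column_stack([np.ones_like(y), *xs])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

print("feasibility <- breadth:", ols(feasibility, breadth)[1])
print("intentions  <- feasibility, desirability, propensity:",
      ols(intentions, feasibility, desirability, propensity)[1:])
```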


2011 ◽  
Vol 8 (65) ◽  
pp. 1796-1803 ◽  
Author(s):  
Mateusz M. Pluciński ◽  
Calistus N. Ngonghala ◽  
Matthew H. Bonds

The persistence of extreme poverty is increasingly attributed to dynamic interactions between biophysical processes and economics, though there remains a dearth of integrated theoretical frameworks that can inform policy. Here, we present a stochastic model of disease-driven poverty traps. Whereas deterministic models can result in poverty traps that can only be broken by substantial external changes to the initial conditions, in the stochastic model there is always some probability that a population will leave or enter a poverty trap. We show that a ‘safety net’, defined as an externally enforced minimum level of health or economic conditions, can guarantee ultimate escape from a poverty trap, even if the safety net is set within the basin of attraction of the poverty trap, and even if the safety net is only in the form of a public health measure. Whereas the deterministic model implies that small improvements in initial conditions near the poverty-trap equilibrium are futile, the stochastic model suggests that the impact of changes in the location of the safety net on the rate of development may be strongest near the poverty-trap equilibrium.
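The qualitative mechanism can be reproduced with a toy stochastic simulation: a double-well (bistable) system stands in for the coupled disease-economics dynamics, and a reflecting floor stands in for the safety net. The potential, noise level, and thresholds below are illustrative assumptions, not the paper's model.

```python
# Toy stochastic poverty trap: a double-well SDE with a reflecting floor.
# With the floor ("safety net") inside the trap's basin, escape becomes
# far more likely; parameters are illustrative only.
import numpy as np

def simulate(x0, floor, n_steps=20_000, dt=0.01, seed=0):
    """Euler-Maruyama for dx = -V'(x) dt + s dW with a reflecting floor.
    V(x) = x^2 (x-1)^2 is a double well: minima at x=0 (trap) and x=1."""
    rng = np.random.default_rng(seed)
    x = max(x0, floor)
    for step in range(n_steps):
        drift = -4.0 * x * (x - 0.5) * (x - 1.0)  # -V'(x)
        x += drift * dt + 0.15 * np.sqrt(dt) * rng.standard_normal()
        x = max(x, floor)                          # safety net: enforced minimum
        if x > 0.9:
            return step * dt                       # escaped to the high state
    return np.inf

with_net = [simulate(0.1, floor=0.2, seed=s) for s in range(10)]
no_net = [simulate(0.1, floor=0.0, seed=s) for s in range(10)]  # no enforced minimum
print("escape fraction with safety net:", np.mean([t < np.inf for t in with_net]))
print("escape fraction without net:   ", np.mean([t < np.inf for t in no_net]))
```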


10.2196/15374 ◽  
2020 ◽  
Vol 8 (1) ◽  
pp. e15374 ◽  
Author(s):  
Michael Winter ◽  
Rüdiger Pryss ◽  
Thomas Probst ◽  
Manfred Reichert

Background The management and comprehension of business process models are of utmost importance for almost any enterprise. To foster the comprehension of such models, this paper incorporates a serious game called Tales of Knightly Process. Objective This study aimed to investigate whether the serious game has a positive immediate and follow-up impact on process model comprehension. Methods Two studies, with 81 and 64 participants respectively, were conducted. Participants were assigned to a game group and a control group (study 1), and to a follow-up game group and a follow-up control group (study 2); four weeks separated the two studies. In both studies, participants answered ten comprehension questions on five different process models. In study 1, participants in the game group played the serious game before answering the comprehension questions, to evaluate the game's impact on process model comprehension. Results In study 1, inferential statistics (analysis of variance) revealed that game group participants showed better immediate performance than control group participants (P<.001); a Hedges g of 0.77 indicated a medium to large effect size. In study 2, follow-up game group participants performed better than follow-up control group participants (P=.01); here, a Hedges g of 0.82 implied a large effect size. Finally, in both studies, analyses indicated that complex process models are more difficult to comprehend (study 1: P<.001; study 2: P<.001). Conclusions Across both studies, participants who played the serious game showed better comprehension of process models.
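For reference, the reported effect-size measure can be computed from group summary statistics: Hedges g is Cohen's d with a small-sample bias correction. The group means, standard deviations, and sizes below are made up for illustration; only the formula reflects the studies' analysis.

```python
# Worked example of the Hedges g effect size. Group summaries are invented.
import math

def hedges_g(m1, s1, n1, m2, s2, n2):
    """Standardized mean difference with Hedges' small-sample correction."""
    s_pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled
    correction = 1 - 3 / (4 * (n1 + n2) - 9)  # Hedges' J (approximation)
    return d * correction

# Hypothetical comprehension scores: game group vs control group
print(round(hedges_g(m1=7.8, s1=1.5, n1=41, m2=6.6, s2=1.6, n2=40), 2))
```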


