KNP89: Kinetics of nonhomogeneous processes (KNP) and nonlinear dynamics

1990 ◽  
Vol 68 (9) ◽  
pp. 655-659 ◽  
Author(s):  
Gordon R. Freeman

Most systems in nature are nonhomogeneous: at least one component is not distributed homogeneously. A nonhomogeneous process is one that occurs in a nonhomogeneous system. The new subject Kinetics of Nonhomogeneous Processes (KNP) deals with processes in all aspects of nature. Examples range from quantum mechanics to membranes and to the evolution of the large-scale structure of the universe. Models dealing with molecules and galaxies have common features: each entity has translational and rotational motions, and it has internal motions that are governed by the internal masses, fields and energies. To make progress towards understanding the behaviour of a complex system, simplifications are needed. The formulation of a model requires identification of essential features of the behaviour, finding correlations between the features, and then representing them by equations. There is interplay between the structure of a system and the kinetics of processes that occur in it. To identify essential elements of any process we visualize it. Visualization is a powerful tool, especially the visualization of behaviour represented by equations. Models of KNP are of two general kinds: deterministic and stochastic. Deterministic models involve an assumption that events are the inevitable result of preceding conditions. The mathematical description of KNP involves nonlinear equations. Part of the physics community has therefore come to speak of nonlinear dynamics, which is a subdivision of KNP. An example of a deterministic model is the time-dependent Landau–Ginzburg equation and modifications of it, which apply to pattern formation, self-localization, instabilities, and chaos. A stochastic model involves a step-by-step process and considers the probability of events as a function of time.
In a system that contains many zones and each zone contains a relatively small number of entities, for example two ion–electron pairs or a group of seven galaxies, the probable reaction or deflection of any given entity is strongly affected by the actual number of entities initially in the zone, by their relative motions and separation distances, and by the forces that act between and within them. Scientific literacy in the future will require an understanding of both deterministic and stochastic models.
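The contrast between the two kinds of model can be made concrete with a toy simulation. The sketch below (not from the article; the per-step reaction probability k and zone contents are invented for illustration) tracks zones that each start with just two reactive pairs. The deterministic rate law predicts only the ensemble mean, while the stochastic model shows that any individual zone ends with 0, 1, or 2 survivors, so its fate depends strongly on the actual number of entities it contains.

```python
import random

def simulate_zone(n_pairs, k=0.2, steps=5, rng=None):
    """Stochastic model: each surviving pair reacts with probability k per step."""
    rng = rng or random.Random()
    n = n_pairs
    for _ in range(steps):
        n -= sum(1 for _ in range(n) if rng.random() < k)
    return n

def deterministic_survivors(n_pairs, k=0.2, steps=5):
    """Deterministic model: the same rate applied to a continuous population."""
    return n_pairs * (1.0 - k) ** steps

rng = random.Random(42)
zones = [simulate_zone(2, rng=rng) for _ in range(10_000)]  # two pairs per zone
stochastic_mean = sum(zones) / len(zones)  # close to the deterministic mean
```

The ensemble mean of the stochastic runs converges to the deterministic value, but the deterministic number (about 0.66 survivors per zone) describes no single zone: each zone holds a whole number of pairs, which is the point of treating small-occupancy zones probabilistically.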

2018 ◽  
Author(s):  
Sean Sheppard ◽  
Duygu Dikicioglu

Abstract Killer yeasts are microorganisms which can produce and secrete proteinaceous toxins, a characteristic gained via infection by a virus. These toxins are able to kill sensitive cells of the same or a related species. From a biotechnological perspective, killer yeasts have been considered beneficial due to their antifungal/antimicrobial activity, but also regarded as problematic for large-scale fermentation processes, whereby those yeasts would kill off species in starter cultures and lead to stuck fermentations. Here, we propose a mechanistic model of the toxin-binding kinetics pertaining to the killer population coupled with the toxin-induced death kinetics of the sensitive population to study toxic action in silico. Our deterministic model explains how killer Saccharomyces cerevisiae cells distress and consequently kill the sensitive members of the species, accounting for the K1, K2 and K28 toxin modes of action at high or low concentrations. The dynamic model captured the transient toxic activity starting from the introduction of killer cells into the culture at the time of inoculation through to induced cell death, and allowed us to gain novel insight into these mechanisms. The kinetics of K1/K2 activity via its primary pathway of toxicity was 5.5 times faster than its activity at low concentration inducing the apoptotic pathway in sensitive cells. Conversely, we showed that the primary pathway for K28 was approximately 3 times slower than its equivalent apoptotic pathway, indicating the particular relevance of K28 in biotechnological applications, where the toxin concentration is rarely above the limits that trigger the primary pathway of killer activity.
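The general shape of such a coupled model can be sketched as a pair of rate equations: the killer population secretes toxin, and the toxin drives first-order death of the sensitive population. The code below is a minimal illustrative sketch only; the population sizes, the rate constants k_tox and k_kill, and the units are invented for illustration and are not the fitted parameters of the published model.

```python
def simulate(killers=1e6, sensitives=1e7, k_tox=1e-7, k_kill=0.5,
             dt=0.01, t_end=10.0):
    """Euler integration of dT/dt = k_tox * K and dS/dt = -k_kill * T * S."""
    toxin, s = 0.0, sensitives
    for _ in range(int(t_end / dt)):
        d_toxin = k_tox * killers      # constant toxin secretion by killers
        d_s = -k_kill * toxin * s      # toxin-induced death of sensitives
        toxin += d_toxin * dt
        s += d_s * dt
    return toxin, s

toxin_final, sensitives_final = simulate()
```

Distinguishing the primary pathway from the low-concentration apoptotic pathway, as the article does, would amount to switching between two values of k_kill depending on the toxin concentration.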


Author(s):  
Eric Y. Hu ◽  
Jean-Marie C. Bouteiller ◽  
Dong Song ◽  
Michel Baudry ◽  
Theodore W. Berger

2015 ◽  
Vol 72 (1) ◽  
pp. 55-74 ◽  
Author(s):  
Qiang Deng ◽  
Boualem Khouider ◽  
Andrew J. Majda

Abstract The representation of the Madden–Julian oscillation (MJO) is still a challenge for numerical weather prediction and general circulation models (GCMs) because of the inadequate treatment of convection and the associated interactions across scales by the underlying cumulus parameterizations. One new promising direction is the use of the stochastic multicloud model (SMCM) that has been designed specifically to capture the missing variability due to unresolved processes of convection and their impact on the large-scale flow. The SMCM specifically models the area fractions of the three cloud types (congestus, deep, and stratiform) that characterize organized convective systems on all scales. The SMCM captures the stochastic behavior of these three cloud types via a judiciously constructed Markov birth–death process using a particle interacting lattice model. The SMCM has been successfully applied for convectively coupled waves in a simplified primitive equation model and validated against radar data of tropical precipitation. In this work, the authors use the SMCM in a GCM for the first time. The authors build on previous work of coupling the High-Order Methods Modeling Environment (HOMME) NCAR GCM to a simple multicloud model. The authors tested the new SMCM-HOMME model in the parameter regime considered previously and found that the stochastic model drastically improves the results of the deterministic model. Clear MJO-like structures with many realistic features from nature are reproduced by SMCM-HOMME in the physically relevant parameter regime, including wave trains of MJOs that organize intermittently in time. Also, one caveat of the deterministic simulation, the requirement of a doubled moisture background, is no longer present.
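The core mechanism, a Markov birth–death process on a lattice of convective sites, can be illustrated with a stripped-down toy. In the sketch below the transition rates are invented constants; in the actual SMCM they depend on large-scale predictors (e.g. convective instability and mid-level moisture), and the lattice sites interact, which is omitted here.

```python
import random

STATES = ("clear", "congestus", "deep", "stratiform")

# Illustrative birth/death rates per unit time (made-up values).
RATES = {
    ("clear", "congestus"): 0.3,
    ("congestus", "deep"): 0.4,
    ("congestus", "clear"): 0.1,
    ("deep", "stratiform"): 0.5,
    ("deep", "clear"): 0.1,
    ("stratiform", "clear"): 0.2,
}

def step(lattice, dt, rng):
    """One time step: each site switches state with probability rate * dt."""
    for i, state in enumerate(lattice):
        for (frm, to), rate in RATES.items():
            if state == frm and rng.random() < rate * dt:
                lattice[i] = to
                break

def area_fractions(lattice):
    """Fraction of the lattice occupied by each cloud type."""
    n = len(lattice)
    return {s: lattice.count(s) / n for s in STATES}

rng = random.Random(0)
lattice = ["clear"] * 1000
for _ in range(500):
    step(lattice, 0.1, rng)
fracs = area_fractions(lattice)
```

The resulting area fractions fluctuate around a statistical equilibrium rather than relaxing to fixed values, which is precisely the variability the deterministic multicloud closure lacks.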


2021 ◽  
Vol 20 (5) ◽  
pp. 1-34
Author(s):  
Edward A. Lee

This article is about deterministic models: what they are, why they are useful, and what their limitations are. First, the article emphasizes that determinism is a property of models, not of physical systems. Whether a model is deterministic or not depends on how one defines the inputs and behavior of the model. To define behavior, one has to define an observer. The article compares and contrasts two classes of ways to define an observer, one based on the notion of “state” and another that more flexibly defines the observables. The notion of “state” is shown to be problematic and to lead to nondeterminism that is avoided when the observables are defined differently. The article examines determinism in models of the physical world. In what may surprise many readers, it shows that Newtonian physics admits nondeterminism and that quantum physics may be interpreted as a deterministic model. Moreover, it shows that both relativity and quantum physics undermine the notion of “state” and therefore require more flexible ways of defining observables. Finally, the article reviews results showing that sufficiently rich sets of deterministic models are incomplete. Specifically, nondeterminism is inescapable in any system of models rich enough to encompass Newton’s laws.


Author(s):  
Alicia L. Jurek ◽  
Matthew C. Matusiak ◽  
Randa Embry Matusiak

Purpose
The current research explores the structural elaboration of municipal American police organizations; specifically, the structural complexity of police organizations and its relationship to time. The purpose of this paper is to describe and test essential elements of the structural elaboration hypothesis.

Design/methodology/approach
The authors explore the structural elaboration hypothesis utilizing a sample of 219 large police departments across the USA. Data are drawn from multiple waves of the Law Enforcement Management and Administrative Statistics survey and are analyzed using tobit and OLS regression techniques.

Findings
While there is some evidence that police departments are becoming more elaborate, little evidence for the structural elaboration hypothesis as a function of time is found.

Originality/value
This project is the first to specifically explore the structural elaboration hypothesis across multiple time points. Additionally, results highlight structural trends across a panel of large American police organizations and provide potential explanations for changes. Suggestions for large-scale policing data collection are also provided.


1990 ◽  
Vol 329 (1255) ◽  
pp. 369-373 ◽  

We tried to develop deterministic models for the kinetics of 2,4-D breakdown in the soil based on the following considerations: (i) at low concentrations, degradation results from maintenance consumption by a large fraction of the soil microbial population; (ii) at high concentrations, in addition to the maintenance consumption, there is growth-associated carbon incorporation by a small specific microbial population. Values for the biokinetic parameters are consistent with those commonly found in the literature. Comparison between observed and simulated curves suggests that a non-negligible part of the pesticidal carbon exists as microbial by-products.
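The two-regime idea can be sketched numerically: below a concentration threshold the substrate decays first-order (maintenance consumption by the whole community); above it, a small specific population also grows on the substrate via Monod kinetics. All parameter values below (threshold, k_maint, mu_max, Ks, Y) are illustrative placeholders, not the fitted biokinetic values from the study.

```python
def degrade(c0, x0=0.01, k_maint=0.05, mu_max=0.3, Ks=5.0, Y=0.4,
            threshold=1.0, dt=0.01, t_end=50.0):
    """Euler integration of a two-regime 2,4-D degradation model (toy units).

    c = substrate concentration, x = specific degrader biomass.
    """
    c, x = c0, x0
    t = 0.0
    while t < t_end and c > 0:
        if c <= threshold:
            dc, dx = -k_maint * c, 0.0       # maintenance consumption only
        else:
            mu = mu_max * c / (Ks + c)       # Monod growth of specific degraders
            dc = -k_maint * c - (mu / Y) * x # maintenance + growth-linked uptake
            dx = mu * x
        c = max(c + dc * dt, 0.0)
        x += dx * dt
        t += dt
    return c, x

c_low, x_low = degrade(0.5)    # low concentration: first-order decay, no growth
c_high, x_high = degrade(50.0) # high concentration: growth-associated incorporation
```

At low initial concentration the biomass never grows, while at high concentration the specific population increases before the substrate is exhausted, reproducing the qualitative distinction between the two regimes.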


2003 ◽  
Vol 474 ◽  
pp. 299-318 ◽  
Author(s):  
JACQUES VANNESTE

The weakly nonlinear dynamics of quasi-geostrophic flows over a one-dimensional, periodic or random, small-scale topography is investigated using an asymptotic approach. Averaged (or homogenized) evolution equations which account for the flow–topography interaction are derived for both homogeneous and continuously stratified quasi-geostrophic fluids. The scaling assumptions are detailed in each case; for stratified fluids, they imply that the direct influence of the topography is confined within a thin bottom boundary layer, so that it is through a new bottom boundary condition that the topography affects the large-scale flow. For both homogeneous and stratified fluids, a single scalar function entirely encapsulates the properties of the topography that are relevant to the large-scale flow: it is the correlation function of the topographic height in the homogeneous case, and a linear transform thereof in the continuously stratified case.

Some properties of the averaged equations are discussed. Explicit nonlinear solutions in the form of one-dimensional travelling waves can be found. In the homogeneous case, previously studied by Volosov, they obey a second-order differential equation; in the stratified case on which we focus they obey a nonlinear pseudodifferential equation, which reduces to the Peierls–Nabarro equation for sinusoidal topography. The known solutions to this equation provide examples of nonlinear periodic and solitary waves in continuously stratified fluid over topography.

The influence of bottom topography on large-scale baroclinic instability is also examined using the averaged equations: they allow a straightforward extension of Eady's model which demonstrates the stabilizing effect of topography on baroclinic instability.


Author(s):  
Albert Weale

In the twilight of utilitarianism, contract theorists sought to respond to the problems that utilitarianism had thrown up. How successful were they? Our review of contract theory has shown that it is not possible to base a contract theory on a utility theory of rationality, even though some have claimed that such a theory states the essential elements of rational behaviour. The axioms of utility theory are controversial in themselves, and do not give an account of prudence. To have an account of prudence, we need to turn to the deliberative account of rationality, and the idea of intelligibility. The practical syllogism will only take us so far, however, and will not deal with cases where interests conflict. There is no need to make a sharp distinction between contract theories in which there is a plurality of agents without a veil of ignorance, and those with a single agent behind a veil of ignorance. The singular veil-of-ignorance construction can be regarded as a more abstract thought experiment in situations of moral perplexity. Similarly, the distinction between mutual advantage theories, which involve essential reference to a baseline of non-cooperation, and baseline-independent theories is not clear, since much depends on the character of the baseline. The problem of obligation remains unresolved, but its lack of resolution underlines a conclusion of Hart to the effect that coercion is an essential element of a large-scale society.

