A framework for quantifying deviations from dynamic equilibrium theory

2019 ◽  
Author(s):  
Michael Kalyuzhny ◽  
Curtis H. Flather ◽  
Nadav M. Shnerb ◽  
Ronen Kadmon

Abstract: Community assembly is governed by colonization and extinction processes, and the simplest model describing it is Dynamic Equilibrium (DE) theory, which assumes that communities are shaped solely by stochastic colonization and extinction events. Despite its potential to serve as a null model for community dynamics, there is currently no accepted methodology for measuring deviations from the theory and testing it. Here we propose a novel and easily applicable methodology for quantifying deviations from the predictions and assumptions of DE by comparing observed community time series to a randomization-based null model. We show that this methodology has good statistical properties on simulated data and that it can detect deviations from both the assumptions and predictions of DE in the classical Florida Keys experiment. We discuss alternative methods and present guidelines for practical use of the methodology, hoping it will enhance the applicability of DE as a reference for studying changes in ecological communities.
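The randomization-based comparison described above can be illustrated with a minimal sketch. This is an illustrative assumption, not the authors' actual algorithm: the test statistic (mean year-to-year richness change), the shuffling scheme (each species' presence record permuted independently in time) and all names are hypothetical choices made here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def richness_change_stat(presence):
    """Mean absolute year-to-year change in species richness.
    `presence` is a (years x species) boolean matrix."""
    richness = presence.sum(axis=1)
    return np.abs(np.diff(richness)).mean()

def null_model_test(presence, n_perm=999):
    """Compare observed richness fluctuations to a null community in
    which each species' presence record is independently shuffled in
    time, breaking any shared temporal structure among species."""
    observed = richness_change_stat(presence)
    null = np.empty(n_perm)
    for i in range(n_perm):
        shuffled = np.column_stack(
            [rng.permutation(col) for col in presence.T])
        null[i] = richness_change_stat(shuffled)
    # One-sided p-value: are the observed changes excessively large?
    p = (1 + (null >= observed).sum()) / (n_perm + 1)
    return observed, p
```

A community whose species appear and disappear in synchrony would yield large observed richness changes relative to the shuffled null, and hence a small p-value.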

2019 ◽  
Author(s):  
Michael Kalyuzhny ◽  
Curtis H. Flather ◽  
Nadav M. Shnerb ◽  
Ronen Kadmon

Abstract: Ecological communities are assembled by colonization and extinction events that may be regulated by ecological niches [1–5]. The most parsimonious explanation of local community assembly is the Dynamic Equilibrium (DE) model, which assumes that community dynamics are shaped by random colonization and extinction events, effectively ignoring the effects of niches [1, 6]. Despite its empirical success in explaining diversity patterns [1, 5, 7], it is unknown to what extent the assembly dynamics of communities around the globe are consistent with this model. Using a newly developed methodology, we show that in 4989 communities from 49 different datasets, representing multiple taxa, biomes and locations, changes in richness and composition are larger than expected under DE. All the fundamental assumptions of DE are violated, but the large changes in species richness and composition stem primarily from synchrony in the dynamics of different species. These results indicate that temporal changes in communities are driven predominantly by shared responses of co-occurring species to environmental changes, rather than by inter-specific competition. This finding is in sharp contrast to the long-standing recognition of competition as a primary driver of community assembly [8–12]. While ecological niches are often thought to stabilize species diversity and composition [4, 13, 14], we found that they promote large changes in ecological communities.


2013 ◽  
Vol 280 (1770) ◽  
pp. 20131901 ◽  
Author(s):  
Lawrence N. Hudson ◽  
Daniel C. Reuman

A major goal of ecology is to discover how dynamics and structure of multi-trophic ecological communities are related. This is difficult, because whole-community data are limited and typically comprise only a snapshot of a community instead of a time series of dynamics, and mathematical models of complex system dynamics have a large number of unmeasured parameters and therefore have been only tenuously related to real systems. These are related problems, because long time-series, if they were commonly available, would enable inference of parameters. The resulting ‘plague of parameters’ means most studies of multi-species population dynamics have been very theoretical. Dynamical models parametrized using physiological allometries may offer a partial cure for the plague of parameters, and these models are increasingly used in theoretical studies. However, physiological allometries cannot determine all parameters, and the models have also rarely been directly tested against data. We confronted a model of community dynamics with data from a lake community. Many important empirical patterns were reproducible as outcomes of dynamics, and were not reproducible when parameters did not follow physiological allometries. Results validate the usefulness, when parameters follow physiological allometries, of classic differential-equation models for understanding whole-community dynamics and the structure–dynamics relationship.


2019 ◽  
Author(s):  
Michael Kalyuzhny

Abstract:

Aim: Temporal patterns of community dynamics are drawing increasing interest due to their potential to shed light on assembly processes and anthropogenic effects. However, interpreting such patterns benefits considerably from comparing observed dynamics to the reference of a null model. To that end, the cyclic shift permutations algorithm, which generates randomized null communities based on empirically observed time series, has recently been proposed. The use of this algorithm, which shifts each species' time series randomly in time, has been justified by the claim that it preserves the temporal autocorrelation of single species. Hence it has been used to test the significance of various community patterns, in particular excessive compositional changes, biodiversity trends and community stability.

Innovation: Here we critically examine the properties of the cyclic shift algorithm for the first time. We show that, contrary to previous suggestions, this algorithm does not preserve temporal autocorrelation, because of the need to "wrap" the time series and assign the last observations to the first years. Moreover, the algorithm scrambles the initial state of the community, making any dynamics that result from deviations from equilibrium seem excessive. We show by example that these two issues lead to a highly elevated type I error rate in tests for excessive compositional changes and richness trends.

Conclusions: Caution is needed when using the cyclic shift permutation algorithm and when interpreting results obtained with it. Interpretation is further complicated because the algorithm removes all correlations between species. We suggest guidelines for using this method and discuss several possible alternative approaches. More research is needed on best practices for using null models for temporal patterns.
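The "wrapping" step criticized above is easy to see in a minimal sketch of a cyclic shift permutation. This is an illustrative reconstruction of the general technique, not the specific implementation studied in the paper; function and variable names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def cyclic_shift(community, shifts=None):
    """Cyclic shift permutation: each species' time series (a column
    of the years-by-species matrix `community`) is rotated by a random
    offset. Observations pushed past the last year wrap around to the
    first years -- the step that breaks temporal autocorrelation at
    the seam and scrambles the community's initial state."""
    n_years, n_species = community.shape
    if shifts is None:
        shifts = rng.integers(0, n_years, size=n_species)
    return np.column_stack(
        [np.roll(community[:, j], shifts[j]) for j in range(n_species)])
```

Because each column receives an independent offset, all between-species correlations are also destroyed, which is the additional complication noted in the Conclusions.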


2008 ◽  
Vol 20 (5) ◽  
pp. 1211-1238 ◽  
Author(s):  
Gaby Schneider

Oscillatory correlograms are widely used to study neuronal activity that shows a joint periodic rhythm. In most cases, the statistical analysis of cross-correlation histograms (CCH) features is based on the null model of independent processes, and the resulting conclusions about the underlying processes remain qualitative. Therefore, we propose a spike train model for synchronous oscillatory firing activity that directly links characteristics of the CCH to parameters of the underlying processes. The model focuses particularly on asymmetric central peaks, which differ in slope and width on the two sides. Asymmetric peaks can be associated with phase offsets in the (sub-) millisecond range. These spatiotemporal firing patterns can be highly consistent across units yet invisible in the underlying processes. The proposed model includes a single temporal parameter that accounts for this peak asymmetry. The model provides approaches for the analysis of oscillatory correlograms, taking into account dependencies and nonstationarities in the underlying processes. In particular, the auto- and the cross-correlogram can be investigated in a joint analysis because they depend on the same spike train parameters. Particular temporal interactions such as the degree to which different units synchronize in a common oscillatory rhythm can also be investigated. The analysis is demonstrated by application to a simulated data set.
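A cross-correlation histogram of the kind analyzed above can be computed with a short sketch. The binning convention, the sign of the lag and all names are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def cross_correlogram(spikes_a, spikes_b, max_lag, bin_width):
    """Cross-correlation histogram (CCH): counts of spike-time
    differences t_b - t_a that fall within +/- max_lag, binned with
    the given width. A peak displaced from zero lag corresponds to a
    phase offset between the two units."""
    diffs = spikes_b[None, :] - spikes_a[:, None]  # all pairwise lags
    diffs = diffs[np.abs(diffs) <= max_lag]
    edges = np.arange(-max_lag, max_lag + bin_width, bin_width)
    counts, _ = np.histogram(diffs, bins=edges)
    return counts, edges
```

Two trains firing with a constant sub-bin offset produce a single off-center peak, the simplest case of the asymmetric central peaks the model addresses.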


Paleobiology ◽  
2001 ◽  
Vol 27 (4) ◽  
pp. 602-630 ◽  
Author(s):  
Michael Foote

Apparent variation in rates of origination and extinction reflects the true temporal pattern of taxonomic rates as well as the distorting effects of incomplete and variable preservation, effects that are themselves exacerbated by true variation in taxonomic rates. Here I present an approach that can undo these distortions and thus permit estimates of true taxonomic rates, while providing estimates of preservation in the process. Standard survivorship probabilities are modified to incorporate variable taxonomic rates and rates of fossil recovery. Time series of these rates are explored by numerical optimization until the set of rates that best explains the observed data is found. If internal occurrences within stratigraphic ranges are available, or if temporal patterns of fossil recovery can otherwise be assumed, these constraints can be exploited, but they are by no means necessary. In its most general form, the approach requires no data other than first and last appearances. When tested against simulated data, the method is able to recover temporal patterns in rates of origination, extinction, and preservation. With empirical data, it yields estimates of preservation rate that agree with those obtained independently by tabulating internal occurrences within stratigraphic ranges. Moreover, when empirical occurrence data are artificially degraded, the method detects the resulting gaps in sampling and corrects taxonomic rates. Preliminary application to data on Paleozoic marine animals suggests that some features of the apparent record, such as the forward smearing of true origination events and the backward smearing of true extinction events, can be detected and corrected. Other features, such as the end-Ordovician extinction, may be fairly accurate at face value.


2012 ◽  
Vol 8 (1) ◽  
pp. 89-115 ◽  
Author(s):  
V. K. C. Venema ◽  
O. Mestre ◽  
E. Aguilar ◽  
I. Auer ◽  
J. A. Guijarro ◽  
...  

Abstract. The COST (European Cooperation in Science and Technology) Action ES0601: advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random independent break-type inhomogeneities with normally distributed breakpoint sizes were added to the simulated datasets. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added. Participants provided 25 separate homogenized contributions as part of the blind study. After the deadline at which details of the imposed inhomogeneities were revealed, 22 additional solutions were submitted. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data. Training the users on homogenization software was found to be very important. Moreover, state-of-the-art relative homogenization algorithms developed to work with an inhomogeneous reference are shown to perform best. The study showed that automatic algorithms can perform as well as manual ones.
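The first performance metric, the centered root mean square error, can be sketched in a simplified single-station form. The paper computes it at various averaging scales; this minimal version only illustrates the centering, and the function name is an assumption.

```python
import numpy as np

def centered_rmse(homogenized, truth):
    """Centered root-mean-square error: RMSE computed after removing
    each series' mean, so a constant offset between the homogenized
    and true series (irrelevant for trend studies) is not penalized."""
    h = homogenized - np.mean(homogenized)
    t = truth - np.mean(truth)
    return float(np.sqrt(np.mean((h - t) ** 2)))
```

A homogenized series that differs from the truth only by a constant shift therefore scores a centered RMSE of zero.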


Author(s):  
Giuseppe Fabio Ceschini ◽  
Nicolò Gatta ◽  
Mauro Venturini ◽  
Thomas Hubauer ◽  
Alin Murarasu

Statistical parametric methodologies are widely employed in the analysis of time series of gas turbine sensor readings. These methodologies identify outliers as a consequence of excessive deviation from a statistically-based model derived from available observations. Among parametric techniques, the k-σ methodology demonstrates its effectiveness in the analysis of stationary time series. Furthermore, the simplicity and clarity of this approach justify its direct application in industry. On the other hand, the k-σ methodology usually proves unable to adapt to dynamic time series, since it identifies observations in a transient as outliers. As this limitation is caused by the nature of the methodology itself, two improved approaches are considered in this paper in addition to the standard k-σ methodology. The two proposed methodologies maintain the same rejection rule as the standard k-σ methodology, but differ in the portions of the time series from which the statistical parameters (mean and standard deviation) are inferred. The first approach performs statistical inference by considering all observations prior to the current one, which are assumed reliable, plus a forward window containing a specified number of future observations. The second approach proposed in this paper is based on a moving window scheme. Simulated data are used to tune the parameters of the proposed improved methodologies and to prove their effectiveness in adapting to dynamic time series. The moving window approach is found to be the best on simulated data in terms of True Positive Rate (TPR), False Negative Rate (FNR) and False Positive Rate (FPR). Therefore, the performance of the moving window approach is further assessed against both different simulated scenarios and field data taken from a gas turbine.
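The moving-window variant of the k-σ rule can be sketched roughly as follows. This is an illustrative reconstruction, not the paper's exact scheme: the window length, the guard against a perfectly flat window and all names are assumptions.

```python
import numpy as np

def ksigma_moving_window(series, k=3.0, window=20):
    """Flag outliers with a k-sigma rejection rule whose mean and
    standard deviation are estimated from a moving window of the most
    recent `window` observations rather than the whole series, so the
    threshold adapts to transients in dynamic time series."""
    flags = np.zeros(len(series), dtype=bool)
    for i in range(window, len(series)):
        ref = series[i - window:i]
        mu = ref.mean()
        sigma = max(ref.std(), 1e-12)  # guard against a flat window
        flags[i] = abs(series[i] - mu) > k * sigma
    return flags
```

Because the reference statistics track only the recent past, a slow ramp shifts the local mean along with the signal, and genuine transient observations are no longer rejected wholesale, which is the limitation of the global k-σ rule the paper addresses.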


2018 ◽  
Vol 15 (147) ◽  
pp. 20180695 ◽  
Author(s):  
Simone Cenci ◽  
Serguei Saavedra

Biotic interactions are expected to play a major role in shaping the dynamics of ecological systems. Yet, quantifying the effects of biotic interactions has been challenging due to a lack of appropriate methods to extract accurate measurements of interaction parameters from experimental data. One of the main limitations of existing methods is that the parameters inferred from noisy, sparsely sampled, nonlinear data are seldom uniquely identifiable. That is, many different parameters can be compatible with the same dataset and can generalize to independent data equally well. Hence, it is difficult to justify conclusive assertions about the effect of biotic interactions without information about their associated uncertainty. Here, we develop an ensemble method based on model averaging to quantify the uncertainty associated with the effect of biotic interactions on community dynamics from non-equilibrium ecological time-series data. Our method is able to detect the most informative time intervals for each biotic interaction within a multivariate time series and can be easily adapted to different regression schemes. Overall, this novel approach can be used to associate a time-dependent uncertainty with the effect of biotic interactions. Moreover, because we quantify uncertainty with minimal assumptions about the data-generating process, our approach can be applied to any data for which interactions among variables strongly affect the overall dynamics of the system.


Author(s):  
Maysoon M. Aziz et al.

In this paper, we use the differential equations of the SIR model as a non-linear system, applying the Runge-Kutta numerical method to calculate simulated values for known epidemiological diseases as time series, including the epidemic disease COVID-19. We obtain hypothetical results and compare them with the daily real statistics of the disease for countries of the world, characterizing the behavior of this disease through mathematical applications, in terms of stability as well as chaos, across several applied methods. The simulated data were obtained using MATLAB programs; the real and simulated data were well compatible, with a close degree of agreement. We took the data for Italy as an application. The results show that this disease is unstable, dissipative and chaotic, with a Kcorr of 0.9621; the power spectrum of the system was also used as an indicator of the chaotic nature of the disease. These results show that it is a spreading, outbreak-prone, chaotic epidemic disease.
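A classical fourth-order Runge-Kutta integration of the SIR equations, the method named above, can be sketched as follows. This is not the authors' MATLAB code; the parameter values, normalization (populations as fractions summing to one) and names are illustrative assumptions.

```python
import numpy as np

def sir_rk4(beta, gamma, s0, i0, r0, dt, steps):
    """Integrate the SIR system
        dS/dt = -beta*S*I,  dI/dt = beta*S*I - gamma*I,  dR/dt = gamma*I
    with the classical fourth-order Runge-Kutta scheme.
    Returns an array of shape (steps + 1, 3) with columns S, I, R."""
    def f(y):
        s, i, r = y
        return np.array([-beta * s * i,
                         beta * s * i - gamma * i,
                         gamma * i])
    y = np.array([s0, i0, r0], dtype=float)
    out = [y.copy()]
    for _ in range(steps):
        k1 = f(y)
        k2 = f(y + 0.5 * dt * k1)
        k3 = f(y + 0.5 * dt * k2)
        k4 = f(y + dt * k3)
        y = y + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        out.append(y.copy())
    return np.array(out)
```

A useful sanity check on any such integration is conservation: S + I + R must remain constant along the whole trajectory, since the three right-hand sides sum to zero.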

