State Change Probability: A Measure of the Complexity of Cardiac RR Interval Time Series Using Physiological State Change with Statistical Hypothesis Testing

2019
Author(s): Hsuan-Hao Chao, Han-Ping Huang, Sung-Yang Wei, Chang Francis Hsu, Long Hsu, ...

Abstract: The complexity of biological signals has been proposed to reflect the adaptability of a given biological system to different environments. Two measures of complexity, multiscale entropy (MSE) and entropy of entropy (EoE), have been proposed to evaluate the complexity of heart rate signals from different perspectives. MSE evaluates the information content of a long time series across multiple temporal scales, while EoE characterizes the variation in the amount of information, interpreted as "state changing," across segments of a time series. However, both are problematic when analyzing white noise and are sensitive to data size. Therefore, based on the concept of state changing, we propose state change probability (SCP) as a measure of complexity. SCP uses a statistical hypothesis test to detect physiological state changes between two consecutive segments of a heart rate signal. The SCP value is defined as the ratio of the number of state changes to the total number of consecutive segment pairs. Two common statistical tests, the t-test and the Wilcoxon rank-sum test, were used separately in the SCP algorithm for comparison, yielding similar results. The SCP method reasonably evaluates the complexity of white noise and other signals, including 1/f noise, periodic signals, and heart rate signals from healthy subjects as well as subjects with congestive heart failure or atrial fibrillation. The SCP method is also insensitive to data size: a universal SCP threshold can be applied to differentiate between healthy and pathological subjects for data sizes ranging from 100 to 10,000 points. The SCP algorithm is slightly better than the EoE method at differentiating between subjects, and is superior to the MSE method.
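
A minimal sketch of the core idea in Python: compare consecutive RR segments with either hypothesis test and report the fraction of rejections. The segment length w and significance level alpha are illustrative choices, not values taken from this abstract.

```python
# Minimal SCP sketch, assuming non-overlapping segments of length w and a
# significance level alpha; both are illustrative, not the paper's values.
import numpy as np
from scipy import stats

def scp(rr, w=10, alpha=0.05, test="ttest"):
    """Fraction of consecutive segment pairs judged to change state."""
    segments = [rr[i:i + w] for i in range(0, len(rr) - w + 1, w)]
    changes = 0
    for a, b in zip(segments, segments[1:]):
        if test == "ttest":
            _, p = stats.ttest_ind(a, b, equal_var=False)
        else:
            _, p = stats.ranksums(a, b)
        if p < alpha:  # reject H0 that both segments share one state
            changes += 1
    return changes / (len(segments) - 1)

# White noise: roughly a fraction alpha of pairs is flagged by chance
rng = np.random.default_rng(0)
print(scp(rng.normal(size=1000)))
```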

Entropy, 2019, Vol. 21(10), pp. 1024
Author(s): Estelle Blons, Laurent Arsac, Pierre Gilfriche, Veronique Deschodt-Arsac

In humans, the physiological systems that maintain stable conditions for health and well-being are complex, encompassing multiple interactions within and between system components. This complexity is mirrored in the temporal structure of the variability of output signals. Entropy has been recognized as a good marker of system complexity, notably when calculated from heart rate and postural dynamics. Degraded entropy is generally associated with frailty, aging, impairment, or disease. In contrast, high entropy has been associated with an elevated capacity to adjust to an ever-changing environment, but the link between entropy and the capacity to cope with cognitive tasks in a healthy young to middle-aged population is unknown. Here, we computed classic markers (time and frequency domains) and refined composite multiscale entropy (MSE) markers (after pre-processing) of heart rate and postural sway time series in 34 participants during quiet versus cognitive task conditions. Recordings lasted 10 min for heart rate and 51.2 s for upright standing, providing time series lengths of 500-600 and 2048 samples, respectively. The main finding was that entropy increased during cognitive tasks. This highlights a possible link between our entropy measures and the system complexity that likely facilitates control remodeling and flexible adaptability in our healthy participants. We conclude that entropy is a reliable marker of neurophysiological complexity and adaptability in autonomic and somatic systems.
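
For reference, a plain multiscale entropy computation (coarse-graining plus a simplified sample entropy) might look like the sketch below; the study itself used the refined composite MSE variant, which this sketch does not reproduce, and m and the tolerance factor are common literature choices rather than values stated here.

```python
# Plain MSE sketch: coarse-grain the series at each scale, then compute a
# simplified sample entropy of the coarse-grained series.
import numpy as np

def sample_entropy(x, m=2, r=None):
    x = np.asarray(x, float)
    if r is None:
        r = 0.15 * np.std(x)
    def matches(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(t[:, None] - t[None, :]), axis=2)  # Chebyshev
        return np.sum(d <= r) - len(t)  # exclude self-matches
    b, a = matches(m), matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def mse(x, scales=range(1, 6)):
    x = np.asarray(x, float)
    r = 0.15 * np.std(x)  # tolerance fixed from the original series
    out = []
    for s in scales:
        n = len(x) // s
        coarse = x[:n * s].reshape(n, s).mean(axis=1)  # coarse-graining
        out.append(sample_entropy(coarse, r=r))
    return out
```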


2014, Vol. 11(91), pp. 20130585
Author(s): Bernard Cazelles, Kévin Cazelles, Mario Chavez

Wavelet analysis is now frequently used to extract information from ecological and epidemiological time series. Statistical hypothesis tests are conducted on the associated wavelet quantities to assess the likelihood that they are due to a random process. Such random processes represent null models and are generally based on synthetic data that share some statistical characteristics with the original time series, allowing null statistics to be compared with those obtained from the original series. When creating synthetic datasets, different resampling techniques result in different characteristics being shared by the synthetic time series, so it becomes crucial to consider the impact of the resampling method on the results. We addressed this point by comparing seven statistical testing methods applied to different real and simulated datasets. Our results show that the statistical assessment of periodic patterns is strongly affected by the choice of resampling method: two different resampling techniques can lead to two different conclusions about the same time series. Moreover, our results clearly show the inadequacy of resampled series generated from white noise and red noise, which are nevertheless the methods currently used in the vast majority of wavelet applications. Our results highlight that the characteristics of a time series, namely its Fourier spectrum and autocorrelation, are important to consider when choosing the resampling technique, and suggest that data-driven resampling methods should be used, such as the hidden Markov model algorithm and the 'beta-surrogate' method.
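
The mechanics of one noise-based null model, of the kind the authors find inadequate, can be sketched as follows: fit a red-noise (AR(1)) process to the series, generate surrogates, and threshold the observed spectral power against the surrogate distribution. A Fourier periodogram stands in here for the wavelet quantities discussed; the signal and surrogate count are illustrative.

```python
# Red-noise (AR(1)) surrogate null model: surrogates match the series'
# lag-1 autocorrelation and variance; observed power is compared against
# the pointwise 95th percentile of the surrogate power distribution.
import numpy as np

def ar1_surrogates(x, n_surr=400, rng=None):
    rng = rng or np.random.default_rng()
    x = np.asarray(x, float) - np.mean(x)
    phi = np.corrcoef(x[:-1], x[1:])[0, 1]        # lag-1 autocorrelation
    sigma = np.std(x) * np.sqrt(1 - phi ** 2)     # innovation scale
    surr = np.empty((n_surr, len(x)))
    surr[:, 0] = rng.normal(0, np.std(x), n_surr)
    for t in range(1, len(x)):
        surr[:, t] = phi * surr[:, t - 1] + rng.normal(0, sigma, n_surr)
    return surr

def power(x):
    return np.abs(np.fft.rfft(x - np.mean(x))) ** 2

rng = np.random.default_rng(1)
x = np.sin(2 * np.pi * np.arange(512) / 32) + rng.normal(size=512)
null = np.array([power(s) for s in ar1_surrogates(x, rng=rng)])
threshold = np.percentile(null, 95, axis=0)       # pointwise 95% level
significant = power(x) > threshold
```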


2014, Vol. 4(1)
Author(s): R. Lehmann

Abstract: Geodetic and geophysical time series may contain sinusoidal oscillations of unknown angular frequency, and it is often necessary to decide whether such oscillations are truly present in a given time series. Here we pose the decision problem as a statistical hypothesis test, an approach very popular in geodesy and other scientific disciplines. For unknown angular frequencies, such a test has not yet been proposed. We restrict ourselves to the detection of a single sinusoidal oscillation in a one-dimensional time series sampled at non-uniform time intervals. We compare two solution methods: the likelihood ratio (LR) test for parameters in a Gauss-Markov model and the analysis of the Lomb-Scargle periodogram. Whenever needed, critical values of these tests are computed using the Monte Carlo method. We analyze an exemplary time series from an absolute gravimetric observation with various tests and compare their statistical power. The results for the exemplary time series are found to be comparable. The LR test is more flexible, but always requires the Monte Carlo method to compute critical values. The periodogram analysis is computationally faster, because critical values can be approximately deduced from the exponential distribution, at least if the sampling is nearly uniform.
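
A minimal sketch of the periodogram route, assuming white Gaussian noise of known unit variance under the null hypothesis; the frequency grid and sampling scheme are illustrative, and the critical value of the maximum periodogram ordinate is obtained by Monte Carlo, as in the paper.

```python
# Detect a sinusoid of unknown frequency in unevenly sampled data with the
# Lomb-Scargle periodogram and a Monte Carlo critical value.
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 100, 200))             # non-uniform sampling
y = 0.5 * np.sin(2 * np.pi * 0.13 * t) + rng.normal(0, 1, len(t))
freqs = 2 * np.pi * np.linspace(0.01, 1.0, 2000)  # angular frequencies

stat = lombscargle(t, y - y.mean(), freqs).max()  # test statistic

# Monte Carlo under H0 (pure noise): distribution of the maximum ordinate
null = [lombscargle(t, rng.normal(0, 1, len(t)), freqs).max()
        for _ in range(500)]
crit = np.percentile(null, 95)                    # 5% critical value
print("oscillation detected:", stat > crit)
```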


Entropy, 2021, Vol. 23(6), pp. 663
Author(s): Pierre Bouny, Laurent M. Arsac, Emma Touré Cuq, Veronique Deschodt-Arsac

Recent research has clarified the existence of a networked system involving cortical and subcortical circuitry that regulates both cognition and cardiac autonomic control, and that is dynamically organized as a function of cognitive demand. The main interactions span multiple temporal and spatial scales and are extensively governed by nonlinear processes. Hence, entropy and (multi)fractality in heart period time series are suitable for capturing the emergent behavior of cognitive-autonomic network coordination. This study investigated how entropy and multifractal-multiscale analyses could depict the specific cognitive-autonomic architectures reflected in heart rate dynamics when students performed selective inhibition tasks. The participants completed cognitive interference (Stroop color and word), action cancellation (stop-signal), and action restraint (go/no-go) tasks, compared with watching a neutral movie as baseline. Entropy and fractal markers (respectively, the refined composite multiscale entropy and multifractal-multiscale detrended fluctuation analysis) outperformed other time-domain and frequency-domain markers of heart rate variability in distinguishing the cognitive tasks. Crucially, entropy increased selectively during cognitive interference, and multifractality increased during action cancellation. An interpretative hypothesis is that cognitive interference elicits a greater richness of the interactive processes that form the central autonomic network, while action cancellation, which is achieved by biasing a sensorimotor network, could lead to a scale-specific heightening of multifractal behavior.
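
The q-order fluctuation function at the core of multifractal DFA can be sketched as below; the scale range, detrending order, and q values are illustrative, and the multiscale variant used in the study adds steps not reproduced here.

```python
# Multifractal DFA fluctuation function F(q, s): integrate the series,
# detrend each window of size s, and aggregate the per-window RMS with a
# q-dependent average; the slope of log F versus log s estimates h(q).
import numpy as np

def mfdfa_fluctuations(x, scales, qs, order=1):
    y = np.cumsum(np.asarray(x, float) - np.mean(x))   # signal profile
    F = np.zeros((len(qs), len(scales)))
    for j, s in enumerate(scales):
        n_seg = len(y) // s
        rms = np.empty(n_seg)
        t = np.arange(s)
        for v in range(n_seg):
            seg = y[v * s:(v + 1) * s]
            trend = np.polyval(np.polyfit(t, seg, order), t)
            rms[v] = np.sqrt(np.mean((seg - trend) ** 2))
        for i, q in enumerate(qs):
            if q == 0:  # limiting (logarithmic) form of the q-average
                F[i, j] = np.exp(0.5 * np.mean(np.log(rms ** 2)))
            else:
                F[i, j] = np.mean(rms ** q) ** (1.0 / q)
    return F
```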


Author(s): Hozan Khalid Hamarashid

The mean performance of machine learning models is typically estimated using k-fold cross-validation, and the algorithm with the best average performance should surpass those with the poorest. But what if the difference in average outcomes is the consequence of a statistical fluke? A statistical hypothesis test is used to determine whether the difference in mean results between two algorithms is genuine. This study demonstrates how to compare machine learning algorithms using statistical hypothesis testing. During model selection, the outputs of several machine learning algorithms or modeling pipelines are compared, and the model that performs best on the chosen performance measure becomes the final model, which can be used to make predictions on new data. This applies to classification and regression prediction models built with traditional machine learning and deep learning methods alike. The difficulty is in identifying whether the difference between two models is genuine.
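
A minimal sketch of such a comparison, using placeholder data and models: both algorithms are scored on the same folds and the paired score differences are tested. A variance-corrected test (e.g., the Nadeau-Bengio correction) is often preferred in practice because cross-validation folds overlap and are not fully independent.

```python
# Paired comparison of two models' k-fold cross-validation scores.
from scipy import stats
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold, cross_val_score

X, y = make_classification(n_samples=500, random_state=0)
cv = KFold(n_splits=10, shuffle=True, random_state=0)  # same folds for both

scores_a = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=cv)
scores_b = cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=cv)

t, p = stats.ttest_rel(scores_a, scores_b)             # paired t-test
print(f"A={scores_a.mean():.3f}  B={scores_b.mean():.3f}  p={p:.3f}")
```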


2022
Author(s): Haoyu Wen, Hong-Jia Chen, Chien-Chih Chen, Massimo Pica Ciamarra, Siew Ann Cheong

Abstract: Geoelectric time series (TS) have long been studied for their potential for probabilistic earthquake forecasting, and a recent model (GEMSTIP) directly used the skewness and kurtosis of geoelectric TS to provide Times of Increased Probability (TIPs) for earthquakes several months into the future. We followed up on this work by applying a Hidden Markov Model (HMM) to the correlation, variance, skewness, and kurtosis TSs to identify two Hidden States (HSs) with different distributions of these statistical indexes. More importantly, we tested whether these HSs could separate time periods into times of higher and lower earthquake probability. Using 0.5-Hz geoelectric TS data from 20 stations across Taiwan over 7 years, we first computed the statistical index TSs and then applied the Baum-Welch algorithm with multiple random initializations to obtain a well-converged HMM and its HS TS for each station. We then divided the map of Taiwan into a 16-by-16 grid and quantified the forecasting skill, i.e., how well the HS TS could separate times of higher and lower earthquake probability in each cell, in terms of a discrimination power measure that we defined. Next, we compared the discrimination power of the empirical HS TSs against those of 400 simulated HS TSs, and organized the statistical significance values from this cell-level hypothesis testing of the forecasting skill into grid maps of discrimination reliability. Having found these significance values to be high for many grid cells for all stations, we proceeded with a statistical hypothesis test of the forecasting skill at the global level and found high statistical significance across large parts of the hyperparameter spaces of most stations. We therefore conclude that geoelectric TSs indeed contain earthquake-related information and that the HMM approach is capable of extracting this information for earthquake forecasting.
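
The two-hidden-state fitting step might look like the following sketch, assuming the hmmlearn package; `indexes` stands in for the per-window statistical index TSs (shape: n_windows x n_features) described above, and the number of initializations is illustrative.

```python
# Two-hidden-state Baum-Welch fit with multiple random initializations,
# keeping the model with the highest log-likelihood.
import numpy as np
from hmmlearn.hmm import GaussianHMM

def fit_best_hmm(indexes, n_inits=10):
    best, best_ll = None, -np.inf
    for seed in range(n_inits):
        model = GaussianHMM(n_components=2, covariance_type="diag",
                            n_iter=200, random_state=seed)
        model.fit(indexes)         # Baum-Welch (EM) training
        ll = model.score(indexes)  # log-likelihood of this fit
        if ll > best_ll:
            best, best_ll = model, ll
    return best

# Hidden-state time series used downstream for the discrimination step:
# states = fit_best_hmm(indexes).predict(indexes)
```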


Entropy, 2018, Vol. 20(12), pp. 952
Author(s): Dae-Young Lee, Young-Seok Choi

The electrocardiogram (ECG) signal is commonly used to analyze the complexity of heart rate variability (HRV), and various entropy methods have attracted considerable interest for this purpose. The multiscale entropy (MSE) method, which applies the sample entropy (SampEn) calculation to coarse-grained time series, has drawn attention for HRV analysis. However, the SampEn computation may be undefined when a time series is not long enough. Recently, distribution entropy (DistEn), with improved stability for short-term time series, has been proposed. Here, we propose a novel multiscale DistEn (MDE) for analyzing the complexity of short-term HRV, which combines a moving-average multiscale process with the DistEn computation of each moving-averaged time series. It thus provides improved stability of entropy evaluation for short-term HRV extracted from ECG. To verify the performance of MDE, we analyze synthetic signals and confirm the superiority of MDE over MSE. We then evaluate the complexity of short-term HRV extracted from the ECG signals of congestive heart failure (CHF) patients and healthy subjects. The experimental results show that MDE can quantify the decreased complexity of HRV associated with aging and CHF from short-term HRV time series.
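
A sketch of the MDE pipeline as described, with the embedding dimension m and bin count M set to common choices from the DistEn literature rather than values stated in this abstract.

```python
# MDE sketch: moving-average multiscale step, then distribution entropy
# (DistEn) of each smoothed series. DistEn is the normalized Shannon
# entropy of the histogram of pairwise Chebyshev distances between
# embedding vectors.
import numpy as np

def dist_en(x, m=2, M=512):
    x = np.asarray(x, float)
    v = np.array([x[i:i + m] for i in range(len(x) - m + 1)])
    d = np.max(np.abs(v[:, None] - v[None, :]), axis=2)  # Chebyshev
    dists = d[np.triu_indices(len(v), k=1)]              # unique pairs
    p, _ = np.histogram(dists, bins=M)
    p = p[p > 0] / p.sum()
    return -np.sum(p * np.log2(p)) / np.log2(M)          # in [0, 1]

def mde(x, scales=range(1, 6)):
    x = np.asarray(x, float)
    # moving average keeps the series long, unlike coarse-graining
    return [dist_en(np.convolve(x, np.ones(s) / s, mode="valid"))
            for s in scales]
```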


2011, Vol. 23(1), pp. 97-123
Author(s): Arta A. Jamshidi, Michael J. Kirby

We present an approach for constructing nonlinear empirical mappings from high-dimensional domains to multivariate ranges. We employ radial basis functions and skew radial basis functions to construct a model from data that are potentially scattered or sparse. The algorithm progresses iteratively, adding a new function at each step to refine the model. The placement of the functions is driven by a statistical hypothesis test that accounts for correlation in the multivariate range variables. The test is applied to training and validation data and reveals nonstatistical or geometric structure when it fails. At each step, the added function is fit to data contained in a spatiotemporally defined local region to determine the parameters, in particular the scale of the local model. The scale of the function is determined by the zero crossings of the autocorrelation function of the residuals. The model parameters and the number of basis functions are determined automatically from the given data, and there is no need to initialize any ad hoc parameters save for the selection of the skew radial basis functions. Compactly supported skew radial basis functions are employed to improve model accuracy, order, and convergence properties. The extension of the algorithm to higher-dimensional ranges produces reduced-order models by exploiting the correlation in the range variable data; structure is tested not just in a single time series but between all pairs of time series. We demonstrate the new methodologies on several problems, including modeling data on manifolds and the prediction of chaotic time series.
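
A heavily simplified one-dimensional toy version of the iterative idea: add one Gaussian RBF per step, centered where the residual is largest, and stop when a simple lag-1 autocorrelation test no longer rejects randomness in the residuals. The paper's multivariate correlation test, local scale estimation, and skew RBFs are not reproduced here.

```python
# Greedy RBF construction driven by a residual-structure test.
import numpy as np

def fit_rbf(x, y, max_funcs=20, scale=0.5, z_crit=1.96):
    x, resid = np.asarray(x, float), np.asarray(y, float).copy()
    centers, weights = [], []
    for _ in range(max_funcs):
        r1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]  # residual structure
        if abs(r1) * np.sqrt(len(resid)) < z_crit:
            break                          # residuals look like noise
        c = x[np.argmax(np.abs(resid))]    # place RBF at worst residual
        phi = np.exp(-((x - c) / scale) ** 2)
        w = phi @ resid / (phi @ phi)      # least-squares weight
        centers.append(c)
        weights.append(w)
        resid -= w * phi
    return np.array(centers), np.array(weights)
```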


2019, Vol. 19(2), pp. 134-140
Author(s): Baek-Ju Sung, Sung-kyu Lee, Mu-Seong Chang, Do-Sik Kim
