Robust calibration of hierarchical population models for heterogeneous cell populations

2019 ◽  
Author(s):  
Carolin Loos ◽  
Jan Hasenauer

Abstract
Cellular heterogeneity is known to have important effects on signal processing and cellular decision making. To understand these processes, multiple classes of mathematical models have been introduced. The hierarchical population model forms a novel class that allows for a mechanistic description of heterogeneity and explicitly takes subpopulation structures into account. However, this model requires a parametric distribution assumption for the cell population and, so far, only the normal distribution has been employed. Here, we incorporate alternative distribution assumptions into the model, assess their robustness against outliers, and evaluate their influence on the performance of model calibration in a simulation study and a real-world application example. We found that alternative distributions provide reliable parameter estimates even in the presence of outliers and can in fact improve the convergence of model calibration.
Highlights
- Generalizes the hierarchical population model to various distribution assumptions
- Provides a framework for efficient calibration of the hierarchical population model
- Simulation study and application to experimental data reveal improved robustness and optimization performance
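The robustness claim can be illustrated with a toy example (not the authors' hierarchical model): replacing a normal distribution assumption with a heavy-tailed Student's t makes the estimated location nearly insensitive to a few outlying cells. All values below are hypothetical.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
clean = rng.normal(loc=5.0, scale=1.0, size=200)
data = np.concatenate([clean, [30.0, 35.0, 40.0]])  # three gross outliers

# Normal assumption: the MLE of the location is the sample mean,
# which the outliers drag upward
norm_loc, norm_scale = stats.norm.fit(data)

# Heavy-tailed assumption: Student's t with df fixed at 3 downweights
# the outliers, so the location estimate stays near the true value 5
t_df, t_loc, t_scale = stats.t.fit(data, fdf=3)

print(norm_loc, t_loc)
```

The same mechanism underlies the robustness of heavy-tailed population distributions in the hierarchical setting.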

2000 ◽  
Vol 30 (4) ◽  
pp. 521-533 ◽  
Author(s):  
Jeffrey H Gove

This paper revisits the link between assumed diameter distributions arising from horizontal point samples and their unbiased stand-based representation through weighted distribution theory. Examples are presented which show that the assumption of a common shared parameter set between these two distributional forms, while theoretically valid, may not be reasonable in many operational cases. Simulation results are presented which relate the conformity (or lack thereof) of these estimates to the sampling intensity per point and the underlying shape of the population diameter distribution from which the sample point was drawn. In general, larger sample sizes per point are required to yield reliable parameter estimates than are generally taken for inventory purposes. In addition, a complementary finding suggests that the more positively skewed the underlying distribution, the more trees per point are required for good parameter estimates.
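The size bias at the heart of this problem can be sketched directly: horizontal point sampling selects a tree with probability proportional to its basal area, i.e. to dbh squared, so the sample follows a weighted (size-biased) version of the stand diameter distribution. A minimal simulation with hypothetical Weibull parameters:

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical stand: dbh ~ Weibull(shape = 2, scale = 25 cm)
shape, scale = 2.0, 25.0
stand = scale * rng.weibull(shape, size=20_000)

# Horizontal point sampling selects trees with probability
# proportional to basal area, i.e. proportional to dbh^2
w = stand ** 2
sample = stand[rng.choice(stand.size, size=500, replace=False, p=w / w.sum())]

# Treating the size-biased sample as if it were the stand
# distribution overstates the mean diameter
print(stand.mean(), sample.mean())
```

Recovering the stand distribution requires dividing the sample density by the weight function and renormalizing, which is the weighted-distribution correction the paper examines.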


1988 ◽  
Vol 64 (2) ◽  
pp. 823-831 ◽  
Author(s):  
H. L. Dorkin ◽  
K. R. Lutchen ◽  
A. C. Jackson

Recent studies on respiratory impedance (Zrs) have predicted that at frequencies greater than 64 Hz a second resonance will occur. Furthermore, if one intends to fit a model more complicated than the simple series combination of a resistance, inertance, and compliance to Zrs data, the only way to ensure statistically reliable parameter estimates is to include data surrounding this second resonance. An additional question, however, is whether the resulting parameters are physiologically meaningful. We obtained input impedance data from eight healthy adult humans using discrete frequency forced oscillations from 4 to 200 Hz. Three resonant frequencies were seen: 8 +/- 2, 151 +/- 10, and 182 +/- 16 Hz. A seven-parameter lumped element model provided an excellent fit to the data in all subjects. This model consists of an airway resistance (Raw), which is linearly dependent on frequency, and airway inertance separated from a tissue resistance, inertance, and compliance by a shunt compliance (Cg) thought to represent gas compressibility. Model estimates of Raw and Cg were compared with those suggested by measurement of Raw and thoracic gas volume using a plethysmograph. In all subjects the model Raw and Cg were significantly lower than and not correlated with the corresponding plethysmographic measurement. We hypothesize that the statistically reliable but physiologically inconsistent parameters are a consequence of the distorting influence of airway wall compliance and/or airway quarter-wave resonance. Such factors are not inherent to the seven-parameter model.
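The simple series resistance-inertance-compliance model mentioned above has a single resonance where the reactance crosses zero. A minimal sketch with illustrative values (not the subjects' estimates):

```python
import math

def ric_impedance(f, R, I, C):
    """Series resistance-inertance-compliance model of Zrs:
    Z(f) = R + j(2*pi*f*I - 1/(2*pi*f*C))."""
    w = 2 * math.pi * f
    return complex(R, w * I - 1.0 / (w * C))

def resonant_frequency(I, C):
    """Frequency at which the reactance of the series RIC model is zero."""
    return 1.0 / (2 * math.pi * math.sqrt(I * C))

# Illustrative parameter values (cmH2O.s/L, cmH2O.s^2/L, L/cmH2O)
R, I, C = 2.0, 0.01, 0.02
f0 = resonant_frequency(I, C)
print(f0, ric_impedance(f0, R, I, C))
```

A single series RIC circuit produces only this one resonance, which is why capturing the second resonance above 64 Hz requires the richer seven-parameter model with a shunt compliance.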


2019 ◽  
Author(s):  
Mark D. Scheuerell ◽  
Casey P. Ruff ◽  
Joseph H. Anderson ◽  
Eric M. Beamer

Summary
Assessing the degree to which at-risk species are regulated by density-dependent versus density-independent factors is often complicated by incomplete or biased information. If not addressed in an appropriate manner, errors in the data can affect estimates of population demographics, which may obfuscate the anticipated response of the population to a specific action. We developed a Bayesian integrated population model that accounts explicitly for interannual variability in the number of reproducing adults and their age structure, harvest, and environmental conditions. We apply the model to 41 years of data for a population of threatened steelhead trout Oncorhynchus mykiss, using freshwater flows, ocean indices, and releases of hatchery-born conspecifics as covariates. We found compelling evidence that the population is under strong density dependence, despite being well below its historical population size. In the freshwater portion of the lifecycle, we found a negative relationship between productivity (offspring per parent) and peak winter flows, and a positive relationship with summer flows. We also found a negative relationship between productivity and releases of hatchery conspecifics. In the marine portion of the lifecycle, we found a positive correlation between productivity and the North Pacific Gyre Oscillation. Furthermore, harvest rates on wild fish have been sufficiently low to ensure very little risk of overfishing. Synthesis and applications. The evidence for density-dependent population regulation, combined with the substantial loss of juvenile rearing habitat in this river basin, suggests that habitat restoration could benefit this population of at-risk steelhead. Our results also imply that hatchery programs for steelhead need to be considered carefully with respect to habitat availability and recovery goals for wild steelhead.
If releases of hatchery steelhead have indeed limited the production potential of wild steelhead, there are likely significant tradeoffs between providing harvest opportunities via hatchery steelhead production, and achieving wild steelhead recovery goals.
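The abstract does not state the stock-recruit form used; a common choice for density-dependent productivity in salmonid integrated population models is the Ricker function, sketched here with hypothetical parameters:

```python
import math

def ricker(spawners, alpha, beta, env=0.0):
    """Ricker stock-recruit curve: recruits = alpha * S * exp(-beta * S + env).
    alpha is intrinsic productivity, beta the strength of density dependence,
    env a covariate effect (e.g. flow or ocean index) on log-productivity."""
    return alpha * spawners * math.exp(-beta * spawners + env)

alpha, beta = 3.0, 0.001  # hypothetical values, not the fitted estimates

# Productivity (recruits per spawner) declines as spawner density rises
prod_low = ricker(100.0, alpha, beta) / 100.0     # few spawners
prod_high = ricker(1000.0, alpha, beta) / 1000.0  # many spawners
print(prod_low, prod_high)
```

Covariates such as peak winter flow or hatchery releases would enter through the `env` term, shifting log-productivity up or down in a given year.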


2014 ◽  
Vol 2014 ◽  
pp. 1-8 ◽  
Author(s):  
Li Tang ◽  
Xia Luo ◽  
Yang Cheng ◽  
Fei Yang ◽  
Bin Ran

The stated choice (SC) experiment has been generally regarded as an effective method for behavior analysis. Among all the SC experimental design methods, the orthogonal design has been most widely used, since it is easy to understand and construct. However, in recent years, a stream of research has put emphasis on so-called efficient experimental designs rather than keeping the orthogonality of the experiment, as the former are capable of producing more efficient data in the sense that more reliable parameter estimates can be achieved with an equal or lower sample size. This paper presents two state-of-the-art methods, called optimal orthogonal choice (OOC) and D-efficient design. More statistically efficient data is expected to be obtained by either maximizing attribute level differences or minimizing the D-error, a statistic corresponding to the asymptotic variance-covariance (AVC) matrix of the discrete choice model, when using these two methods, respectively. Since comparisons and validation of these methods in the field are rarely seen, an empirical study is presented. D-error is chosen as the measure of efficiency. The result shows that both OOC and D-efficient designs are more efficient than the orthogonal design. Finally, the strengths and weaknesses of orthogonal, OOC, and D-efficient designs are summarized.
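The D-error criterion can be sketched for a multinomial logit model: accumulate the Fisher information over all choice sets under prior parameter values, invert it to obtain the AVC matrix, and take its determinant raised to 1/K (K = number of parameters). The design and prior below are hypothetical:

```python
import numpy as np

def d_error(design, beta):
    """D-error of a design for the MNL model.
    design: list of choice sets, each an (alternatives x K) attribute matrix.
    beta: prior parameter vector of length K.
    Returns det(AVC)^(1/K), where AVC = inverse Fisher information."""
    K = len(beta)
    info = np.zeros((K, K))
    for X in design:
        u = X @ beta
        p = np.exp(u - u.max())          # choice probabilities (stable softmax)
        p /= p.sum()
        xbar = p @ X
        # Per-set information: X' (diag(p) - p p') X
        info += (X * p[:, None]).T @ X - np.outer(xbar, xbar)
    avc = np.linalg.inv(info)
    return np.linalg.det(avc) ** (1.0 / K)

# Two hypothetical choice sets, two alternatives each, two attributes
design = [np.array([[1.0, 0.0], [0.0, 1.0]]),
          np.array([[1.0, 1.0], [0.0, 0.0]])]
beta = np.array([0.5, -0.5])
print(d_error(design, beta))
```

A D-efficient design search would permute attribute levels across choice sets to minimize this value; a lower D-error means smaller asymptotic standard errors for a given sample size.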


Author(s):  
Carl Ehrett ◽  
D. Andrew Brown ◽  
Christopher Kitchens ◽  
Xinyue Xu ◽  
Roland Platz ◽  
...  

Abstract Calibration of computer models and the use of those models for design are two activities traditionally carried out separately. This paper generalizes existing Bayesian inverse analysis approaches for computer model calibration to present a methodology combining calibration and design in a unified Bayesian framework. This provides a computationally efficient means to undertake both tasks while quantifying all relevant sources of uncertainty. Specifically, compared with the traditional approach of design using parameter estimates from previously completed model calibration, this generalized framework inherently includes uncertainty from the calibration process in the design procedure. We demonstrate our approach on the design of a vibration isolation system. We also demonstrate how, when adaptive sampling of the phenomenon of interest is possible, the proposed framework may select new sampling locations using both available real observations and the computer model. This is especially useful when a misspecified model fails to reflect that the calibration parameter is functionally dependent upon the design inputs to be optimized.


2019 ◽  
Author(s):  
Leili Tapak ◽  
Omid Hamidi ◽  
Majid Sadeghifar ◽  
Hassan Doosti ◽  
Ghobad Moradi

Abstract
Objectives
Zero-inflated proportion or rate data nested in clusters due to the sampling structure can be found in many disciplines. Sometimes the rate response may not be observed for some study units because of limitations such as failures in recording data (false negatives), and zeros are observed instead of the actual value of the rate/proportion (low incidence). In this study, we propose a multilevel zero-inflated censored Beta regression model that can address zero-inflated rate data with low incidence.
Methods
We assumed that the random effects are independent and normally distributed. The performance of the proposed approach was evaluated by application to a three-level real data set and by a simulation study. We applied the proposed model to analyze brucellosis diagnosis rate data and to investigate the effects of climatic and geographical position. For comparison, we also applied the standard zero-inflated censored Beta regression model, which does not account for correlation.
Results
Results showed that the proposed model performed better than the standard zero-inflated censored Beta model based on the AIC criterion. Height (p-value < 0.0001), temperature (p-value < 0.0001) and precipitation (p-value = 0.0006) significantly affected brucellosis rates, whereas precipitation was not statistically significant in the ZICBETA model (p-value = 0.385). The simulation study also showed that the estimates obtained by the maximum likelihood approach were reasonable in terms of mean square error.
Conclusions
The results showed that the proposed method can capture the correlations in the real data set and yields accurate parameter estimates.
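The core of the model, stripped of censoring and random effects, is a zero-inflated Beta density: a point mass at zero mixed with a Beta distribution in its mean/precision parameterization. A minimal sketch with illustrative parameter values:

```python
import math

def beta_pdf(y, mu, phi):
    """Beta density in the mean/precision parameterization:
    shape parameters a = mu*phi, b = (1-mu)*phi, for y in (0, 1)."""
    a, b = mu * phi, (1.0 - mu) * phi
    log_beta = math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)
    return math.exp((a - 1) * math.log(y) + (b - 1) * math.log(1 - y) - log_beta)

def zi_beta_density(y, pi0, mu, phi):
    """Zero-inflated Beta: point mass pi0 at zero, (1-pi0)*Beta(mu, phi) on (0, 1)."""
    if y == 0.0:
        return pi0
    return (1.0 - pi0) * beta_pdf(y, mu, phi)

# Illustrative values: 30% structural zeros, mean rate 0.2, precision 10
print(zi_beta_density(0.0, 0.3, 0.2, 10.0))  # P(Y = 0)
print(zi_beta_density(0.2, 0.3, 0.2, 10.0))  # density at the mean rate
```

The multilevel version adds normally distributed random effects on the linear predictors for `mu` and `pi0`; the censoring component reallocates mass from unobservably small rates to the zero class.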


2020 ◽  
Vol 21 (1) ◽  
Author(s):  
Kwangbom Choi ◽  
Yang Chen ◽  
Daniel A. Skelly ◽  
Gary A. Churchill

Abstract Background Single-cell RNA sequencing is a powerful tool for characterizing cellular heterogeneity in gene expression. However, high variability and a large number of zero counts present challenges for analysis and interpretation. There is substantial controversy over the origins and proper treatment of zeros and no consensus on whether zero-inflated count distributions are necessary or even useful. While some studies assume the existence of zero inflation due to technical artifacts and attempt to impute the missing information, other recent studies argue that there is no zero inflation in scRNA-seq data. Results We apply a Bayesian model selection approach to unambiguously demonstrate zero inflation in multiple biologically realistic scRNA-seq datasets. We show that the primary causes of zero inflation are not technical but rather biological in nature. We also demonstrate that parameter estimates from the zero-inflated negative binomial distribution are an unreliable indicator of zero inflation. Conclusions Despite the existence of zero inflation in scRNA-seq counts, we recommend the generalized linear model with negative binomial count distribution, not zero-inflated, as a suitable reference model for scRNA-seq analysis.
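The zero-inflated negative binomial discussed here adds a point mass at zero to the NB pmf, so P(Y = 0) exceeds what the count process alone predicts. A minimal sketch with hypothetical parameters:

```python
import math

def nb_pmf(k, mu, theta):
    """Negative binomial pmf with mean mu and dispersion theta
    (variance = mu + mu^2/theta), as commonly used for RNA-seq counts."""
    log_p = (math.lgamma(k + theta) - math.lgamma(theta) - math.lgamma(k + 1)
             + theta * math.log(theta / (theta + mu))
             + k * math.log(mu / (theta + mu)))
    return math.exp(log_p)

def zinb_pmf(k, pi0, mu, theta):
    """Zero-inflated NB: extra point mass pi0 at zero on top of the NB zeros."""
    p = (1.0 - pi0) * nb_pmf(k, mu, theta)
    return p + pi0 if k == 0 else p

mu, theta = 2.0, 0.5  # hypothetical gene-level parameters
print(nb_pmf(0, mu, theta), zinb_pmf(0, 0.2, mu, theta))
```

Because a small `theta` already produces many zeros, comparing observed zero fractions against `nb_pmf(0, ...)` under fitted parameters, rather than eyeballing ZINB estimates, is the kind of model-based check the authors formalize with Bayesian model selection.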


2020 ◽  
Author(s):  
Agus Hartoyo ◽  
Peter J. Cadusch ◽  
David T. J. Liley ◽  
Damien G. Hicks

Abstract
Alpha blocking, a phenomenon where the alpha rhythm is reduced by attention to a visual, auditory, tactile or cognitive stimulus, is one of the most prominent features of human electroencephalography (EEG) signals. Here we identify a simple physiological mechanism by which opening of the eyes causes attenuation of the alpha rhythm. We fit a neural population model to EEG spectra from 82 subjects, each showing different degrees of alpha blocking upon opening of their eyes. Although it is notoriously difficult to estimate parameters from fitting such models, we show that, by regularizing the differences in parameter estimates between eyes-closed and eyes-open states, we can reduce the uncertainties in these differences without significantly compromising fit quality. From this emerges a parsimonious explanation for the spectral changes between states: just a single parameter, p_ei, corresponding to the strength of a tonic, excitatory input to the inhibitory population, is sufficient to explain the reduction in alpha rhythm upon opening of the eyes. When comparing parameter estimates across different subjects we find that the inferred differential change in p_ei for each subject increases monotonically with the degree of alpha blocking observed. In contrast, other parameters show weak or negligible differential changes that do not scale with the degree of alpha attenuation in each subject. Thus most of the variation in alpha blocking across subjects can be attributed to the strength of a tonic afferent signal to the inhibitory cortical population.
Author summary
One of the most striking features of the human electroencephalogram (EEG) is the presence of neural oscillations in the range of 8-13 Hz. It is well known that attenuation of these alpha oscillations, a process known as alpha blocking, arises from opening of the eyes, though the cause has remained obscure.
In this study we infer the mechanism underlying alpha blocking by fitting a neural population model to EEG spectra from 82 different individuals. Although such models have long held the promise of being able to relate macroscopic recordings of brain activity to microscopic neural parameters, their utility has been limited by the difficulty of inferring these parameters from fits to data. Our approach is to fit both eyes-open and eyes-closed EEG spectra together, minimizing the number of parameter changes required to transition from one spectrum to the other. Surprisingly, we find that there is just one parameter, the external input to the inhibitory neurons in cortex, that is responsible for attenuating the alpha oscillations. We demonstrate how the strength of this inhibitory input scales monotonically with the degree of alpha blocking observed over all 82 subjects.
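The regularization idea can be sketched on a toy problem (ordinary least squares, not the neural population model): fit both conditions jointly and add an L1 penalty on parameter differences, so only parameters with strong evidence of change end up differing between states. All data below are synthetic:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 50)
# Two synthetic "spectra" that truly differ only in the second parameter
y_closed = 2.0 + 1.0 * x + 0.05 * rng.standard_normal(50)
y_open = 2.0 + 0.3 * x + 0.05 * rng.standard_normal(50)

def loss(theta, lam):
    """Joint fit of both conditions with an L1 penalty on parameter
    differences, favoring sparse between-state changes."""
    a1, b1, a2, b2 = theta
    sse = (np.sum((y_closed - (a1 + b1 * x)) ** 2)
           + np.sum((y_open - (a2 + b2 * x)) ** 2))
    return sse + lam * (abs(a1 - a2) + abs(b1 - b2))

fit = minimize(loss, x0=[1.0, 1.0, 1.0, 1.0], args=(1.0,),
               method="Nelder-Mead",
               options={"maxiter": 5000, "xatol": 1e-8, "fatol": 1e-8})
a1, b1, a2, b2 = fit.x
# The intercept difference is shrunk toward zero; the slope difference,
# strongly supported by the data, survives the penalty
print(abs(a1 - a2), abs(b1 - b2))
```

In the paper's setting the sparse surviving difference is the single parameter p_ei, which is what makes the explanation of alpha blocking parsimonious.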


2021 ◽  
Vol 37 (5) ◽  
pp. 851-859
Author(s):  
Sy Nguyen-Ky ◽  
Katariina Penttilä

Highlights
- An indoor climate and energy model of a dairy barn is constructed and calibrated with collected data.
- Long-term monitoring of indoor conditions and electricity consumption greatly facilitates the model calibration process.
- Statistical benchmarks given by guidelines confirm the usability and reliability of the model.
Abstract. This study demonstrates an application of indoor climate and energy (ICE) model calibration using sensor-based building metrics in a naturally ventilated dairy house in a cold climate. The barn, at the time of the study, had 70 lactating cows and 30 calves, with a total animal area of 1922 m2 and other auxiliary areas of 268 m2. Indoor condition data were collected by four integrated sensors inside the barn for six months, from March to August 2019. IDA ICE 4.8 SP1 simulation software was used to build and simulate the model, with calibration steps conducted first manually, then statistically. Actual weather and indoor condition data during the monitored period were used for calibration; the statistical indices of the calibrated model were confirmed against the benchmarks given in ASHRAE Guideline 14-2014, IPMVP version 2016, and FEMP version 4.0 2015. The result was a baseline ICE model, which can be further utilized in the study of energy conservation measures (ECMs), retrofitting feasibility, and mitigation of ammonia and other contaminant gas emissions. The above calibration practice and the proposals built on it open a pathway to a higher level of energy efficiency for this type of livestock building. Keywords: Cold weather, Dairy farms, Model calibration, Natural ventilation.
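The statistical indices typically used for such calibration checks, NMBE and CV(RMSE), are straightforward to compute. A sketch with hypothetical measured/simulated series (here taking p = 1, the common convention in ASHRAE Guideline 14):

```python
import math

def nmbe(measured, simulated, p=1):
    """Normalized mean bias error (%): signed average model bias
    relative to the mean of the measured data."""
    n = len(measured)
    mbar = sum(measured) / n
    return 100.0 * sum(m - s for m, s in zip(measured, simulated)) / ((n - p) * mbar)

def cv_rmse(measured, simulated, p=1):
    """Coefficient of variation of the RMSE (%): scatter of the
    residuals relative to the mean of the measured data."""
    n = len(measured)
    mbar = sum(measured) / n
    sse = sum((m - s) ** 2 for m, s in zip(measured, simulated))
    return 100.0 * math.sqrt(sse / (n - p)) / mbar

# Hypothetical monthly electricity readings (not the study's data)
measured = [10.0, 12.0, 11.0, 13.0, 12.5]
simulated = [10.5, 11.5, 11.2, 12.6, 12.4]
print(nmbe(measured, simulated), cv_rmse(measured, simulated))
```

A model is typically declared calibrated when both indices fall within the guideline thresholds for the relevant data interval (e.g. monthly or hourly).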


2020 ◽  
Author(s):  
F Mansab ◽  
S Bhatti ◽  
D Goyal

ABSTRACT
Introduction
The response to COVID-19 differs from nation to nation. There are likely a number of factors one can attribute to such disparity, not least of which is differing healthcare models and approaches. Here, we examine the COVID-19 community triage pathways employed by four nations, specifically comparing the safety and efficacy of national online ‘symptom checkers’ utilised within the triage pathway.
Methods
A simulation study was conducted on current, nationwide, patient-led symptom checkers from four countries (Singapore, Japan, USA and UK). 52 cases were simulated to approximate typical COVID-19 presentations (mild, moderate, severe and critical) and COVID-19 mimickers (e.g. sepsis and bacterial pneumonia). The same simulations were applied to each of the four countries’ symptom checkers, and the recommendations to refer on for medical care or to stay home were recorded and compared.
Results
The symptom checkers from Singapore and Japan advised onward healthcare contact for the majority of simulations (88% and 77% respectively). The USA and UK symptom checkers triaged 38% and 44% of cases to healthcare contact, respectively. Both the US and UK symptom checkers consistently failed to identify severe COVID-19, bacterial pneumonia and sepsis, triaging such cases to stay home.
Conclusion
Our results suggest that whilst ‘symptom checkers’ may be of use to the healthcare COVID-19 response, there is the potential for such patient-led assessment tools to worsen outcomes by delaying appropriate clinical assessment. The key features of the well-performing symptom checkers are discussed.
SUMMARY
What is already known?
- The availability and use of symptom checkers are increasing.
- Symptom checkers are currently in use at a national level to help in the healthcare response to COVID-19.
- There is limited evidence to support the effectiveness or safety of symptom checkers as triage tools during a pandemic response.
What does this paper add?
- This study compares performance of symptom checkers across different countries, revealing marked variation between national symptom checkers.
- The symptom checkers employed by Japan and Singapore are twice as likely to triage cases onward for clinical assessment than those of the US or UK.
- The US and UK symptom checkers frequently triaged simulated cases of sepsis, bacterial pneumonia and severe COVID-19 to stay home with no further healthcare contact.
- We discuss the key aspects of the well-performing triage systems.

