Comparison of the General Threshold Model of Survival and Dose–Response Models in Simulating the Acute Toxicity of Metals to Danio rerio

2019, Vol. 38 (10), pp. 2169–2177
Author(s): An He, Xinyong Liu, Liang Qu, Yongfei Gao, Jianfeng Feng, ...
2017, Vol. 57 (1), pp. 17–29
Author(s): Helmut Schöllnberger, Markus Eidemüller, Harry M. Cullings, Cristoforo Simonetto, Frauke Neff, ...

Abstract The scientific community faces important discussions on the validity of the linear no-threshold (LNT) model for radiation-associated cardiovascular diseases at low and moderate doses. In the present study, mortalities from cerebrovascular diseases (CeVD) and heart diseases in the latest data on atomic bomb survivors were analyzed. The analysis was performed with several radiobiologically motivated linear and nonlinear dose–response models. For each detrimental health outcome, a set of models was identified that all fitted the data about equally well. This set was used for multi-model inference (MMI), a statistical method of superposing different models that allows risk estimates to be based on several plausible dose–response models rather than on a single model of choice. MMI provides a more accurate determination of the dose response and a more comprehensive characterization of the uncertainties. It was found that for CeVD, the dose–response curve from MMI lies below the LNT model at low and medium doses (0–1.4 Gy); at higher doses, MMI predicts a higher risk than the LNT model. A sublinear dose response was also found for heart diseases (0–3 Gy). The analyses provide no conclusive answer to the question of whether there is a radiation risk below 0.75 Gy for CeVD and below 2.6 Gy for heart diseases. MMI suggests that the dose–response curves for CeVD and heart diseases in the Life Span Study are sublinear at low and moderate doses. This is relevant for radiotherapy treatment planning and for international radiation protection practices in general.
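The multi-model inference step described above can be sketched numerically: fit a small set of candidate dose–response models, convert their AIC values into Akaike weights, and average the fitted curves. Below is a minimal sketch in Python using invented illustrative data (not Life Span Study values) and a simple least-squares AIC in place of the Poisson-regression machinery actually used for cohort data:

```python
import numpy as np

# Hypothetical excess-risk data vs. dose in Gy (illustrative values only,
# not taken from the Life Span Study cohort data).
dose = np.array([0.0, 0.1, 0.3, 0.5, 1.0, 1.5, 2.0, 3.0])
risk = np.array([0.00, 0.01, 0.02, 0.05, 0.12, 0.22, 0.35, 0.70])

# Candidate dose-response shapes, each as a design matrix (no intercept:
# zero dose is assumed to carry zero excess risk).
designs = {
    "linear":    np.column_stack([dose]),        # LNT-type model
    "quadratic": np.column_stack([dose ** 2]),   # sublinear at low dose
    "lin-quad":  np.column_stack([dose, dose ** 2]),
}

n = len(dose)
aic, preds = {}, {}
for name, X in designs.items():
    beta, *_ = np.linalg.lstsq(X, risk, rcond=None)   # least-squares fit
    fit = X @ beta
    rss = float(np.sum((risk - fit) ** 2))
    aic[name] = n * np.log(rss / n) + 2 * X.shape[1]  # AIC for LS fits
    preds[name] = fit

# Akaike weights: relative support for each model given the data.
best = min(aic.values())
raw = {m: np.exp(-0.5 * (aic[m] - best)) for m in aic}
total = sum(raw.values())
weights = {m: raw[m] / total for m in raw}

# Multi-model inference: the weight-averaged dose-response curve.
mmi = sum(weights[m] * preds[m] for m in preds)
```

Models that fit about equally well receive comparable weights, so the averaged curve reflects all plausible shapes rather than committing to one, which is exactly the rationale the abstract gives for preferring MMI over a single model of choice.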


2018
Author(s):  
Author(s): Virgile Baudrot, Sandrine Charles

Abstract Providing reliable environmental quality standards (EQSs) is a challenging issue in environmental risk assessment (ERA). These EQSs are derived from toxicity endpoints estimated from dose-response models to identify and characterize the environmental hazard of chemical compounds such as those released by human activities. These toxicity endpoints include the classical x% effect/lethal concentrations at a specific time t (EC/LC(x,t)) and the new multiplication factors applied to environmental exposure profiles leading to an x% effect reduction at a specific time t (MF(x,t), denoted LP(x,t) by the EFSA). However, the classical dose-response models used to estimate toxicity endpoints have some weaknesses, such as their dependency on observation time points, which are likely to differ between species (e.g., experiment duration). Furthermore, real-world exposure profiles are rarely constant over time, which makes the use of classical dose-response models difficult and compromises the derivation of MF(x,t). When dealing with survival or immobility toxicity test data, these issues can be overcome with the general unified threshold model of survival (GUTS), a toxicokinetics-toxicodynamics (TKTD) model that provides an explicit framework to analyse both time- and concentration-dependent data sets and to obtain a mechanistic derivation of EC/LC(x,t) and MF(x,t) for any x and at any time t of interest. In addition, the assessment of a risk is inherently built upon probability distributions, so the next critical step for ERA is to characterize the uncertainties of toxicity endpoints and, consequently, those of EQSs. With this perspective, we investigated the use of a Bayesian framework to obtain the uncertainties from the calibration process and to propagate them to model predictions, including the derivation of LC(x,t) and MF(x,t).
We also explored the mathematical properties of LC(x,t) and MF(x,t) as well as the impact of different experimental designs, leading to some recommendations for a robust derivation of toxicity endpoints and thus reliable EQSs: avoid computing LC(x,t) and MF(x,t) for extreme x values (0 or 100%), where uncertainty is maximal; compute MF(x,t) after a period long enough to take depuration time into account; and test survival under a few pulses of the contaminant, both correlated and uncorrelated with respect to the depuration time.
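The core GUTS idea of separating damage kinetics from a death mechanism can be illustrated with the reduced stochastic-death variant (GUTS-RED-SD): scaled damage follows dD/dt = kd(C(t) − D), and the hazard rate exceeds the background rate hb by b·max(D − z, 0) once damage passes the threshold z. A minimal Euler-stepped sketch with illustrative parameter values, not calibrated to any real data set:

```python
import numpy as np

def guts_red_sd(conc, t, kd=0.5, z=1.0, b=0.3, hb=0.01):
    """Survival probability over time for a piecewise-constant exposure
    profile conc(t), under the GUTS-RED-SD model (illustrative parameters:
    kd = dominant rate constant, z = damage threshold, b = killing rate,
    hb = background hazard)."""
    D = 0.0          # scaled damage
    H = 0.0          # cumulative hazard
    surv = [1.0]
    for i in range(1, len(t)):
        dt = t[i] - t[i - 1]
        # damage kinetics: dD/dt = kd * (C - D), explicit Euler step
        D += kd * (conc[i - 1] - D) * dt
        # hazard above background once damage exceeds the threshold z
        h = b * max(D - z, 0.0) + hb
        H += h * dt
        surv.append(np.exp(-H))
    return np.array(surv)

# A single exposure pulse: the time-varying profiles that break classical
# dose-response models are handled naturally by the TKTD formulation.
t = np.linspace(0.0, 10.0, 1001)
pulse = np.where((t > 1.0) & (t < 3.0), 5.0, 0.0)
S = guts_red_sd(pulse, t)
```

Because damage decays after the pulse ends, mortality keeps accruing for a while post-exposure, which is why the abstract recommends computing MF(x,t) over a period long enough to cover depuration.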


Author(s):  
Nicola Orsini

Recognizing a dose–response pattern based on heterogeneous tables of contrasts is hard. Specification of a statistical model that can consider the possible dose–response data-generating mechanism, including its variation across studies, is crucial for statistical inference. The aim of this article is to increase the understanding of mixed-effects dose–response models suitable for tables of correlated estimates. One can use the command drmeta with additive (mean difference) and multiplicative (odds ratios, hazard ratios) measures of association. The postestimation command drmeta_graph greatly facilitates the visualization of predicted average and study-specific dose–response relationships. I illustrate applications of the drmeta command with regression splines in experimental and observational data based on nonlinear and random-effects data-generation mechanisms that can be encountered in health-related sciences.
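drmeta itself is a Stata command, but the two-stage logic behind dose–response meta-analysis can be sketched in miniature: stage 1 fits a study-specific trend through the referent dose, and stage 2 pools the study slopes. The sketch below uses invented data, a simple linear trend, and fixed-effect pooling; it deliberately ignores the within-study correlation among contrasts sharing a referent group, which drmeta is designed to handle properly:

```python
import numpy as np

# Invented tables of contrasts: dose, log odds ratio, and standard error
# per contrast (referent category has logOR = 0 and no standard error).
studies = [
    {"dose": [0, 1, 2, 4], "logor": [0.0, 0.10, 0.22, 0.41],
     "se": [0.0, 0.05, 0.06, 0.08]},
    {"dose": [0, 2, 5], "logor": [0.0, 0.18, 0.52],
     "se": [0.0, 0.07, 0.10]},
]

# Stage 1: weighted least-squares slope per study, through the origin
# (the referent dose is constrained to logOR = 0, so it carries no weight).
slopes, variances = [], []
for s in studies:
    d = np.array(s["dose"], dtype=float)
    y = np.array(s["logor"], dtype=float)
    w = np.array([0.0 if e == 0 else 1.0 / e ** 2 for e in s["se"]])
    den = np.sum(w * d * d)
    slopes.append(np.sum(w * d * y) / den)
    variances.append(1.0 / den)

# Stage 2: fixed-effect inverse-variance pooling of the study slopes
# (a random-effects version would add a between-study variance component).
wts = 1.0 / np.array(variances)
pooled = float(np.sum(wts * np.array(slopes)) / np.sum(wts))
```

A mixed-effects formulation, as in drmeta, replaces stage 2 with study-specific random coefficients, which is what allows both the average and the study-specific dose–response curves mentioned above to be predicted and plotted.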

