Improving Data Quality for Environmental Fate Models:  A Least-Squares Adjustment Procedure for Harmonizing Physicochemical Properties of Organic Compounds

2005 ◽  
Vol 39 (21) ◽  
pp. 8434-8441 ◽  
Author(s):  
Urs Schenker ◽  
Matthew MacLeod ◽  
Martin Scheringer ◽  
Konrad Hungerbühler


2013 ◽
Vol 726-731 ◽  
pp. 175-178
Author(s):  
Zhi Min Cao ◽  
Zhen Zhen Wu ◽  
Zhi Fen Lin

There is an essential need to use computation-based quantitative structure–activity relationship (QSAR) modeling to provide information about the physicochemical properties of chemicals, their environmental fate, and their human health effects. The major aim of this paper is to explore ways to predict and identify hazardous combinations of chemicals relevant to humans and the environment. We therefore use QSAR modeling for toxicological predictions to determine the potential adverse effects of reactive organic compounds in risk assessment.
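The core of a QSAR model of this kind is a regression relating molecular descriptors to a toxicity endpoint. A minimal sketch of the idea, using ordinary least squares on hypothetical descriptor values and endpoints (the numbers below are illustrative, not measured data from the paper):

```python
import numpy as np

# Hypothetical descriptor matrix: columns are log Kow (hydrophobicity)
# and molecular weight / 100. Values are illustrative only.
X = np.array([
    [1.5, 0.8],
    [2.1, 1.2],
    [3.0, 1.5],
    [3.8, 2.0],
    [4.4, 2.3],
])
# Hypothetical toxicity endpoint, e.g. -log(EC50)
y = np.array([2.0, 2.9, 3.8, 4.9, 5.6])

# Add an intercept column and solve the least-squares problem
A = np.column_stack([np.ones(len(X)), X])
coef, _, _, _ = np.linalg.lstsq(A, y, rcond=None)

# Goodness of fit (coefficient of determination)
pred = A @ coef
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print("coefficients:", coef)
print("R^2:", round(r2, 3))
```

In practice, a QSAR used for regulatory risk assessment would involve many more descriptors, a much larger training set, and external validation of the model's applicability domain; this sketch only shows the underlying least-squares structure.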


2021 ◽  
Vol 5 (1) ◽  
pp. 59
Author(s):  
Gaël Kermarrec ◽  
Niklas Schild ◽  
Jan Hartmann

Terrestrial laser scanners (TLS) capture a large number of 3D points rapidly, with high precision and spatial resolution. These scanners are used for applications as diverse as modeling architectural or engineering structures and high-resolution mapping of terrain. The noise of the observations cannot be assumed to be strictly white: besides being heteroscedastic, correlations between observations are likely to appear due to the high scanning rate. Unfortunately, while the variance can sometimes be modeled from physical or empirical considerations, the correlations are more often neglected. Trustworthy knowledge of them is, however, mandatory to avoid overestimating the precision of the point cloud and, potentially, failing to detect deformation between scans recorded at different epochs using statistical testing strategies. TLS point clouds can be approximated with parametric surfaces, such as planes, using the Gauss–Helmert model, or with the newly introduced T-spline surfaces. In both cases, the goal is to minimize the squared distance between the observations and the approximating surface in order to estimate parameters such as the normal vector or the control points. In this contribution, we show how the residuals of the surface approximation can be used to derive the correlation structure of the observation noise. We estimate the correlation parameters using Whittle maximum likelihood and validate our methodology with simulations and real data. Using the least-squares adjustment as a “filter of the geometry” paves the way for the determination of a correlation model for many sensors recording 3D point clouds.
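The principle can be sketched in a few lines: fit a plane to a simulated TLS-like point cloud by least squares, then inspect the residuals' empirical autocorrelation. This is an assumed minimal setup, not the authors' code; the AR(1) noise along the scanning order stands in for the correlations induced by the high scanning rate.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500
x = rng.uniform(0, 10, n)
y = rng.uniform(0, 10, n)

# AR(1) noise along the scanning order mimics correlated observations
phi, sigma = 0.6, 0.01
eps = np.zeros(n)
for i in range(1, n):
    eps[i] = phi * eps[i - 1] + rng.normal(0, sigma)
z = 0.3 * x - 0.2 * y + 1.0 + eps  # true plane plus correlated noise

# Least-squares plane fit: z ~ a*x + b*y + c
A = np.column_stack([x, y, np.ones(n)])
params, *_ = np.linalg.lstsq(A, z, rcond=None)
res = z - A @ params

# Empirical lag-1 autocorrelation of the residuals: the plane has
# "filtered out" the geometry, leaving the noise correlation structure
r1 = np.corrcoef(res[:-1], res[1:])[0, 1]
print("plane parameters:", params)
print("lag-1 residual autocorrelation:", round(r1, 2))
```

Because the deterministic surface absorbs the geometry, the residuals expose the stochastic part; in the paper the correlation parameters are then estimated by Whittle maximum likelihood rather than from a single empirical lag as done here.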


2018 ◽  
Vol 10 (1) ◽  
Author(s):  
Kamel Mansouri ◽  
Chris M. Grulke ◽  
Richard S. Judson ◽  
Antony J. Williams

Sensors ◽  
2021 ◽  
Vol 21 (24) ◽  
pp. 8276
Author(s):  
Víctor Puente ◽  
Marta Folgueira

Very long baseline interferometry (VLBI) is the only space-geodetic technique that can directly determine the celestial pole offsets (CPO). In this paper, we use the CPO derived from global VLBI solutions to estimate empirical corrections to the main lunisolar nutation terms included in the IAU 2006/2000A precession–nutation model. In particular, we pay attention to two factors that affect the estimation of such corrections: the celestial reference frame used in the production of the global VLBI solutions and the stochastic model employed in the least-squares adjustment of the corrections. In both cases, we find that the choice of these aspects affects the estimated corrections by a few μas.
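The adjustment described above amounts to a weighted least-squares fit of in-phase and out-of-phase amplitudes of known nutation periods to the CPO series, with the stochastic model entering through the weights. A hedged sketch on synthetic data (the epochs, amplitudes, and formal errors below are hypothetical, not the paper's VLBI series):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 3000, 400)   # observation epochs in days (illustrative)
period = 365.26                  # one lunisolar term: the annual nutation
omega = 2 * np.pi / period

# Simulated CPO residuals: 30 uas in-phase, -10 uas out-of-phase, plus
# noise with epoch-dependent formal errors (heterogeneous stochastic model)
true_in, true_out = 30.0, -10.0
sigma = rng.uniform(5, 15, t.size)  # formal errors in uas
cpo = (true_in * np.sin(omega * t) + true_out * np.cos(omega * t)
       + rng.normal(0, sigma))

# Weighted least squares: weight each epoch by 1/sigma^2
A = np.column_stack([np.sin(omega * t), np.cos(omega * t)])
W = 1.0 / sigma**2
AtWA = A.T @ (A * W[:, None])
AtWy = A.T @ (W * cpo)
amp_in, amp_out = np.linalg.solve(AtWA, AtWy)
print(f"in-phase: {amp_in:.1f} uas, out-of-phase: {amp_out:.1f} uas")
```

Changing the weight model (e.g. equal weights versus formal-error weights) shifts the recovered amplitudes slightly, which illustrates how the choice of stochastic model can move the estimated corrections at the few-μas level reported in the abstract.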

