Ensemble Neural Networks for Modeling DEM Error

2019, Vol 8 (10), pp. 444
Author(s): Nguyen, Starek, Tissot, Cai, Gibeaut

Digital elevation models (DEMs) have become ubiquitous and remarkably effective in the earth sciences as a tool to characterize surface topography. All DEMs have a degree of inherent error and uncertainty that is propagated to subsequent models and analyses, which can lead to misinterpretation and inaccurate estimates. A new method was developed to estimate local DEM errors and implement corrections while quantifying the uncertainties of the implemented corrections. The method is based on the flexibility of ensemble neural networks (ENNs) and their ability to model complex problems. It was designed to be applicable to any DEM created from a corresponding set of elevation points (point cloud) and a set of ground truth measurements. The method was developed and tested using hyperspatial resolution terrestrial laser scanning (TLS) data (sub-centimeter point spacing) collected from a marsh site located along the southern portion of the Texas Gulf Coast, USA. ENNs improve the overall DEM accuracy in the study area by 68% for six model inputs and by 75% for 12 model inputs, corresponding to root mean square errors (RMSEs) of 0.056 and 0.045 m, respectively. The 12-input model provides more accurate tolerance interval estimates, particularly for vegetated areas. The accuracy of the method is confirmed on an independent data set. Although the method still underestimates the 95% tolerance interval, falling 8% below the target, the results show that it can quantify the spatial variability in uncertainty arising from the relationship between vegetation/land cover and DEM accuracy in the study area. Opportunities and challenges remain in improving and confirming the applicability of this method for different study sites and data sets.
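The core idea — train several independently initialized networks on point-cloud features, use the ensemble mean as the local error correction, and use the ensemble spread as the basis for uncertainty bounds — can be sketched in plain NumPy. The features, targets, and network sizes below are synthetic stand-ins, not the paper's actual six- or twelve-input models:

```python
import numpy as np

rng = np.random.default_rng(0)

def train_mlp(X, y, hidden=8, epochs=1500, lr=0.05, seed=0):
    """Train a tiny one-hidden-layer regressor with full-batch gradient descent."""
    r = np.random.default_rng(seed)
    W1 = r.normal(0.0, 0.5, (X.shape[1], hidden)); b1 = np.zeros(hidden)
    W2 = r.normal(0.0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    n = len(y)
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)                     # hidden activations
        err = (H @ W2 + b2).ravel() - y              # prediction residuals
        gW2 = H.T @ err[:, None] / n
        gb2 = err.mean(keepdims=True)
        gH = err[:, None] @ W2.T * (1.0 - H ** 2)    # backprop through tanh
        gW1 = X.T @ gH / n
        gb1 = gH.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2
    return lambda Xq: (np.tanh(Xq @ W1 + b1) @ W2 + b2).ravel()

# Synthetic stand-in: DEM error as a nonlinear function of two point-cloud features
X = rng.uniform(-1.0, 1.0, (200, 2))
y = 0.1 * np.sin(3.0 * X[:, 0]) + 0.05 * X[:, 1] ** 2

members = [train_mlp(X, y, seed=s) for s in range(5)]  # independently initialized nets
preds = np.stack([m(X) for m in members])              # shape: (members, points)
correction = preds.mean(axis=0)                        # local DEM error estimate
spread = preds.std(axis=0)                             # basis for tolerance intervals
rmse = float(np.sqrt(np.mean((correction - y) ** 2)))
```

The member-to-member spread is what lets an ENN attach a local tolerance interval to each correction rather than a single global figure.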

Author(s): Rui Xu, Donald C. Wunsch II

Classifying objects based on their features and characteristics is one of the most important and primitive activities of human beings. The task becomes even more challenging when no ground truth is available. Cluster analysis opens new opportunities for exploring the unknown nature of data through its aim to separate a finite data set, with little or no prior information, into a finite and discrete set of “natural,” hidden data structures. Here, the authors introduce and discuss clustering algorithms related to machine learning and computational intelligence, particularly those based on neural networks. Neural networks are well known for their learning capability, adaptability, ease of implementation, parallelization, speed, and flexibility, and they have demonstrated many successful applications in cluster analysis. Applications of cluster analysis to real-world problems are also illustrated. Portions of the chapter are taken from Xu and Wunsch (2008).
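As a minimal illustration of neural-network-based clustering, the competitive-learning sketch below treats each prototype as a neuron and moves the winning neuron toward each input; the data and initialization are invented for the example, not taken from the chapter:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two well-separated blobs with no labels
data = np.vstack([rng.normal(0.0, 0.3, (50, 2)),
                  rng.normal(3.0, 0.3, (50, 2))])

# Each prototype acts as a neuron; the winner moves toward the current input
prototypes = np.vstack([data[0], data[-1]])   # seed one prototype near each region
lr = 0.1
for epoch in range(20):
    for x in rng.permutation(data):
        winner = np.argmin(((prototypes - x) ** 2).sum(axis=1))
        prototypes[winner] += lr * (x - prototypes[winner])

# Assign each point to its nearest prototype
labels = np.argmin(((data[:, None, :] - prototypes[None]) ** 2).sum(axis=2), axis=1)
```

Winner-take-all updates like this underlie several of the network-based clustering families the chapter surveys (e.g. self-organizing maps add a neighborhood function on top of the same rule).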


Healthcare, 2020, Vol 8 (2), pp. 181
Author(s): Patricia Melin, Julio Cesar Monica, Daniela Sanchez, Oscar Castillo

In this paper, a multiple ensemble neural network model with fuzzy response aggregation for the COVID-19 time series is presented. Ensemble neural networks are composed of a set of modules, which are used to produce several predictions under different conditions. The modules are simple neural networks. Fuzzy logic is then used to aggregate the responses of the predictor modules, thereby improving the final prediction by combining the modules' outputs in an intelligent way. Fuzzy logic handles the uncertainty in the process of making a final decision about the prediction. The complete model was tested on the COVID-19 time series for Mexico, at the level of both individual states and the whole country. The simulation results of the multiple ensemble neural network models with fuzzy response integration show very good predicted values on the validation data set. In fact, the prediction errors of the multiple ensemble neural networks are significantly lower than those of traditional monolithic neural networks, demonstrating the advantages of the proposed approach.
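A minimal sketch of the fuzzy response aggregation idea: each module's prediction is weighted by its membership in a "low recent error" fuzzy set, so unreliable modules contribute little to the final forecast. The triangular membership function and all numbers below are illustrative assumptions, not the paper's actual rule base:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def fuzzy_aggregate(preds, recent_errors):
    """Combine module predictions; lower recent error means higher firing strength."""
    e = np.asarray(recent_errors, dtype=float)
    e_norm = e / e.max()                      # scale errors to [0, 1]
    low = tri(e_norm, -0.5, 0.0, 0.5)         # membership in the "low error" set
    weights = low / low.sum()
    return float(np.dot(weights, np.asarray(preds, dtype=float)))

preds = [105.0, 98.0, 120.0]    # three module forecasts for the next time step
errors = [2.0, 1.0, 8.0]        # recent validation errors per module
forecast = fuzzy_aggregate(preds, errors)   # worst module gets zero weight here
```

With these numbers the third module's membership in "low error" is zero, so its outlying forecast of 120 is discarded entirely rather than merely down-weighted, which is the kind of robustness the fuzzy aggregation is meant to provide.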


2020
Author(s): Tuomas Yrttimaa, Ninni Saarinen, Ville Luoma, Topi Tanhuanpää, Ville Kankare, ...

The feasibility of terrestrial laser scanning (TLS) in characterizing standing trees has been frequently investigated, but less effort has been devoted to quantifying downed dead wood using TLS. To advance dead wood characterization with TLS, we collected TLS point clouds and downed dead wood information from 20 sample plots (32 m × 32 m in size) located in southern Finland. This data set can be used to develop new algorithms for downed dead wood detection and characterization, as well as to understand spatial patterns of downed dead wood in boreal forests.


Author(s): Oskar Allerbo, Rebecka Jörnsten

Non-parametric, additive models are able to capture complex data dependencies in a flexible, yet interpretable way. However, choosing the format of the additive components often requires non-trivial data exploration. Here, as an alternative, we propose PrAda-net, a one-hidden-layer neural network, trained with proximal gradient descent and adaptive lasso. PrAda-net automatically adjusts the size and architecture of the neural network to reflect the complexity and structure of the data. The compact network obtained by PrAda-net can be translated to additive model components, making it suitable for non-parametric statistical modelling with automatic model selection. We demonstrate PrAda-net on simulated data, where we compare the test error performance, variable importance, and variable subset identification properties of PrAda-net to other lasso-based regularization approaches for neural networks. We also apply PrAda-net to the massive U.K. black smoke data set, to demonstrate how PrAda-net can be used to model complex and heterogeneous data with spatial and temporal components. In contrast to classical, statistical non-parametric approaches, PrAda-net requires no preliminary modeling to select the functional forms of the additive components, yet still results in an interpretable model representation.
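PrAda-net's two training ingredients, proximal gradient descent and the adaptive lasso, can be illustrated on a plain linear model (a deliberate simplification of the one-hidden-layer network): a pilot estimate sets per-weight penalties, and each gradient step is followed by the l1 proximal operator (soft-thresholding), which zeroes out irrelevant weights exactly:

```python
import numpy as np

def soft_threshold(w, t):
    """Proximal operator of the weighted l1 norm: shrink toward zero,
    zeroing small weights exactly rather than merely making them small."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
true_w = np.array([2.0, 0.0, 0.0, -1.0, 0.0])      # only two relevant inputs
y = X @ true_w + 0.01 * rng.normal(size=100)

w_init = np.linalg.lstsq(X, y, rcond=None)[0]      # pilot estimate
lr, lam = 0.1, 0.1
penalties = lam / (np.abs(w_init) + 1e-8)          # adaptive lasso: small pilot weight -> heavy penalty

w = np.zeros(5)
for _ in range(500):
    grad = X.T @ (X @ w - y) / len(y)              # gradient of the squared loss
    w = soft_threshold(w - lr * grad, lr * penalties)   # proximal gradient step
```

The adaptive penalties shrink the irrelevant coefficients to exactly zero while leaving the relevant ones nearly unbiased; in PrAda-net the same mechanism prunes hidden units, which is what shapes the network architecture automatically.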


2020, Vol 9 (11), pp. e7509119806
Author(s): Leandro Soares Santos, Moysés Naves de Moraes, Julia Dos Santos Lopes, Luciana Carolina Bauer, Paulo Bonomo, ...

Thermophysical properties are important in the design, simulation, optimization, and control of food processing. Predicting them is therefore valuable, but the theoretical basis is difficult, so empirical models are commonly used. In this work, neural network modeling was applied as an alternative for predicting the density, thermal conductivity, and thermal diffusivity of jackfruit, genipap, and umbu from temperature and moisture content. Data sets from the literature were used, both combined and individually, to obtain four networks. Supervised multilayer perceptron networks were developed using the back-propagation algorithm. Several configurations of artificial neural networks (ANNs) were evaluated, with one or two hidden layers and a maximum of 21 and 12 neurons in each, respectively. The data sets were divided into learning (60%) and verification (40%) steps. The best ANNs were chosen based on the correlation coefficient and root mean square error (RMSE) and compared with polynomial models using average absolute deviations (AADs). From the total available data set, the best ANN developed has one hidden layer with 15 neurons and shows the same predictive ability as the ANNs created from the individual fruit data sets, presenting similar RMSE and correlation coefficients. The ANNs developed present AADs close to those of the polynomial models and appear as an alternative to conventional modeling. The results indicate that the ANN created from the total data set can replace nine polynomial models to predict the thermophysical properties of jackfruit, genipap, and umbu pulps.
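The selection criteria mentioned above (RMSE, correlation coefficient, AAD) and the 60/40 learning/verification split are straightforward to compute; the sketch below uses made-up property values purely to show the formulas:

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean square error between measured and predicted values."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def corr(y_true, y_pred):
    """Pearson correlation coefficient."""
    return float(np.corrcoef(y_true, y_pred)[0, 1])

def aad(y_true, y_pred):
    """Average absolute deviation as a percentage of the measured value."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100.0)

# 60/40 learning/verification split, as in the study
rng = np.random.default_rng(0)
idx = rng.permutation(100)
learn_idx, verify_idx = idx[:60], idx[60:]

# Made-up thermal-conductivity values, W/(m K), for illustration only
measured = np.array([0.52, 0.55, 0.50, 0.58])
predicted = np.array([0.53, 0.54, 0.51, 0.57])
```

Using both RMSE/correlation for model selection and AAD for comparison against the polynomial baselines keeps the two comparisons independent, which is presumably why the study separates them.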


Author(s): D. E. Becker

An efficient, robust, and widely applicable technique is presented for computational synthesis of high-resolution, wide-area images of a specimen from a series of overlapping partial views. This technique can also be used to combine the results of various forms of image analysis, such as segmentation, automated cell counting, deblurring, and neuron tracing, to generate representations that are equivalent to processing the large wide-area image rather than the individual partial views. This can be a first step towards quantitation of the higher-level tissue architecture. The computational approach overcomes mechanical limitations of microscope stages, such as hysteresis and backlash. It also automates a procedure that is currently done manually. One application is the high-resolution visualization and/or quantitation of large batches of specimens that are much wider than the field of view of the microscope.

The automated montage synthesis begins by computing a concise set of landmark points for each partial view. The type of landmarks used can vary greatly depending on the images of interest. In many cases, image analysis performed on each data set can provide useful landmarks. Even when no such “natural” landmarks are available, image processing can often provide useful landmarks.
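Under the simplest possible model — a pure translation between overlapping views — matched landmark points determine the offset directly, and averaging over several matches suppresses stage-positioning noise. The coordinates below are invented for illustration:

```python
import numpy as np

def estimate_offset(landmarks_a, landmarks_b):
    """Estimate the translation mapping view B into view A's frame
    from matched landmark coordinates (pure translation model)."""
    return np.mean(np.asarray(landmarks_a) - np.asarray(landmarks_b), axis=0)

def to_global(points, offset):
    """Map a view's local coordinates into the shared montage frame."""
    return np.asarray(points) + offset

# Matched landmarks seen in both overlapping views (pixel coordinates)
in_view_a = np.array([[120.0, 35.0], [200.0, 80.0], [150.0, 60.0]])
in_view_b = np.array([[20.0, 30.0], [100.0, 75.0], [50.0, 55.0]])

offset = estimate_offset(in_view_a, in_view_b)   # here exactly [100, 5]
stitched = to_global(in_view_b, offset)          # B's landmarks land on A's
```

A real montage system would add rotation/scale terms and outlier rejection for mismatched landmarks, but the translation-only case already shows why averaging over landmarks beats trusting the stage readout.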


2020
Author(s): Jingbai Li, Patrick Reiser, André Eberhard, Pascal Friederich, Steven Lopez

<p>Photochemical reactions are being increasingly used to construct complex molecular architectures with mild and straightforward reaction conditions. Computational techniques are increasingly important to understand the reactivities and chemoselectivities of photochemical isomerization reactions because they offer molecular bonding information along the excited state(s) of photodynamics. These photodynamics simulations are resource-intensive and are typically limited to 1–10 picoseconds and 1,000 trajectories due to high computational cost. Most organic photochemical reactions have excited-state lifetimes exceeding 1 picosecond, which places them beyond the reach of such computational studies. Westermayr <i>et al.</i> demonstrated that a machine learning approach could significantly lengthen photodynamics simulation times for a model system, methylenimmonium cation (CH<sub>2</sub>NH<sub>2</sub><sup>+</sup>).</p><p>We have developed a Python-based code, Python Rapid Artificial Intelligence <i>Ab Initio</i> Molecular Dynamics (PyRAI<sup>2</sup>MD), to accomplish the unprecedented 10 ns <i>cis-trans</i> photodynamics of <i>trans</i>-hexafluoro-2-butene (CF<sub>3</sub>–CH=CH–CF<sub>3</sub>) in 3.5 days. The same simulation would take approximately 58 years with ground-truth multiconfigurational dynamics. We proposed an innovative scheme combining Wigner sampling, geometrical interpolations, and short-time quantum chemical trajectories to effectively sample the initial data, facilitating adaptive sampling to generate an informative and data-efficient training set with 6,232 data points. Our neural networks achieved chemical accuracy (mean absolute error of 0.032 eV). Our 4,814 trajectories reproduced the S<sub>1</sub> half-life (60.5 fs), the photochemical product ratio (<i>trans</i>: <i>cis</i> = 2.3: 1), and autonomously discovered a pathway towards a carbene. 
The neural networks have also shown the capability of generalizing the full potential energy surface with chemically incomplete data (<i>trans</i> → <i>cis</i> but not <i>cis</i> → <i>trans</i> pathways) that may offer future automated photochemical reaction discoveries.</p>
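One common way to build such a data-efficient training set (a generic query-by-committee criterion, not necessarily PyRAI<sup>2</sup>MD's exact implementation) is to flag geometries where an ensemble of networks disagrees by more than the target accuracy and send only those back to the expensive reference method. The 0.032 eV threshold below simply reuses the MAE quoted above as an illustrative cutoff, and the energies are made up:

```python
import numpy as np

def needs_label(ensemble_energies, threshold=0.032):
    """Flag geometries where committee members disagree by more than the
    target accuracy, so only those are sent to the QM reference method."""
    spread = np.std(ensemble_energies, axis=0)   # eV, across committee members
    return spread > threshold

# Predicted energies (eV) from 3 committee networks for 4 sampled geometries
preds = np.array([
    [-1.20, -0.85, -0.40, -0.10],
    [-1.21, -0.86, -0.55, -0.11],
    [-1.19, -0.84, -0.25, -0.09],
])
flags = needs_label(preds)   # only the third geometry triggers relabeling
```

Concentrating the expensive multiconfigurational calculations on high-disagreement geometries is what makes a 6,232-point training set sufficient for nanosecond-scale dynamics.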


Author(s): Eun-Young Mun, Anne E. Ray

Integrative data analysis (IDA) is a promising new approach in psychological research and has been well received in the field of alcohol research. This chapter provides a larger unifying research synthesis framework for IDA. Major advantages of IDA of individual participant-level data include better and more flexible ways to examine subgroups, model complex relationships, deal with methodological and clinical heterogeneity, and examine infrequently occurring behaviors. However, between-study heterogeneity in measures, designs, and samples and systematic study-level missing data are significant barriers to IDA and, more broadly, to large-scale research synthesis. Based on the authors’ experience working on the Project INTEGRATE data set, which combined individual participant-level data from 24 independent college brief alcohol intervention studies, it is also recognized that IDA investigations require a wide range of expertise and considerable resources and that some minimum standards for reporting IDA studies may be needed to improve transparency and quality of evidence.


2019, Vol 11 (10), pp. 1157
Author(s): Jorge Fuentes-Pacheco, Juan Torres-Olivares, Edgar Roman-Rangel, Salvador Cervantes, Porfirio Juarez-Lopez, ...

Crop segmentation is an important task in Precision Agriculture, where the use of aerial robots with an on-board camera has contributed to the development of new solution alternatives. We address the problem of fig plant segmentation in top-view RGB (Red-Green-Blue) images of a crop grown under difficult open-field conditions: complex lighting and non-ideal crop maintenance practices defined by local farmers. We present a Convolutional Neural Network (CNN) with an encoder-decoder architecture that classifies each pixel as crop or non-crop using only raw colour images as input. Our approach achieves a mean accuracy of 93.85% despite the complexity of the background and the highly variable visual appearance of the leaves. We make our CNN code available to the research community, along with the aerial image data set and a hand-made ground-truth segmentation with pixel precision, to facilitate comparison among different algorithms.
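For contrast with the learned CNN, a naive hand-tuned baseline for crop/non-crop pixel classification can be written in a few lines — exactly the kind of brittle colour rule that fails under the complex lighting the paper describes. The threshold and toy image below are invented for illustration:

```python
import numpy as np

def greenness_baseline(rgb, threshold=1.1):
    """Naive per-pixel crop mask: green channel dominating red and blue.
    An encoder-decoder CNN replaces this hand-tuned rule with learned features."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return g > threshold * np.maximum(r, b)

def mean_accuracy(pred, truth):
    """Fraction of pixels classified correctly against the ground truth."""
    return float((pred == truth).mean())

# Tiny 2x2 toy image: two green "crop" pixels, soil-coloured background
img = np.array([[[ 60, 120,  50], [140, 120, 100]],
                [[ 70, 150,  60], [130, 110,  90]]], dtype=float)
truth = np.array([[True, False], [True, False]])

mask = greenness_baseline(img)
acc = mean_accuracy(mask, truth)
```

On this hand-picked toy image the rule is perfect, but shadows, specular highlights, and yellowing leaves break fixed colour thresholds in the field — which is the gap the pixel-wise CNN is trained to close.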

