Accelerating Additive Design with Probabilistic Machine Learning

Author(s):  
Yiming Zhang ◽  
Sreekar Karnati ◽  
Soumya Nag ◽  
Neil Johnson ◽  
Genghis Khan ◽  
...  

Abstract Additive manufacturing (AM) has been growing rapidly and is transforming industrial applications. However, the fundamental mechanisms of AM are not yet fully understood, which has resulted in low build success rates. One remedy is to introduce surrogate modeling based on experimental datasets to assist additive design and increase design efficiency. As one of the first papers on predictive modeling of AM, particularly Directed Energy Deposition (DED), this paper discusses a bidirectional modeling framework and its application to multiple DED benchmark designs, including (1) forward prediction with cross validation, (2) global sensitivity analyses, (3) backward prediction and optimization, and (4) intelligent data addition. Approximately 1,150 mechanical tensile test samples were extracted and tested, with input variables drawn from machine parameters and post-processing, and output variables from mechanical, microstructural, and physical properties.
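The forward-prediction-with-cross-validation step can be sketched minimally. The data, the surrogate form (plain linear least squares), and the fold count below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def kfold_cv_rmse(X, y, k=5, seed=0):
    """Average held-out RMSE of a linear least-squares surrogate over k folds."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(y)), k)
    errs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        A = np.column_stack([np.ones(len(train)), X[train]])
        coef, *_ = np.linalg.lstsq(A, y[train], rcond=None)
        pred = np.column_stack([np.ones(len(test)), X[test]]) @ coef
        errs.append(np.sqrt(np.mean((pred - y[test]) ** 2)))
    return float(np.mean(errs))

# Hypothetical data: a tensile property driven by two process parameters
rng = np.random.default_rng(1)
X = rng.uniform(size=(200, 2))
y = 3.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(scale=0.05, size=200)
cv_rmse = kfold_cv_rmse(X, y)
```

Cross-validated RMSE, unlike training error, penalizes surrogates that merely memorize the roughly 1,150 test samples.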

2020 ◽  
Vol 24 (1) ◽  
pp. 47-56
Author(s):  
Ove Oklevik ◽  
Grzegorz Kwiatkowski ◽  
Mona Kristin Nytun ◽  
Helene Maristuen

The quality of any economic impact assessment largely depends on the adequacy of the input variables and chosen assumptions. This article presents a direct economic impact assessment of a music festival hosted in Norway and sensitivity analyses of two study design assumptions: estimated number of attendees and chosen definition (size) of the affected area. Empirically, the article draws on a state-of-the-art framework of an economic impact analysis and uses primary data from 471 event attendees. The results show that, first, an economic impact analysis is a complex task that requires high precision in assessing different monetary flows entering and leaving the host region, and second, the study design assumptions exert a tremendous influence on the final estimation. Accordingly, the study offers a fertile agenda for local destination marketing organizers and event managers on how to conduct reliable economic impact assessments and explains which elements of such analyses are particularly important for final estimations.
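The two design assumptions studied (attendee count and size of the affected area) enter a direct impact calculation roughly as follows; all figures below are hypothetical illustrations, not the festival's data:

```python
# Minimal sketch of a direct economic impact calculation and its sensitivity
# to two study-design assumptions. All numbers are hypothetical.

def direct_impact(attendees, avg_spend, share_new_money):
    """New money injected into the host region by event visitors."""
    return attendees * avg_spend * share_new_money

base = direct_impact(attendees=10_000, avg_spend=150.0, share_new_money=0.6)

# Fewer attendees scale the estimate directly; a wider affected area lowers
# the share of spending that counts as "new" money to the region.
low_attendance = direct_impact(8_000, 150.0, 0.6)
wide_region = direct_impact(10_000, 150.0, 0.4)
```

Both assumptions shift the final figure multiplicatively, which is why the article finds they exert such influence on the estimate.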


2021 ◽  
pp. 002224372110329
Author(s):  
Nicolas Padilla ◽  
Eva Ascarza

The success of Customer Relationship Management (CRM) programs ultimately depends on the firm's ability to identify and leverage differences across customers — a very difficult task when firms attempt to manage new customers, for whom only the first purchase has been observed. For those customers, the lack of repeated observations poses a structural challenge to inferring unobserved differences across them. This is what we call the “cold start” problem of CRM, whereby companies have difficulties leveraging existing data when they attempt to make inferences about customers at the beginning of their relationship. We propose a solution to the cold start problem by developing a probabilistic machine learning modeling framework that leverages the information collected at the moment of acquisition. The main aspect of the model is that it flexibly captures latent dimensions that govern the behaviors observed at acquisition as well as future propensities to buy and to respond to marketing actions using deep exponential families. The model can be integrated with a variety of demand specifications and is flexible enough to capture a wide range of heterogeneity structures. We validate our approach in a retail context and empirically demonstrate the model's ability to identify high-value customers as well as those most sensitive to marketing actions, right after their first purchase.


2003 ◽  
Vol 24 (3) ◽  
pp. 214-223 ◽  
Author(s):  
Nicholas Graves ◽  
Tanya M. Nicholls ◽  
Arthur J. Morris

Abstract Objective: To model the economic costs of hospital-acquired infections (HAIs) in New Zealand, by type of HAI. Design: Monte Carlo simulation model. Setting: Auckland District Health Board Hospitals (DHBH), the largest publicly funded hospital group in New Zealand supplying secondary and tertiary services. Costs are also estimated for predicted HAIs in admissions to all hospitals in New Zealand. Patients: All adults admitted to general medical and general surgical services. Method: Data on the number of cases of HAI were combined with data on the estimated prolongation of hospital stay due to HAI to produce an estimate of the number of bed days attributable to HAI. A cost per bed day value was applied to provide an estimate of the economic cost. Costs were estimated for predicted infections of the urinary tract, surgical wounds, the lower and upper respiratory tracts, the bloodstream, and other sites, and for cases of multiple sites of infection. Sensitivity analyses were undertaken for input variables. Results: The estimated costs of predicted HAIs in medical and surgical admissions to Auckland DHBH were $10.12 (US $4.56) million and $8.64 (US $3.90) million, respectively. They were $51.35 (US $23.16) million and $85.26 (US $38.47) million, respectively, for medical and surgical admissions to all hospitals in New Zealand. Conclusions: The method used produces results that are less precise than those of a specifically designed study using primary data collection, but has been applied at a lower cost. The estimated cost of HAIs is substantial, but only a proportion of infections can be avoided. Further work is required to identify the most cost-effective strategies for the prevention of HAI.
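The bed-day costing logic (cases × prolongation of stay × cost per bed day) lends itself to a short Monte Carlo sketch. The three triangular distributions below are hypothetical illustrations, not the study's fitted inputs:

```python
import numpy as np

# Monte Carlo sketch of HAI costing: cases x extra bed days x cost per bed day.
# All three input distributions are hypothetical.
rng = np.random.default_rng(42)
n = 10_000
n_cases = rng.triangular(800, 1000, 1200, n)     # predicted HAIs per year
extra_days = rng.triangular(3.0, 5.0, 9.0, n)    # stay prolongation per HAI
cost_per_day = rng.triangular(400, 500, 700, n)  # NZ$ per bed day

total_cost = n_cases * extra_days * cost_per_day
median_cost = float(np.median(total_cost))
ci90 = np.percentile(total_cost, [5, 95])        # basis for sensitivity bounds
```

Sampling the inputs jointly, rather than multiplying point estimates, is what allows the sensitivity analyses to bound the cost estimate.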


SOIL ◽  
2016 ◽  
Vol 2 (4) ◽  
pp. 647-657 ◽  
Author(s):  
Sami Touil ◽  
Aurore Degre ◽  
Mohamed Nacer Chabaca

Abstract. Improving the accuracy of pedotransfer functions (PTFs) requires studying how prediction uncertainty can be apportioned to different sources of uncertainty in inputs. In this study, the question addressed was as follows: which input variable is the main or best complementary predictor of water retention, and at which water potential? Two approaches were adopted to generate PTFs: multiple linear regressions (MLRs) for point PTFs and multiple nonlinear regressions (MNLRs) for parametric PTFs. Reliability tests showed that point PTFs provided better estimates than parametric PTFs (root mean square error, RMSE: 0.0414 and 0.0444 cm3 cm−3, and 0.0613 and 0.0605 cm3 cm−3 at −33 and −1500 kPa, respectively). The local parametric PTFs provided better estimates than Rosetta PTFs at −33 kPa. No significant difference in accuracy, however, was found between the parametric PTFs and Rosetta H2 at −1500 kPa, with RMSE values of 0.0605 cm3 cm−3 and 0.0636 cm3 cm−3, respectively. The results of global sensitivity analyses (GSAs) showed that the mathematical formalism of PTFs and their input variables reacted differently in terms of point pressure and texture. The point and parametric PTFs were sensitive mainly to the sand fraction in the fine- and medium-textural classes. The use of clay percentage (C %) and bulk density (BD) as inputs in the medium-textural class improved the estimation of PTFs at −33 kPa.
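A point PTF of the MLR kind can be sketched in a few lines: regress water content at a fixed potential on texture and bulk density. The synthetic soil data and coefficients below are illustrative only, not the study's fitted PTFs:

```python
import numpy as np

# Minimal point-PTF sketch: multiple linear regression predicting volumetric
# water content at -33 kPa from sand %, clay %, and bulk density.
# Synthetic data; the generating coefficients are hypothetical.
rng = np.random.default_rng(0)
n = 300
sand = rng.uniform(10, 80, n)   # %
clay = rng.uniform(5, 50, n)    # %
bd = rng.uniform(1.1, 1.7, n)   # g cm-3
theta33 = 0.45 - 0.003 * sand + 0.002 * clay - 0.10 * bd + rng.normal(0, 0.01, n)

# Fit theta33 ~ intercept + sand + clay + BD by ordinary least squares
A = np.column_stack([np.ones(n), sand, clay, bd])
coef, *_ = np.linalg.lstsq(A, theta33, rcond=None)
rmse = float(np.sqrt(np.mean((A @ coef - theta33) ** 2)))
```

RMSE in cm3 cm−3, as in the abstract, is the natural reliability metric for comparing such fits against Rosetta.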


2020 ◽  
Vol 10 (2) ◽  
pp. 472 ◽  
Author(s):  
Amir Mahdiyar ◽  
Danial Jahed Armaghani ◽  
Mohammadreza Koopialipoor ◽  
Ahmadreza Hedayat ◽  
Arham Abdullah ◽  
...  

Peak particle velocity (PPV) is a critical parameter for evaluating the impact of blasting operations on nearby structures and buildings. Accurate estimation of the PPV resulting from a blasting operation, and its comparison with the allowable ranges, is an integral part of blasting design. In this study, four quarry sites in Malaysia were considered, and the PPV was simulated using gene expression programming (GEP) and Monte Carlo simulation techniques. Data from 149 blasting operations were gathered, and as a result of this study, a PPV predictive model was developed using GEP to be used in the simulation. To ensure that all combinations of input variables were considered, 10,000 iterations were performed, accounting for the correlations among the input variables. The simulation results demonstrate that the minimum and maximum PPV values were 1.13 mm/s and 34.58 mm/s, respectively. Two types of sensitivity analyses were performed to determine the sensitivity of the PPV results to the effective variables. In addition, this study proposes a method specific to the four case studies and presents an approach that could be readily applied to similar applications with different conditions.
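The study's GEP model is not reproduced here; as a stand-in, the classical scaled-distance attenuation law PPV = K·(D/√Q)^(−B) illustrates the Monte Carlo step. K, B, and the input distributions below are hypothetical:

```python
import numpy as np

# Monte Carlo PPV sketch using the scaled-distance attenuation law
# PPV = K * (D / sqrt(Q))**(-B) in place of the study's GEP model.
# Site constants K, B and both input distributions are hypothetical.
rng = np.random.default_rng(7)
n = 10_000
K, B = 1100.0, 1.6
D = rng.uniform(100, 500, n)   # distance from blast to structure, m
Q = rng.uniform(50, 400, n)    # maximum charge per delay, kg

ppv = K * (D / np.sqrt(Q)) ** (-B)   # mm/s
ppv_min, ppv_max = float(ppv.min()), float(ppv.max())
```

Comparing the simulated PPV range against allowable vibration limits is then a direct percentile lookup on the sampled values.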


2011 ◽  
Vol 74 (9) ◽  
pp. 1422-1433 ◽  
Author(s):  
CHARLES C. DODD ◽  
MICHAEL W. SANDERSON ◽  
MEGAN E. JACOB ◽  
DAVID G. RENTER

Field studies evaluating the effects of multiple concurrent preharvest interventions for Escherichia coli O157 are logistically and economically challenging; however, modeling techniques may provide useful information on these effects while also identifying crucial information gaps that can guide future research. We constructed a risk assessment model with data obtained from a systematic search of scientific literature. Parameter distributions were incorporated into a stochastic Monte Carlo modeling framework to examine the impacts of different combinations of preharvest and harvest interventions for E. coli O157 on the risk of beef carcass contamination. We estimated the risk of E. coli O157 carcass contamination conditional on preharvest fecal prevalence estimates, inclusion of feed additive(s) in the diet, vaccination for E. coli O157, transport and lairage effects, hide intervention(s), and carcass intervention(s). Prevalence parameters for E. coli O157 were assumed to encompass potential effects of concentration; therefore, concentration effects were not specifically evaluated in this study. Sensitivity analyses revealed that fecal prevalence, fecal-to-hide transfer, hide-to-carcass transfer, and carcass intervention efficacy significantly affected the risk of carcass contamination (correlation coefficients of 0.37, 0.56, 0.58, and −0.29, respectively). The results indicated that combinations of preharvest interventions may be particularly important for supplementing harvest interventions during periods of higher variability in fecal shedding prevalence (i.e., summer). Further assessments of the relationships among fecal prevalence and concentration, hide contamination, and subsequent carcass contamination are needed to further define risks and intervention impacts for E. coli O157 contamination of beef.
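Correlation-based sensitivity of a multiplicative contamination chain, of the kind reported above, can be sketched as follows. The chain structure and all distributions are illustrative assumptions, not the study's fitted risk model:

```python
import numpy as np

def ranks(a):
    """Rank-transform (0..n-1) for a continuous sample (no ties expected)."""
    r = np.empty(len(a), dtype=float)
    r[np.argsort(a)] = np.arange(len(a))
    return r

# Toy multiplicative chain standing in for the carcass contamination model;
# every distribution here is hypothetical.
rng = np.random.default_rng(3)
n = 5_000
fecal = rng.uniform(0.01, 0.5, n)    # fecal prevalence
hide = rng.uniform(0.1, 0.9, n)      # fecal-to-hide transfer
carcass = rng.uniform(0.1, 0.9, n)   # hide-to-carcass transfer
interv = rng.uniform(0.3, 0.95, n)   # carcass intervention efficacy
risk = fecal * hide * carcass * (1 - interv)

# Spearman-style sensitivity: correlate ranked inputs with ranked risk
rr = ranks(risk)
sens = {name: float(np.corrcoef(ranks(x), rr)[0, 1])
        for name, x in [("fecal", fecal), ("hide", hide),
                        ("carcass", carcass), ("intervention", interv)]}
```

The signs mirror the abstract's findings: transfer steps correlate positively with contamination risk, intervention efficacy negatively.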


2007 ◽  
Vol 37 (11) ◽  
pp. 2080-2089 ◽  
Author(s):  
E. Louise Loudermilk ◽  
Wendell P. Cropper

Few longleaf pine (Pinus palustris Mill.) ecosystems remain in the southeastern coastal plain of the United States. Restoration and maintenance of these remaining habitats requires an understanding of ecosystem processes at multiple scales. The focus of this study was to develop and evaluate a modeling framework for analyzing longleaf pine dynamics at the spatially explicit landscape scale and at the spatially implicit population scale. The landscape disturbance and succession (LANDIS) model was used to simulate landscape fire dynamics in a managed forest in north-central Florida. We constructed a density-dependent longleaf pine population matrix model using data from a variety of studies across the southeastern United States to extend an existing model. Sensitivity analyses showed that the most sensitive parameters were those from the original pine model, which was based on extensive observations of individual trees. A hybrid approach integrated the two models: the fire frequencies output from the LANDIS model were input to the matrix model for specific longleaf pine populations. These simulations indicated that small isolated longleaf pine populations are more vulnerable to fire suppression and that landscape connectivity is a critical concern. A frequent prescribed fire regime is nonetheless necessary to maintain even large longleaf pine sandhill communities that have better landscape connectivity.
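The matrix-model half of the hybrid approach can be sketched with a toy stage matrix whose seedling survival depends on fire frequency. The stages, rates, and fire effect below are illustrative, not the study's fitted values:

```python
import numpy as np

# Toy 3-stage matrix (seedling, juvenile, adult) for a fire-maintained pine
# population; all rates are hypothetical, not the study's parameters.
def growth_rate(fire_weight):
    """Dominant eigenvalue (lambda) of the stage matrix; frequent fire
    (fire_weight -> 1) boosts seedling survival by reducing competition."""
    s_seedling = 0.01 + 0.29 * fire_weight
    A = np.array([
        [0.0,        0.0, 10.0],   # adult fecundity -> new seedlings
        [s_seedling, 0.6,  0.0],   # seedling -> juvenile; juvenile stasis
        [0.0,        0.1,  0.95],  # juvenile -> adult; adult survival
    ])
    return float(np.max(np.abs(np.linalg.eigvals(A))))

lam_frequent_fire = growth_rate(1.0)   # frequent prescribed fire
lam_suppressed = growth_rate(0.0)      # fire suppression
```

A dominant eigenvalue above one means a growing population, below one a declining one, which is how fire suppression scenarios translate into population vulnerability.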


PLoS Medicine ◽  
2021 ◽  
Vol 18 (4) ◽  
pp. e1003585
Author(s):  
Kyra H. Grantz ◽  
Elizabeth C. Lee ◽  
Lucy D’Agostino McGowan ◽  
Kyu Han Lee ◽  
C. Jessica E. Metcalf ◽  
...  

Background: Test-trace-isolate programs are an essential part of Coronavirus Disease 2019 (COVID-19) control that offer a more targeted approach than many other nonpharmaceutical interventions. Effective use of such programs requires methods to estimate their current and anticipated impact. Methods and findings: We present a mathematical modeling framework to evaluate the expected reductions in the reproductive number, R, from test-trace-isolate programs. This framework is implemented in a publicly available R package and an online application. We evaluated the effects of completeness in case detection and contact tracing and speed of isolation and quarantine using parameters consistent with COVID-19 transmission (R0: 2.5, generation time: 6.5 days). We show that R is most sensitive to changes in the proportion of cases detected in almost all scenarios, and other metrics have a reduced impact when case detection levels are low (<30%). Although test-trace-isolate programs can contribute substantially to reducing R, exceptional performance across all metrics is needed to bring R below one through test-trace-isolate alone, highlighting the need for comprehensive control strategies. Results from this model also indicate that metrics used to evaluate performance of test-trace-isolate, such as the proportion of identified infections among traced contacts, may be misleading. While estimates of the impact of test-trace-isolate are sensitive to assumptions about COVID-19 natural history and adherence to isolation and quarantine, our qualitative findings are robust across numerous sensitivity analyses. Conclusions: Effective test-trace-isolate programs first need to be strong in the “test” component, as case detection underlies all other program activities. Even moderately effective test-trace-isolate programs are an important tool for controlling the COVID-19 pandemic and can alleviate the need for more restrictive social distancing measures.
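Why R is most sensitive to case detection can be seen in a toy reduction formula (not the paper's model): detection enters both the isolation term and, via tracing, the quarantine term. All parameter values below are hypothetical:

```python
def r_effective(r0, p_detect, p_trace, f_isolation, f_quarantine):
    """Toy sketch, not the paper's model: detected cases avert a fraction
    f_isolation of their onward transmission; traced contacts of detected
    cases avert a fraction f_quarantine of theirs."""
    return (r0
            * (1 - p_detect * f_isolation)
            * (1 - p_detect * p_trace * f_quarantine))

# Hypothetical program performance with R0 = 2.5 (as in the abstract)
r = r_effective(r0=2.5, p_detect=0.5, p_trace=0.8,
                f_isolation=0.7, f_quarantine=0.8)

# Halving detection degrades both factors at once, so R rises sharply
r_low_detect = r_effective(2.5, 0.25, 0.8, 0.7, 0.8)
```

Because `p_detect` multiplies both reduction factors, no amount of tracing or quarantine performance can compensate for low detection, matching the paper's qualitative finding.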


2020 ◽  
Vol 2020 ◽  
pp. 1-15
Author(s):  
Trong-Ha Nguyen ◽  
Duy-Duan Nguyen

Steel-concrete composite (SCC) beams have been widely used in civil engineering and industrial structures. This kind of structure has advantages such as fast fabrication time and optimized weight. However, designers are often concerned only with the initial reliability, while over time the structural reliability is reduced, especially by metal corrosion. The objective of the paper is to assess the structural reliability of corroded SCC beams, in which the input parameters are treated as random variables. The SCC beam was designed according to Eurocode-4 (EC-4), with input parameters consisting of the cross-sectional dimensions of the beam, material properties, and applied loads. The effects of the random input variables on the reliability of structures are evaluated by sensitivity analyses, computed by global sensitivity analysis using Sobol’s method and Monte Carlo simulation. The reliability analysis algorithm developed in this study is verified against previous studies, highlighting the capability of the method used. Four corrosion levels (pristine, 10-year, 20-year, and 50-year) are considered in the sensitivity analyses of the SCC beam. Finally, a series of first-order and total-order Sobol’s indices are obtained to measure the sensitivity of the input parameters at the four corrosion levels.
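First-order Sobol indices of the kind computed above can be estimated with a Saltelli-style pick-freeze Monte Carlo scheme. The toy additive model below merely stands in for the EC-4 beam limit-state function; everything about it is illustrative:

```python
import numpy as np

def sobol_first_order(model, d, n=20_000, seed=0):
    """Saltelli pick-freeze estimator of first-order Sobol indices for a
    model with d independent U(0,1) inputs."""
    rng = np.random.default_rng(seed)
    A = rng.uniform(size=(n, d))
    B = rng.uniform(size=(n, d))
    fA, fB = model(A), model(B)
    var = np.var(np.concatenate([fA, fB]))
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]       # freeze all inputs except the i-th
        S[i] = np.mean(fB * (model(ABi) - fA)) / var
    return S

# Toy additive "resistance" model standing in for the EC-4 beam checks;
# exact first-order indices are 16/21, 4/21, 1/21.
model = lambda x: 4 * x[:, 0] + 2 * x[:, 1] + x[:, 2]
S = sobol_first_order(model, 3)
```

For an additive model the first-order indices sum to one; gaps between first-order and total-order indices would signal interaction effects among the corroded-beam inputs.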


Author(s):  
Zihan Wang ◽  
Hongyi Xu ◽  
Yang Li

Abstract Metal parts manufactured via the Powder Bed Fusion (PBF) process show great potential in industrial applications. The hierarchical, heterogeneous microstructure of PBF-built alloys poses a significant challenge to predicting structural performance. To enable computational engineering of this type of material, a multiscale microstructure modeling framework is proposed to predict stochastic material properties. AlSi10Mg built by Selective Laser Melting (SLM) is selected as the demonstrative example. At the microscale, the epitaxial granular structures are reconstructed from Scanning Electron Microscopy Electron Backscatter Diffraction (SEM EBSD) images. The microscale analysis provides property inputs for the mesoscale model, which captures the fish-scale-like melt pools at the millimeter scale. The predicted material properties are compared with experimental data for further calibration of the material constitutive models. One critical challenge is that some parameters in the material models cannot be obtained directly from experimental tests. In this work, we establish a machine-learning-based model calibration framework to predict the unknown material parameters. Furthermore, several machine learning methods are compared to shed light on their capability to capture the relation between input parameter values and the resultant prediction errors.
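The calibration idea of learning the relation between candidate parameter values and prediction error can be sketched with a stand-in simulator. The "simulator", the parameter name, and the quadratic error surrogate below are all hypothetical, not the paper's framework:

```python
import numpy as np

# Sketch of ML-based calibration: learn how prediction error varies with an
# unknown constitutive parameter, then pick the minimizer. The simulator and
# the experimental target are hypothetical stand-ins.
def simulator(hardening_exp):
    """Pretend FE prediction of a stress measure for a candidate parameter."""
    return 300.0 * (0.1 + hardening_exp) ** 0.5

experiment = simulator(0.22)   # observed response (hidden truth: 0.22)

candidates = np.linspace(0.05, 0.4, 15)
errors = (np.array([simulator(c) for c in candidates]) - experiment) ** 2

# Grid minimizer, plus a cheap quadratic surrogate of the error landscape
n_best = float(candidates[np.argmin(errors)])
a, b, c = np.polyfit(candidates, errors, 2)
n_calibrated = float(-b / (2 * a))   # analytic vertex of the fitted parabola
```

Replacing the quadratic surrogate with different regressors, and comparing how well each recovers the error landscape, mirrors the paper's comparison of machine learning methods.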

