Uncertainty Visualization of Transport Variance in a Time-Varying Ensemble Vector Field

2020 ◽  
Vol 9 (1) ◽  
pp. 19
Author(s):  
Ke Ren ◽  
Dezhan Qu ◽  
Shaobin Xu ◽  
Xufeng Jiao ◽  
Liang Tai ◽  
...  

Uncertainty analysis of a time-varying ensemble vector field is a challenging topic in geoscience. Due to the complex data structure, the uncertainty of a time-varying ensemble vector field is hard to quantify and analyze. Measuring the differences between pathlines is an effective way to compute the uncertainty. However, existing metrics are not accurate enough or are sensitive to outliers; thus, a comprehensive tool for further analysis of the uncertainty of transport patterns is required. In this paper, we propose a novel framework for quantifying and analyzing the uncertainty of an ensemble vector field. Based on the classical edit distance on real sequence (EDR) method, we propose a robust and accurate metric to measure pathline uncertainty. Considering spatial continuity, we compute the transport variance of the neighborhood of a location and evaluate the uncertainty correlation between each location and its neighborhood using the local Moran’s I. Based on the proposed uncertainty measurements, we developed a visual analysis system called UP-Vis (uncertainty pathline visualization) to interactively explore the uncertainty. It provides an overview of the uncertainty, supports detailed exploration of transport patterns at a selected location, and allows comparison of transport patterns between a location and its neighborhood. Through pathline clustering, the major trends of the ensemble pathlines at a location are extracted. Moreover, a glyph was designed to intuitively display the transport direction and divergence degree of each cluster. For the uncertainty analysis of the neighborhood, a comparison view was designed to compare the transport patterns between a location and its neighborhood in detail. A synthetic data set and a weather simulation data set were used in our experiments. The evaluation and case studies demonstrated that the proposed framework can measure the uncertainty effectively and help users comprehensively explore uncertain transport patterns.
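The pathline metric above builds on the classical EDR measure. A minimal sketch of EDR between two pathlines is shown below (pure Python; the 2D sample points and the matching tolerance `eps` are illustrative assumptions, not the paper's exact parameterization):

```python
import math

def edr(p, q, eps=0.5):
    """Edit Distance on Real sequence (EDR) between two pathlines.

    p, q: lists of (x, y) sample points; eps: matching tolerance.
    Returns the minimum number of edit operations (smaller = more similar).
    Because mismatches cost a fixed 1, EDR is robust to outlier points.
    """
    n, m = len(p), len(q)
    # dp[i][j] = EDR between the first i points of p and the first j of q
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        dp[i][0] = i
    for j in range(m + 1):
        dp[0][j] = j
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            match = 0 if math.dist(p[i - 1], q[j - 1]) <= eps else 1
            dp[i][j] = min(dp[i - 1][j - 1] + match,  # match / substitute
                           dp[i - 1][j] + 1,          # delete from p
                           dp[i][j - 1] + 1)          # insert from q
    return dp[n][m]
```

Identical pathlines score 0, and a single far-off sample adds only 1 to the distance, which is the outlier robustness the abstract contrasts against more sensitive metrics.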

Entropy ◽  
2021 ◽  
Vol 23 (6) ◽  
pp. 683
Author(s):  
Albert Podusenko ◽  
Wouter M. Kouw ◽  
Bert de Vries

Time-varying autoregressive (TVAR) models are widely used for modeling of non-stationary signals. Unfortunately, online joint adaptation of both states and parameters in these models remains a challenge. In this paper, we represent the TVAR model by a factor graph and solve the inference problem by automated message passing-based inference for states and parameters. We derive structured variational update rules for a composite “AR node” with probabilistic observations that can be used as a plug-in module in hierarchical models, for example, to model the time-varying behavior of the hyper-parameters of a time-varying AR model. Our method includes tracking of variational free energy (FE) as a Bayesian measure of TVAR model performance. The proposed methods are verified on a synthetic data set and validated on real-world data from temperature modeling and speech enhancement tasks.
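To make the joint state/parameter tracking problem concrete, the sketch below tracks time-varying AR coefficients with a random-walk Kalman filter. This is a generic state-space stand-in, not the paper's factor-graph variational message passing; the process/observation variances `q` and `r` are illustrative assumptions:

```python
import numpy as np

def tvar_kalman(y, order=2, q=1e-4, r=1e-2):
    """Track time-varying AR coefficients with a random-walk Kalman filter.

    State: AR coefficient vector theta_t, modeled as a random walk (var q).
    Observation: y[t] = theta_t @ [y[t-1], ..., y[t-order]] + noise (var r).
    Returns one coefficient estimate per time step.
    """
    n = len(y)
    theta = np.zeros(order)
    P = np.eye(order)
    coeffs = np.zeros((n, order))
    for t in range(order, n):
        h = y[t - order:t][::-1]           # regressor: most recent sample first
        P = P + q * np.eye(order)          # random-walk prediction step
        s = h @ P @ h + r                  # innovation variance
        k = P @ h / s                      # Kalman gain
        theta = theta + k * (y[t] - h @ theta)
        P = P - np.outer(k, h @ P)
        coeffs[t] = theta
    return coeffs
```

On a stationary AR(1) signal the estimated coefficient settles near the true value; on non-stationary data the random-walk state lets the estimate drift with the signal, which is the behavior TVAR models are used for.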


Geophysics ◽  
2007 ◽  
Vol 72 (4) ◽  
pp. V79-V86 ◽  
Author(s):  
Kurang Mehta ◽  
Andrey Bakulin ◽  
Jonathan Sheiman ◽  
Rodney Calvert ◽  
Roel Snieder

The virtual source method has recently been proposed to image and monitor below complex and time-varying overburden. The method requires surface shooting recorded at downhole receivers placed below the distorting or changing part of the overburden. Redatuming with the measured Green’s function allows the reconstruction of a complete downhole survey as if the sources were also buried at the receiver locations. There are still some challenges that need to be addressed in the virtual source method, such as limited acquisition aperture and energy coming from the overburden. We demonstrate that up-down wavefield separation can substantially improve the quality of virtual source data. First, it allows us to eliminate artifacts associated with the limited acquisition aperture typically used in practice. Second, it allows us to reconstruct a new optimized response in the absence of downgoing reflections and multiples from the overburden. These improvements are illustrated on a synthetic data set of a complex layered model modeled after the Fahud field in Oman, and on ocean-bottom seismic data acquired in the Mars field in the deepwater Gulf of Mexico.
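The core redatuming step can be sketched as crosscorrelating two downhole recordings and stacking over surface shots, which turns one receiver into a virtual source for the other. The sketch below shows only this basic correlation step, without the up-down wavefield separation the paper adds (array shapes and the single-spike example are assumptions for illustration):

```python
import numpy as np

def virtual_source_trace(rec_a, rec_b):
    """Crosscorrelate recordings at two downhole receivers and stack over
    surface shots, turning receiver A into a virtual source for receiver B.

    rec_a, rec_b: arrays of shape (n_shots, n_samples).
    Returns the causal (lag >= 0) part of the stacked correlation.
    """
    n = rec_a.shape[1]
    stack = np.zeros(n)
    for a, b in zip(rec_a, rec_b):
        corr = np.correlate(b, a, mode="full")  # peaks at lag t_b - t_a
        stack += corr[n - 1:]                   # keep non-negative lags
    return stack
```

For an event arriving at receiver A at sample 10 and at receiver B at sample 25, the stacked trace peaks at lag 15, i.e. the traveltime between the two receivers, as if A had fired the shot.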


Author(s):  
Aparna Agarwal ◽  
Deevyankar Agarwal

Real commercial information often exhibits temporal features and time-varying behavior. Temporal association rule mining has thus become an active area of research. Calendar units such as months and days, clock units such as hours and seconds, and specialized units such as business days and academic years play a major role in a wide range of information system applications. Calendar-based patterns have already been proposed by researchers to constrain time-based associations. This paper proposes a novel algorithm to mine association rules from time-dependent data employing efficient T-tree and P-tree data structures. The algorithm achieves a significant advantage in terms of time and memory while incorporating the time dimension. Our approach of scanning based on time intervals yields a smaller data set for a given valid interval, thus reducing processing time. The approach is implemented on a synthetic data set, and results show that the temporal TFP-tree gives better performance than the TFP-tree approach.
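The interval-based scan can be illustrated with a minimal sketch: transactions outside the valid time interval are filtered out before itemset counting, which shrinks the data set exactly as described. Plain dictionary counting stands in for the paper's T-tree/P-tree structures, and the transaction format is an assumption:

```python
from itertools import combinations

def temporal_frequent_itemsets(transactions, interval, min_support):
    """Count itemsets only over transactions whose timestamp falls inside
    the valid interval, mirroring the interval-based scan.

    transactions: list of (timestamp, set_of_items)
    interval: (start, end), inclusive
    min_support: fraction of in-window transactions an itemset must reach
    """
    start, end = interval
    window = [items for ts, items in transactions if start <= ts <= end]
    counts = {}
    for items in window:
        for r in (1, 2):                          # itemsets of size 1 and 2
            for combo in combinations(sorted(items), r):
                counts[combo] = counts.get(combo, 0) + 1
    threshold = min_support * len(window)
    return {k: v for k, v in counts.items() if v >= threshold}
```

Support is computed relative to the window size, so the same rule can be frequent in one valid interval and infrequent in another, which is the point of temporal association mining.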


Geophysics ◽  
2019 ◽  
Vol 84 (3) ◽  
pp. E209-E223
Author(s):  
Juan Luis Fernández-Martínez ◽  
Zulima Fernández-Muñiz ◽  
Shan Xu ◽  
Ana Cernea ◽  
Colette Sirieix ◽  
...  

We have evaluated the uncertainty analysis of the 3D electrical tomography inverse problem using model reduction via singular-value decomposition and performed sampling of the nonlinear equivalence region via an explorative member of the particle swarm optimization (PSO) family. The procedure begins with the local inversion of the observed data to find a good resistivity model located in the nonlinear equivalence region. Then, the dimensionality is reduced via the spectral decomposition of the 3D geophysical model. Finally, the exploration of the uncertainty space is performed via an exploratory version of PSO (RR-PSO). This sampling methodology does not prejudge where the initial model comes from as long as this model has a geologic meaning. The 3D subsurface conductivity distribution is arranged as a 2D matrix by ordering the conductivity values contained in a given earth section as a column array and stacking parallel sections as columns of the matrix. There are three basic modes of ordering: mode 1 and mode 2, by using vertical sections in two perpendicular directions, and mode 3, by using horizontal sections. The spectral decomposition is then performed using these three 2D modes. Using this approach, it is possible to sample the uncertainty space of the 3D electrical resistivity inverse problem very efficiently. This methodology is intrinsically parallelizable and could be run for different initial models simultaneously. We applied the method to a synthetic data set that is well known in the literature on this subject, obtaining a set of surviving geophysical models located in the nonlinear equivalence region that can be used to numerically approximate the posterior distribution of the geophysical model parameters (frequentist approach). Based on these models, it is possible to perform probabilistic segmentation of the inverse solution while answering geophysical questions with the corresponding uncertainty assessment. This methodology is general and could be applied to any other 3D nonlinear inverse problem by implementing the corresponding forward model.
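The unfold-and-decompose step can be sketched in a few lines: the 3D volume is reshaped so that each section becomes a column of a 2D matrix, and a truncated SVD keeps the leading singular triplets. Only mode 1 is sketched here, and the axis convention is an assumption:

```python
import numpy as np

def reduce_model(cond3d, rank, mode=1):
    """Unfold a 3D conductivity volume into a 2D matrix (mode 1: each
    vertical x-section flattened to one column), keep the leading `rank`
    singular triplets, and return the low-rank reconstruction.
    """
    nx, ny, nz = cond3d.shape
    if mode == 1:
        mat = cond3d.reshape(nx, ny * nz).T   # columns = parallel sections
    else:
        raise ValueError("only mode 1 is sketched here")
    u, s, vt = np.linalg.svd(mat, full_matrices=False)
    approx = u[:, :rank] @ np.diag(s[:rank]) @ vt[:rank, :]
    return approx.T.reshape(nx, ny, nz)
```

Sampling then takes place in the low-dimensional space of the leading singular coefficients rather than over every voxel, which is what makes the PSO exploration of the equivalence region tractable.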


Author(s):  
Raul E. Avelar ◽  
Karen Dixon ◽  
Boniphace Kutela ◽  
Sam Klump ◽  
Beth Wemple ◽  
...  

The calibration of safety performance functions (SPFs) is a mechanism included in the Highway Safety Manual (HSM) to adjust SPFs in the HSM for use in intended jurisdictions. Critically, the quality of the calibration procedure must be assessed before using the calibrated SPFs. Multiple resources to aid practitioners in calibrating SPFs have been developed in the years following the publication of the HSM 1st edition. Similarly, the literature suggests multiple ways to assess the goodness-of-fit (GOF) of a calibrated SPF to a data set from a given jurisdiction. This paper uses the calibration results of multiple intersection SPFs to a large Mississippi safety database to examine the relations between multiple GOF metrics. The goal is to develop a sensible single index that leverages the joint information from multiple GOF metrics to assess overall quality of calibration. A factor analysis applied to the calibration results revealed three underlying factors explaining 76% of the variability in the data. From these results, the authors developed an index and performed a sensitivity analysis. The key metrics were found to be, in descending order: the deviation of the cumulative residual (CURE) plot from the 95% confidence area, the mean absolute deviation, the modified R-squared, and the value of the calibration factor. This paper also presents comparisons between the index and alternative scoring strategies, as well as an effort to verify the results using synthetic data. The developed index is recommended to comprehensively assess the quality of the calibrated intersection SPFs.
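Two of the metrics named above are easy to state concretely: the HSM calibration factor is total observed over total predicted crashes, and the CURE plot accumulates residuals along a covariate with confidence limits. The sketch below follows Hauer's common CURE formulation; the 1.96 multiplier and the choice of sorting covariate are illustrative assumptions:

```python
import numpy as np

def calibration_factor(observed, predicted):
    """HSM calibration factor: total observed crashes / total predicted."""
    return float(np.sum(observed) / np.sum(predicted))

def cure_plot(observed, predicted, sort_by):
    """Cumulative residual (CURE) plot data: residuals accumulated along a
    covariate (e.g. AADT), with approximate 95% confidence limits.
    A well-calibrated SPF keeps the CURE curve inside the limits.
    """
    order = np.argsort(sort_by)
    resid = observed[order] - predicted[order]
    cure = np.cumsum(resid)
    var = np.cumsum(resid ** 2)                       # running variance proxy
    limit = 1.96 * np.sqrt(var * (1 - var / var[-1])) # Hauer-style bounds
    return cure, limit
```

The index developed in the paper combines the CURE deviation with the mean absolute deviation, modified R-squared, and the calibration factor itself; the two functions above only reproduce the first and last ingredients.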


Water ◽  
2021 ◽  
Vol 13 (1) ◽  
pp. 107
Author(s):  
Elahe Jamalinia ◽  
Faraz S. Tehrani ◽  
Susan C. Steele-Dunne ◽  
Philip J. Vardon

Climatic conditions and vegetation cover influence water flux in a dike, and potentially the dike stability. A comprehensive numerical simulation is computationally too expensive to be used for the near real-time analysis of a dike network. Therefore, this study investigates a random forest (RF) regressor to build a data-driven surrogate for a numerical model to forecast the temporal macro-stability of dikes. To that end, daily inputs and outputs of a ten-year coupled numerical simulation of an idealised dike (2009–2019) are used to create a synthetic data set, comprising features that can be observed from a dike surface, with the calculated factor of safety (FoS) as the target variable. The data set before 2018 is split into training and testing sets to build and train the RF. The predicted FoS is strongly correlated with the numerical FoS for data that belong to the test set (before 2018). However, the trained model shows lower performance for data in the evaluation set (after 2018) if further surface cracking occurs. This proof-of-concept shows that a data-driven surrogate can be used to determine dike stability for conditions similar to the training data, which could be used to identify vulnerable locations in a dike network for further examination.
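The surrogate setup described above, training on the earlier years and evaluating on the later ones, can be sketched with scikit-learn. The feature layout, hyper-parameters, and chronological split index are illustrative assumptions standing in for the paper's 2009-2018 training window:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def train_fos_surrogate(features, fos, split_idx):
    """Train a random forest surrogate for the numerical factor of safety.

    features: (n_days, n_features) daily surface observables
    fos: (n_days,) FoS computed by the expensive numerical model (target)
    split_idx: chronological split; earlier rows train, later rows test,
    mimicking the before/after evaluation in the study.
    Returns the fitted model and the test-set R^2 score.
    """
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(features[:split_idx], fos[:split_idx])
    test_score = model.score(features[split_idx:], fos[split_idx:])
    return model, test_score
```

As the abstract notes, such a surrogate only interpolates: if the later period contains regimes absent from training (e.g. new surface cracking), the test score drops, so the chronological split is the honest way to evaluate it.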


2020 ◽  
Vol 7 (Supplement_1) ◽  
pp. S79-S80
Author(s):  
Joanne Huang ◽  
Zahra Kassamali Escobar ◽  
Rupali Jain ◽  
Jeannie D Chan ◽  
John B Lynch ◽  
...  

Abstract Background In an effort to support stewardship endeavors, the MITIGATE (a Multifaceted Intervention to Improve Prescribing for Acute Respiratory Infection for Adult and Children in Emergency Department and Urgent Care Settings) Toolkit was published in 2018, aiming to reduce unnecessary antibiotics for viral respiratory tract infections (RTIs). At the University of Washington, we have incorporated strategies from this toolkit at our urgent care clinics. This study aims to present solutions to some of the challenges we experienced. Methods This was a retrospective observational study conducted at Valley Medical Center (Sept 2019-Mar 2020) and the University of Washington (Jan 2019-Feb 2020) urgent care clinics. Patients were identified through ICD-10 diagnosis codes included in the MITIGATE toolkit. The primary outcome was identifying challenges and solutions developed during this process. Results We encountered five challenges during our roll-out of MITIGATE. First, using both ICD-9 and ICD-10 codes can lead to inaccurate data collection. Second, technical support for coding a complex data set is essential and should be accounted for prior to beginning stewardship interventions of this scale. Third, unintentional incorrect diagnosis selection was common and may require reeducation of prescribers on proper selection. Fourth, focusing on singular issues rather than multiple outcomes is more feasible and can offer several opportunities for stewardship interventions. Lastly, changing prescribing behavior can cause unintended tension during implementation. Modifying benchmarks measured, allowing for bi-directional feedback, and identifying provider champions can help maintain open communication. Conclusion Resources such as the MITIGATE toolkit are helpful to implement standardized data-driven stewardship interventions.
We have experienced some challenges including a complex data build, errors with diagnostic coding, providing constructive feedback while maintaining positive stewardship relationships, and choosing feasible outcomes to measure. We present solutions to these challenges with the aim to provide guidance to those who are considering using this toolkit for outpatient stewardship interventions. Disclosures All Authors: No reported disclosures


2021 ◽  
pp. 135481662110088
Author(s):  
Sefa Awaworyi Churchill ◽  
John Inekwe ◽  
Kris Ivanovski

Using a historical data set and recent advances in non-parametric time series modelling, we investigate the nexus between tourism flows and house prices in Germany over nearly 150 years. We use time-varying non-parametric techniques given that historical data tend to exhibit abrupt changes and other forms of non-linearities. Our findings show evidence of a time-varying effect of tourism flows on house prices, although with mixed effects. The pre-World War II time-varying estimates of tourism show both positive and negative effects on house prices. While changes in tourism flows contribute to increasing housing prices over the post-1950 period, this is short-lived, and the effect declines until the mid-1990s. However, we find a positive and significant relationship after 2000, where the impact of tourism on house prices becomes more pronounced in recent years.
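A crude stand-in for the time-varying estimation used above is a rolling-window regression, where the slope of house prices on tourism flows is re-estimated as the window moves through the 150-year sample (the paper uses kernel-based non-parametric estimators; this sketch only illustrates the idea of a coefficient that changes over time):

```python
import numpy as np

def rolling_coefficient(x, y, window):
    """Time-varying slope of y on x via rolling-window OLS.

    Returns one slope estimate per window end point, so sign changes
    (e.g. pre- vs post-war effects) show up as the window advances.
    """
    slopes = []
    for t in range(window, len(x) + 1):
        xs, ys = x[t - window:t], y[t - window:t]
        xc = xs - xs.mean()
        slopes.append(float(xc @ (ys - ys.mean()) / (xc @ xc)))
    return np.array(slopes)
```

On data whose relationship flips sign halfway through, the early windows recover the first slope and the late windows the second, mirroring the mixed positive/negative effects reported for different historical periods.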


2021 ◽  
pp. 1-11
Author(s):  
Yanan Huang ◽  
Yuji Miao ◽  
Zhenjing Da

Methods for multi-modal English event detection from a single data source, and for isomorphic event detection across different English data sources based on transfer learning, still need improvement. To improve the efficiency of English event detection across data sources, this paper proposes, based on a transfer learning algorithm, multi-modal event detection for a single data source and isomorphic event detection across different data sources. Moreover, by stacking multiple classification models, the extracted features are merged with each other, and adversarial training driven by the disagreement between two classifiers further makes the distributions of data from different sources similar. In addition, to verify the proposed algorithm, a multi-source English event detection data set is collected. Finally, this data set is used to validate the proposed method and compare it with the current mainstream transfer learning methods. Experimental analysis, convergence analysis, visual analysis, and parameter evaluation demonstrate the effectiveness of the proposed algorithm.
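The disagreement signal driving the adversarial training can be illustrated in isolation: train two differently regularized classifiers on labeled source data and measure how often they disagree on unlabeled target data. In discrepancy-based adaptation this disagreement is what feature alignment minimizes; the sketch below only computes it, and the choice of logistic regression and regularization strengths is an assumption:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def classifier_discrepancy(X_source, y_source, X_target):
    """Train two differently regularized classifiers on labeled source data
    and return their disagreement rate on unlabeled target data.

    High disagreement suggests the target distribution differs from the
    source; aligning features to shrink it is the adaptation objective.
    """
    c1 = LogisticRegression(C=0.1).fit(X_source, y_source)
    c2 = LogisticRegression(C=10.0).fit(X_source, y_source)
    p1, p2 = c1.predict(X_target), c2.predict(X_target)
    return float(np.mean(p1 != p2))
```

When source and target come from the same well-separated distribution, both classifiers agree almost everywhere and the discrepancy is near zero; it grows as the target drifts away from the source.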

