Confidence in Solutions of Flow Through Stochastically Generated Hard Rock Formations

1985 ◽  
Vol 50 ◽  
Author(s):  
Carol Braester ◽  
Roger Thunvik

Abstract. Confidence in solutions of flow through stochastically generated hard rock formations was studied with the aid of a simplified synthetic model. The formation is conceptualized as a fracture network intersecting an impervious rock mass. The geometrical properties of the fracture network were assumed to be known, while fracture transmissivities were treated as a stochastic process. First, network fracture transmissivities were generated using a given probability distribution function. This a priori model was considered the “true formation”. In a second step, only a limited amount of information, similar to that obtained in reality from boreholes, was used to construct a conditioned-by-measurement model. Identical flow tests were performed on the formation constructed with limited data and on the “true formation”. The ratio of the flow rates resulting from these tests was taken as a measure of confidence in the stochastically generated formation. The results, with this model and a particular data set, show uncertainty values between 47% and 63%, corresponding to fracture sample sizes of 11% and 2%, respectively, of the total number in the network.
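The two-step procedure described above can be illustrated with a deliberately simple toy: fractures in series, so that the effective transmissivity is a harmonic mean, rather than the paper's full network flow solution. All distribution parameters below are invented for illustration.

```python
import math
import random
import statistics

def effective_transmissivity(ts):
    """Effective transmissivity of a chain of fractures in series
    (harmonic mean of the individual transmissivities)."""
    return len(ts) / sum(1.0 / t for t in ts)

random.seed(0)

# "True formation": log-normally distributed fracture transmissivities.
n_fractures = 500
true_t = [random.lognormvariate(-9.0, 1.5) for _ in range(n_fractures)]

# Conditioned model: only a small "borehole" sample is measured; the rest of
# the network is regenerated from the distribution fitted to that sample.
sample_frac = 0.02  # 2% of fractures sampled, the paper's smaller sample size
sample = random.sample(true_t, int(sample_frac * n_fractures))
logs = [math.log(t) for t in sample]
model_t = [random.lognormvariate(statistics.mean(logs), statistics.stdev(logs))
           for _ in range(n_fractures)]

# At a fixed head drop the flow rate is proportional to the effective
# transmissivity, so the ratio of flow rates is a crude confidence measure.
ratio = effective_transmissivity(model_t) / effective_transmissivity(true_t)
print(f"flow-rate ratio (model / true): {ratio:.2f}")
```

A ratio far from 1 signals low confidence in the conditioned model; repeating the experiment over many seeds would give the spread of that confidence measure.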

2014 ◽  
Vol 70 (a1) ◽  
pp. C1728-C1728
Author(s):  
Richard Cooper ◽  
James Arnold

Standard crystallographic structure refinements employ anisotropic displacement parameters (ADPs) to represent the probability distribution of a scattering atom. Such distributions may be due to thermal motion of the atom and/or a spatial average of multiple discrete atomic positions. An anisotropic description of an atomic distribution requires six parameters, and - in cases where data are limited or of poor quality - the optimal values of these parameters may be ill-defined. Application of restraints and constraints can impose some physical and chemical reality on the set of displacement parameters. Examples include those based on the Hirshfeld Rigid Bond Test [1], and more recently SHELXL's RIGU [2]. We have implemented these and other ADP restraints in CRYSTALS [3], for introducing reasonable relationships amongst common arrangements of anisotropic atoms. Use of a priori information in the form of restraints must always be justified, and we present an assessment of the applicability of the new restraints against a large data set of high-quality crystal structure determinations.
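The Hirshfeld rigid-bond criterion mentioned above compares the mean-square displacement amplitudes of two bonded atoms projected onto the bond direction; for a rigid bond the difference should be near zero. A minimal sketch, with invented ADP values and Cartesian (not crystallographic) tensors for simplicity:

```python
import numpy as np

def rigid_bond_delta(u_a, u_b, pos_a, pos_b):
    """Hirshfeld rigid-bond test: difference between the mean-square
    displacement amplitudes of two bonded atoms along the bond direction.
    u_a, u_b: 3x3 Cartesian ADP tensors (A^2); pos_a, pos_b: coordinates (A)."""
    z = np.asarray(pos_b, float) - np.asarray(pos_a, float)
    z /= np.linalg.norm(z)  # unit vector along the bond
    return float(z @ u_a @ z - z @ u_b @ z)

# Hypothetical ADPs for two bonded atoms (all values invented for illustration).
u1 = np.diag([0.020, 0.025, 0.030])
u2 = np.diag([0.021, 0.024, 0.030])
delta = rigid_bond_delta(u1, u2, (0.0, 0.0, 0.0), (0.0, 0.0, 1.5))
# A rigid bond implies |delta| is small (conventionally below ~0.001 A^2).
print(f"rigid-bond delta: {delta:.4f} A^2")
```

A restraint of the RIGU type effectively penalizes large values of this difference during refinement.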


2021 ◽  
Vol 4 (1) ◽  
pp. 251524592095492
Author(s):  
Marco Del Giudice ◽  
Steven W. Gangestad

Decisions made by researchers while analyzing data (e.g., how to measure variables, how to handle outliers) are sometimes arbitrary, without an objective justification for choosing one alternative over another. Multiverse-style methods (e.g., specification curve, vibration of effects) estimate an effect across an entire set of possible specifications to expose the impact of hidden degrees of freedom and/or obtain robust, less biased estimates of the effect of interest. However, if specifications are not truly arbitrary, multiverse-style analyses can produce misleading results, potentially hiding meaningful effects within a mass of poorly justified alternatives. So far, a key question has received scant attention: How does one decide whether alternatives are arbitrary? We offer a framework and conceptual tools for doing so. We discuss three kinds of a priori nonequivalence among alternatives—measurement nonequivalence, effect nonequivalence, and power/precision nonequivalence. The criteria we review lead to three decision scenarios: Type E decisions (principled equivalence), Type N decisions (principled nonequivalence), and Type U decisions (uncertainty). In uncertain scenarios, multiverse-style analysis should be conducted in a deliberately exploratory fashion. The framework is discussed with reference to published examples and illustrated with the help of a simulated data set. Our framework will help researchers reap the benefits of multiverse-style methods while avoiding their pitfalls.
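The multiverse logic above can be made concrete with a toy specification curve: simulated data, two analytic decisions (outlier handling and predictor coding), and an effect estimate for every combination. Whether the alternatives within each decision are truly equivalent (Type E) is exactly the judgment the framework addresses; here they are simply enumerated. All data and rules are invented for illustration.

```python
import itertools
import random
import statistics

random.seed(1)
# Simulated data: outcome y depends weakly on x, with two injected outliers.
x = [random.gauss(0, 1) for _ in range(200)]
y = [0.3 * xi + random.gauss(0, 1) for xi in x]
y[0], y[1] = 8.0, -7.5  # outliers

def slope(xs, ys):
    """Ordinary least-squares slope of y on x."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    return (sum((a - mx) * (b - my) for a, b in zip(xs, ys))
            / sum((a - mx) ** 2 for a in xs))

# Two analytic decisions, each with alternatives the analyst must judge
# equivalent or not before pooling them into a multiverse.
outlier_rules = {"keep": lambda d: d,
                 "trim |y|>3": lambda d: [(a, b) for a, b in d if abs(b) <= 3]}
x_codings = {"raw": lambda a: a,
             "winsorized": lambda a: max(min(a, 2.0), -2.0)}

effects = {}
for (o_name, o_rule), (c_name, coding) in itertools.product(
        outlier_rules.items(), x_codings.items()):
    d = o_rule(list(zip(x, y)))
    effects[(o_name, c_name)] = slope([coding(a) for a, _ in d],
                                      [b for _, b in d])

for spec, est in sorted(effects.items()):
    print(spec, round(est, 3))
```

The spread of the four estimates is the "multiverse" of this toy analysis; the framework in the article concerns when such a spread is informative and when it mixes nonequivalent specifications.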


2020 ◽  
Vol 72 (1) ◽  
Author(s):  
Ryuho Kataoka

Abstract. Statistical distributions are investigated for magnetic storms, sudden commencements (SCs), and substorms to identify the possible amplitude of one-in-100-year and one-in-1000-year events from a limited data set spanning less than 100 years. The lists of magnetic storms and SCs are provided by Kakioka Magnetic Observatory, while the lists of substorms are obtained from SuperMAG. It is found that the majority of events essentially follow the log-normal distribution, as expected for the random output of a complex system. However, it is uncertain whether large-amplitude events follow the same log-normal distributions; they appear instead to follow power-law distributions. Based on the statistical distributions, the probable amplitudes of the 100-year (1000-year) events are estimated for magnetic storms, SCs, and substorms as approximately 750 nT (1100 nT), 230 nT (450 nT), and 5000 nT (6200 nT), respectively. The possible origin of these statistical distributions is also discussed with reference to other space weather phenomena such as solar flares, coronal mass ejections, and solar energetic particles.
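A return-level estimate of the kind described can be sketched as follows: fit a log-normal distribution to event amplitudes, then find the level whose expected exceedance rate is once per m years. The amplitudes below are synthetic placeholders, not the Kakioka or SuperMAG data, and the sketch ignores the power-law tail caveat raised in the abstract.

```python
import math
import random
from statistics import NormalDist, mean, stdev

random.seed(2)
# Hypothetical event amplitudes (nT) over a 90-year record, drawn log-normally;
# parameters are illustrative only.
years = 90
amps = [random.lognormvariate(4.5, 0.8) for _ in range(450)]

def return_level(amplitudes, record_years, return_period):
    """m-year return level under a log-normal fit: the amplitude whose
    expected exceedance rate is once per `return_period` years."""
    logs = [math.log(a) for a in amplitudes]
    mu, sigma = mean(logs), stdev(logs)
    rate = len(amplitudes) / record_years        # events per year
    p_exceed = 1.0 / (rate * return_period)      # required tail probability
    return math.exp(NormalDist(mu, sigma).inv_cdf(1.0 - p_exceed))

print(f"100-year event:  {return_level(amps, years, 100):.0f} nT")
print(f"1000-year event: {return_level(amps, years, 1000):.0f} nT")
```

If the large-amplitude tail is actually power-law rather than log-normal, this procedure underestimates the extreme quantiles, which is precisely the uncertainty the abstract highlights.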


2015 ◽  
Vol 8 (2) ◽  
pp. 941-963 ◽  
Author(s):  
T. Vlemmix ◽  
F. Hendrick ◽  
G. Pinardi ◽  
I. De Smedt ◽  
C. Fayt ◽  
...  

Abstract. A 4-year data set of MAX-DOAS observations in the Beijing area (2008–2012) is analysed with a focus on NO2, HCHO and aerosols. Two very different retrieval methods are applied. Method A describes the tropospheric profile with 13 layers and makes use of the optimal estimation method. Method B uses 2–4 parameters to describe the tropospheric profile and an inversion based on a least-squares fit. For each constituent (NO2, HCHO and aerosols) the retrieval outcomes are compared in terms of tropospheric column densities, surface concentrations and "characteristic profile heights" (i.e. the height below which 75% of the vertically integrated tropospheric column density resides). We find best agreement between the two methods for tropospheric NO2 column densities, with a standard deviation of relative differences below 10%, a correlation of 0.99 and a linear regression with a slope of 1.03. For tropospheric HCHO column densities we find a similar slope, but also a systematic bias of almost 10% which is likely related to differences in profile height. Aerosol optical depths (AODs) retrieved with method B are 20% higher than those retrieved with method A. They agree better with AERONET measurements, which are on average only 5% lower, although with considerable scatter (standard deviation of relative differences ~ 25%). With respect to near-surface volume mixing ratios and aerosol extinction we find considerably larger relative differences: 10 ± 30, −23 ± 28 and −8 ± 33% for aerosols, HCHO and NO2, respectively. The frequency distributions of these near-surface concentrations nevertheless show quite good agreement, which indicates that near-surface concentrations derived from MAX-DOAS are certainly useful in a climatological sense. A major difference between the two methods is the dynamic range of retrieved characteristic profile heights, which is larger for method B than for method A.
This effect is most pronounced for HCHO, where profile shapes retrieved with method A stay very close to the a priori, and moderate for NO2 and aerosol extinction, which on average show quite good agreement for characteristic profile heights below 1.5 km. One of the main advantages of method A is its stability, even under suboptimal conditions (e.g. in the presence of clouds). Method B is generally less stable, and this probably explains a substantial part of the quite large relative differences between the two methods. However, despite the relatively low precision of individual profile retrievals, seasonally averaged profile heights retrieved with method B appear less biased towards a priori assumptions than those retrieved with method A. This gives confidence in the result obtained with method B, namely that aerosol extinction profiles tend on average to extend higher than NO2 profiles in spring and summer, whereas they appear on average to be of the same height in winter; this result is especially relevant to the validation of satellite retrievals.
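The "characteristic profile height" used throughout the comparison is straightforward to compute from a retrieved profile: integrate the column cumulatively and find the height containing 75% of it. A minimal numpy sketch, using an illustrative exponential profile rather than any retrieved MAX-DOAS data:

```python
import numpy as np

def characteristic_height(z, c, fraction=0.75):
    """Height below which `fraction` of the vertically integrated column
    resides, by linear interpolation on the cumulative trapezoidal integral.
    z: layer heights (km, increasing); c: concentration/extinction profile."""
    z, c = np.asarray(z, float), np.asarray(c, float)
    layer = 0.5 * (c[1:] + c[:-1]) * np.diff(z)      # trapezoidal layer columns
    cum = np.concatenate([[0.0], np.cumsum(layer)])  # cumulative column
    return float(np.interp(fraction * cum[-1], cum, z))

# Hypothetical exponentially decaying profile with 1 km scale height.
z = np.linspace(0.0, 4.0, 41)
profile = np.exp(-z / 1.0)
print(f"characteristic height: {characteristic_height(z, profile):.2f} km")
```

For an exponential profile with scale height H, the 75% level sits near H·ln 4 ≈ 1.39 km, which the sketch reproduces.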


Geophysics ◽  
2007 ◽  
Vol 72 (1) ◽  
pp. F25-F34 ◽  
Author(s):  
Benoit Tournerie ◽  
Michel Chouteau ◽  
Denis Marcotte

We present and test a new method to correct for the static shift affecting magnetotelluric (MT) apparent resistivity sounding curves. We use geostatistical analysis of apparent resistivity and phase data for selected periods. For each period, we first estimate and model the experimental variograms and the cross variogram between phase and apparent resistivity. We then use the geostatistical model to estimate, by cokriging, the corrected apparent resistivities from the measured phases and apparent resistivities. The static shift factor is obtained as the difference between the logarithms of the corrected and measured apparent resistivities. We retain as final static-shift estimates those for the period displaying the best correlation with the estimates at all periods. We present a 3D synthetic case study showing that the static shift is retrieved quite precisely when the static shift factors are uniformly distributed around zero. If the static shift distribution has a nonzero mean, we obtained the best results when an apparent resistivity data subset can be identified a priori as unaffected by static shift and cokriging is done using only this subset. The method has been successfully tested on the synthetic COPROD-2S2 2D MT data set and on a 3D-survey data set from Las Cañadas Caldera (Tenerife, Canary Islands) severely affected by static shift.
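The first step of the method, estimating experimental variograms and the cross variogram, can be sketched for a synthetic 1-D line of stations. The data below are invented, and the subsequent model fitting and cokriging steps are omitted.

```python
import numpy as np

def experimental_variogram(x, v, lags, tol):
    """Classical experimental semivariogram gamma(h) = 0.5 E[(v(x) - v(x+h))^2]
    for 1-D station positions x and values v, binned around the given lags."""
    x, v = np.asarray(x, float), np.asarray(v, float)
    i, j = np.triu_indices(len(x), k=1)
    h = np.abs(x[i] - x[j])
    sq = 0.5 * (v[i] - v[j]) ** 2
    return [float(sq[np.abs(h - lag) <= tol].mean()) for lag in lags]

def experimental_cross_variogram(x, v, w, lags, tol):
    """Cross variogram 0.5 E[(v(x) - v(x+h)) (w(x) - w(x+h))], e.g. between
    log apparent resistivity and phase at one period."""
    x = np.asarray(x, float)
    v, w = np.asarray(v, float), np.asarray(w, float)
    i, j = np.triu_indices(len(x), k=1)
    h = np.abs(x[i] - x[j])
    cp = 0.5 * (v[i] - v[j]) * (w[i] - w[j])
    return [float(cp[np.abs(h - lag) <= tol].mean()) for lag in lags]

rng = np.random.default_rng(3)
pos = np.sort(rng.uniform(0, 100, 60))  # station positions (km), synthetic
rho = np.sin(pos / 15.0) + 0.1 * rng.standard_normal(60)        # log app. res.
phi = np.sin(pos / 15.0 + 0.2) + 0.1 * rng.standard_normal(60)  # phase proxy
lags = [5, 10, 20, 40]
print("gamma(h):", experimental_variogram(pos, rho, lags, tol=2.5))
print("cross(h):", experimental_cross_variogram(pos, rho, phi, lags, tol=2.5))
```

A model (e.g. spherical or exponential) fitted to these experimental values would then feed the cokriging system that produces the corrected apparent resistivities.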


Paleobiology ◽  
2016 ◽  
Vol 43 (1) ◽  
pp. 68-84 ◽  
Author(s):  
Bradley Deline ◽  
William I. Ausich

Abstract. A priori choices in the detail and breadth of a study are important in addressing scientific hypotheses. In particular, choices in the number and type of characters can greatly influence the results in studies of morphological diversity. A new character suite was constructed to examine trends in the disparity of early Paleozoic crinoids. Character-based rarefaction analysis indicated that a small subset of these characters (~20% of the complete data set) could be used to capture most of the properties of the entire data set in analyses of crinoids as a whole, noncamerate crinoids, and to a lesser extent camerate crinoids. This pattern may be the result of the covariance between characters and the characterization of rare morphologies that are not represented in the primary axes in morphospace. Shifting emphasis on different body regions (oral system, calyx, periproct system, and pelma) also influenced estimates of relative disparity between subclasses of crinoids. Given these results, morphological studies should include a pilot analysis to better examine the amount and type of data needed to address specific scientific hypotheses.


2021 ◽  
Vol 70 (10) ◽  
Author(s):  
Kazuyoshi Gotoh ◽  
Makoto Miyoshi ◽  
I Putu Bayu Mayura ◽  
Koji Iio ◽  
Osamu Matsushita ◽  
...  

The options available for treating infections with carbapenemase-producing Enterobacteriaceae (CPE) are limited; with the increasing threat of these infections, new treatments are urgently needed. Biapenem (BIPM) is a carbapenem, and only limited data confirming its in vitro killing effect against CPE are available. In this study, we examined the minimum inhibitory concentrations (MICs) and minimum bactericidal concentrations (MBCs) of BIPM for 14 IMP-1-producing Enterobacteriaceae strains isolated from the Okayama region in Japan. The MICs for almost all the isolates were below 0.5 µg ml−1, indicating susceptibility to BIPM, although BIPM was confirmed to be merely bacteriostatic against approximately half of the isolates. However, initial killing to a 99.9 % reduction was observed in seven out of eight strains in a time–kill assay. Despite the small data set, we conclude that the in vitro efficacy of BIPM suggests the drug could be a new therapeutic option against infection with IMP-producing CPE.
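The 99.9% criterion in the time–kill assay is the conventional 3-log10 reduction threshold distinguishing bactericidal from bacteriostatic activity. A minimal sketch with an invented kill curve:

```python
import math

def log10_reduction(cfu_start, cfu_end):
    """Log10 reduction in viable count between two time points."""
    return math.log10(cfu_start / cfu_end)

def is_bactericidal(counts, threshold_log=3.0):
    """Conventionally, a drug is bactericidal if it achieves a >= 99.9%
    (3-log10) reduction from the starting inoculum during the assay.
    counts: CFU/ml at successive time points, counts[0] = inoculum."""
    return any(log10_reduction(counts[0], c) >= threshold_log
               for c in counts[1:])

# Hypothetical time-kill curve (CFU/ml at 0, 2, 4, 8, 24 h; illustrative).
curve = [1e6, 4e5, 2e4, 8e2, 5e2]
print("bactericidal:", is_bactericidal(curve))
```

By this criterion, the 1e6 → 8e2 drop (about 3.1 log10) qualifies; a curve bottoming out at a 2-log10 reduction would be scored bacteriostatic.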


2021 ◽  
Author(s):  
Monique B. Sager ◽  
Aditya M. Kashyap ◽  
Mila Tamminga ◽  
Sadhana Ravoori ◽  
Christopher Callison-Burch ◽  
...  

BACKGROUND Reddit, the fifth most popular website in the United States, boasts a large and engaged user base on its dermatology forums, where users crowdsource free medical opinions. Unfortunately, much of the advice provided is unvalidated and could lead to inappropriate care. Initial testing has shown that artificially intelligent bots can detect misinformation on Reddit forums and may be able to produce responses to posts containing misinformation. OBJECTIVE To analyze the ability of bots to find and respond to health misinformation on Reddit's dermatology forums in a controlled test environment. METHODS Using natural language processing techniques, we trained bots to target misinformation using relevant keywords and to post pre-fabricated responses. We compared the performance of different model architectures on a held-out test set. RESULTS Our models yielded test accuracies ranging from 95%-100%, with a fine-tuned BERT model achieving the highest test accuracy. Bots were then able to post corrective pre-fabricated responses to misinformation. CONCLUSIONS Using a limited data set, bots had near-perfect ability to detect these examples of health misinformation within Reddit dermatology forums. Given that these bots can then post pre-fabricated responses, this technique may allow for interception of misinformation. Providing correct information, even instantly, however, does not mean users will be receptive or find such interventions persuasive. Further work should investigate this strategy's effectiveness to inform future deployment of bots as a technique in combating health misinformation. CLINICALTRIAL N/A
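A keyword-based baseline for the detect-and-respond loop can be sketched as below. This is not the study's fine-tuned BERT pipeline, only a pattern-matching stand-in; the patterns, claim, and response text are invented for illustration.

```python
import re

# Hypothetical misinformation patterns for one dermatology topic (illustrative;
# a deployed system would use a trained classifier such as fine-tuned BERT).
MISINFO_PATTERNS = [
    r"\bblack salve\b",
    r"\bcures? (?:skin )?cancer\b",
]
RESPONSE = ("This claim is not supported by medical evidence. "
            "Please consult a board-certified dermatologist.")

def detect_misinformation(post):
    """Flag a post if any misinformation pattern matches (case-insensitive)."""
    return any(re.search(p, post, re.IGNORECASE) for p in MISINFO_PATTERNS)

def respond(post):
    """Return a pre-fabricated corrective response, or None if no match."""
    return RESPONSE if detect_misinformation(post) else None

print(respond("I heard black salve cures skin cancer, should I try it?"))
print(respond("What sunscreen do you recommend?"))
```

The trained-classifier version replaces `detect_misinformation` with a model score and a decision threshold, while the posting side of the loop stays the same.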


2018 ◽  
Vol 482 (1) ◽  
pp. 241-260 ◽  
Author(s):  
V. Tsitsopoulos ◽  
S. Baxter ◽  
D. Holton ◽  
J. Dodd ◽  
S. Williams ◽  
...  

Abstract. The Prototype Repository (PR) tunnel is located at the Äspö Hard Rock Laboratory near Oskarshamn in the southeast of Sweden. In the PR tunnel, six full-sized deposition holes (8.37 m deep and 1.75 m in diameter) have been constructed. Each deposition hole is designed to mimic the Swedish reference system for the disposal of nuclear fuel, KBS-3V. The PR experiment is designed to provide a full-scale simulation of the emplacement of heat-generating waste. There are three phases to the experiment: (1) the open tunnel phase following construction, where both the tunnel and deposition holes are open to atmospheric conditions; (2) the emplacement of canisters (containing heaters), backfill and seal in the first section of the tunnel; and (3) the emplacement of canisters, backfill and seal in the second section of the tunnel. This work describes the numerical modelling, performed as part of the engineered barrier systems (EBS) Task Force, to understand the thermo-hydraulic (TH) evolution of the PR experiment and to provide a better understanding of the interaction between the fractured rock and bentonite surrounding the canister at the scale of a single deposition tunnel. A coupled integrated TH model for predicting the wetting and the temperature of bentonite emplaced in fractured rock was developed, accounting for the heterogeneity of the fractured rock. In this model, geometrical uncertainties of fracture locations are modelled by using several stochastic realizations of the fracture network. The modelling methodology utilized information available at early stages of site characterization and included site statistics for fracture occurrence and properties, as well as proposed installation properties of the bentonite. 
The adopted approach provides an evaluation of the predictive capability of the models, gives insight into the sensitivity of the results to uncertainties in the data, and demonstrates that a simplified equivalent homogeneous description of the fractured host rock is insufficient to represent the bentonite resaturation.
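The role of the stochastic fracture-network realizations can be illustrated with a strongly simplified 1-D abstraction: fracture intersections along a deposition hole drawn as a Poisson process, with the spread across realizations standing in for the geometrical uncertainty carried through the TH model. The intensity value is assumed for illustration and is not a site statistic from the paper.

```python
import random
import statistics

def fracture_intersections(hole_depth, rate, rng):
    """One stochastic realization: depths (m) at which fractures intersect a
    deposition hole, modelled as a 1-D Poisson process (a toy abstraction of
    one discrete fracture network realization)."""
    depths, z = [], 0.0
    while True:
        z += rng.expovariate(rate)
        if z > hole_depth:
            return depths
        depths.append(z)

rng = random.Random(7)
hole_depth = 8.37  # deposition hole depth (m), as in the PR experiment
rate = 0.8         # fractures per metre along the hole (assumed value)
counts = [len(fracture_intersections(hole_depth, rate, rng))
          for _ in range(50)]
print(f"fractures per hole over 50 realizations: "
      f"mean {statistics.mean(counts):.1f}, stdev {statistics.stdev(counts):.1f}")
```

In the full methodology, each realization feeds a coupled TH simulation, and the ensemble spread of bentonite wetting times, rather than of fracture counts, is the quantity of interest.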


2014 ◽  
Vol 36 ◽  
pp. 69-75 ◽  
Author(s):  
A. D'Alessandro ◽  
I. Guerra ◽  
G. D'Anna ◽  
A. Gervasi ◽  
P. Harabaglia ◽  
...  

Abstract. We plan to deploy several broadband ocean-bottom seismometers with hydrophones in the Taranto Gulf. Our aim is to investigate the offshore seismicity of the Sibari Gulf. Seismographic network optimization consists of identifying the optimal sites for the installation of the offshore stations, which is a crucial factor for the success of the monitoring campaign. In this paper, we propose a two-step automatic procedure for identifying the best station geometry. In the first step, the sites suitable to host the ocean-bottom seismic stations are identified by applying a set of a priori criteria. In the second step, the network improvement is evaluated for all possible station geometries by means of numerical simulation. This procedure allows us to identify the best station geometry for the monitoring campaign.
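The second step can be sketched with a common network-quality proxy, the azimuthal gap at simulated epicentres, scored over every admissible combination of candidate sites. The actual simulation criterion used in the paper may differ, and all coordinates below are invented.

```python
import itertools
import math

def azimuthal_gap(stations, epicentre):
    """Largest gap (degrees) between station azimuths seen from the epicentre,
    using atan2(dx, dy) so azimuths are measured from the y (north) axis;
    smaller maximum gaps generally mean better-constrained locations."""
    az = sorted(math.degrees(math.atan2(sx - epicentre[0],
                                        sy - epicentre[1])) % 360
                for sx, sy in stations)
    gaps = [b - a for a, b in zip(az, az[1:])] + [360 - az[-1] + az[0]]
    return max(gaps)

def best_geometry(candidates, n_stations, target_events):
    """Score every admissible station geometry by its mean azimuthal gap
    over the simulated offshore events, and return the best one."""
    def score(geom):
        return (sum(azimuthal_gap(geom, e) for e in target_events)
                / len(target_events))
    return min(itertools.combinations(candidates, n_stations), key=score)

# Hypothetical candidate sites (km) that passed the a priori criteria,
# and simulated offshore epicentres (all values illustrative).
candidates = [(0, 0), (10, 0), (0, 10), (10, 10), (5, 15), (15, 5)]
events = [(4, 4), (6, 7), (8, 3)]
print(best_geometry(candidates, 4, events))
```

A production version would replace the azimuthal-gap score with full location-error simulations, but the exhaustive search over admissible geometries follows the same pattern.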

