A Confront between Amati and Combo Correlations at Intermediate and Early Redshifts

Symmetry ◽  
2020 ◽  
Vol 12 (7) ◽  
pp. 1118
Author(s):  
Marco Muccino

I consider two gamma-ray burst (GRB) correlations: Amati and Combo. After calibrating them in a cosmology-independent way by employing Bézier polynomials to approximate the Observational Hubble Dataset (OHD), I perform Markov Chain Monte Carlo (MCMC) simulations within the ΛCDM and wCDM models. The results from the Amati GRB dataset do not agree with the standard ΛCDM model at a confidence level ≥3σ. For the Combo correlation, all MCMC simulations give best-fit parameters which are consistent within 1σ with the ΛCDM model. Pending clarification of whether the difference between these results is statistical, due to the difference in dataset sizes, or astrophysical, implying a search for the correlation best suited to cosmological analyses, future investigations require larger datasets to increase the predictive power of both correlations and to enable more refined analyses of a possible non-zero curvature of the Universe and of the dark energy equation of state and its evolution.
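The cosmology-independent calibration step can be illustrated with a toy sketch: a Bézier curve (Bernstein polynomial basis) is fit to H(z) measurements by linear least squares, and its value at z = 0 gives a model-independent extrapolation. This is a minimal sketch under stated assumptions: the degree, z range, and toy data below are illustrative, not the actual OHD or the paper's exact parametrization.

```python
import numpy as np
from math import comb

def bezier(z, coeffs, z_max):
    """Evaluate a Bezier polynomial of degree len(coeffs)-1 on [0, z_max]."""
    t = np.asarray(z) / z_max
    n = len(coeffs) - 1
    return sum(c * comb(n, k) * t**k * (1 - t)**(n - k)
               for k, c in enumerate(coeffs))

def fit_bezier(z_data, H_data, degree, z_max):
    """Linear least-squares fit of the Bezier control values to H(z) data."""
    t = np.asarray(z_data) / z_max
    # Design matrix: one Bernstein basis function per column.
    A = np.column_stack([comb(degree, k) * t**k * (1 - t)**(degree - k)
                         for k in range(degree + 1)])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(H_data), rcond=None)
    return coeffs

# Toy illustration (not real OHD): a linear H(z) is recovered exactly.
z_toy = [0.1, 0.5, 1.0, 1.5, 2.0]
H_toy = [70 + 30 * zi for zi in z_toy]
coeffs = fit_bezier(z_toy, H_toy, degree=2, z_max=2.0)
H0_est = float(bezier(0.0, coeffs, z_max=2.0))  # extrapolated H(z=0)
```

Since the Bernstein basis is linear in the control values, the fit reduces to ordinary least squares, which is what makes the calibration cosmology-independent.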

2010 ◽  
Vol 25 (17) ◽  
pp. 1441-1454 ◽  
Author(s):  
LIXIN XU ◽  
ZHAOFEI LIU ◽  
JIANBO LU ◽  
WENBO LI

In this paper, the holographic dark energy in Brans–Dicke theory is confronted with cosmic observations from SN Ia, BAO, OHD and CMB via the Markov chain Monte Carlo (MCMC) method. The best-fit parameters, with 1σ ranges, are [Formula: see text] and [Formula: see text] (equivalently, ω = 2415.653, which is less than the solar system bound but consistent with other constraint results). With these best-fit values of the parameters, it is found that the universe is undergoing accelerated expansion, and the current value of the equation of state of holographic dark energy is [Formula: see text], which is phantom-like in Brans–Dicke theory. The effective Newton's constant decreases with the expansion of our universe for a negative value of the model parameter α.


2008 ◽  
Vol 10 (2) ◽  
pp. 153-162 ◽  
Author(s):  
B. G. Ruessink

When a numerical model is to be used as a practical tool, its parameters should preferably be stable and consistent, that is, possess a small uncertainty and be time-invariant. Using data and predictions of alongshore mean currents flowing on a beach as a case study, this paper illustrates how parameter stability and consistency can be assessed using Markov chain Monte Carlo. Within a single calibration run, Markov chain Monte Carlo estimates the parameter posterior probability density function, its mode being the best-fit parameter set. Parameter stability is investigated by stepwise adding new data to a calibration run, while consistency is examined by calibrating the model on different datasets of equal length. The results for the present case study indicate that various tidal cycles with strong (say, >0.5 m/s) currents are required to obtain stable parameter estimates, and that the best-fit model parameters and the underlying posterior distribution are strongly time-varying. This inconsistent parameter behavior may reflect unresolved variability of the processes represented by the parameters, or may represent compensational behavior for temporal violations in specific model assumptions.
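The calibration machinery described above can be sketched in a few lines: a random-walk Metropolis sampler draws from the parameter posterior, and the highest-posterior sample visited plays the role of the best-fit parameter set (the posterior mode). This is a minimal one-parameter sketch with synthetic current-speed data; the Gaussian likelihood, step size, and data are illustrative assumptions, not the paper's surf-zone model.

```python
import numpy as np

def metropolis(log_post, theta0, n_steps, step=0.1, seed=0):
    """Random-walk Metropolis: returns the chain and the highest-posterior sample."""
    rng = np.random.default_rng(seed)
    chain = np.empty(n_steps)
    theta, lp = theta0, log_post(theta0)
    best_theta, best_lp = theta, lp
    for i in range(n_steps):
        prop = theta + step * rng.standard_normal()
        lp_prop = log_post(prop)
        # Accept with probability min(1, posterior ratio).
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
            if lp > best_lp:
                best_theta, best_lp = theta, lp
        chain[i] = theta
    return chain, best_theta

# Toy example: Gaussian likelihood for a mean current speed, true value 0.5 m/s.
rng = np.random.default_rng(1)
data = 0.5 + 0.05 * rng.standard_normal(50)
log_post = lambda m: -0.5 * np.sum((data - m) ** 2) / 0.05**2
chain, best = metropolis(log_post, 0.0, 5000)
```

Parameter stability could then be probed exactly as the paper describes: rerun the sampler on `data[:n]` for increasing `n` and watch how `best` and the chain's spread evolve.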


Author(s):  
L Kazantzidis ◽  
H Koo ◽  
S Nesseris ◽  
L Perivolaropoulos ◽  
A Shafieloo

Abstract We search for possible deviations from the expectations of the concordance ΛCDM model in the expansion history of the Universe by analysing the Pantheon Type Ia Supernovae (SnIa) compilation, along with its Monte Carlo simulations, using redshift binning. We demonstrate that the redshift-binned best-fit ΛCDM matter density parameter Ω0m and the best-fit effective absolute magnitude $\cal M$ oscillate about their full-dataset best-fit values with considerable amplitudes. Using the full covariance matrix of the data, taking into account systematic and statistical errors, we show that at redshifts below z ≈ 0.5 such oscillations can only occur in 4 to 5% of the Monte Carlo simulations. While statistical fluctuations can be responsible for this apparent oscillation, we may have observed a hint of behaviour beyond the expectations of the concordance model, or a possible additional systematic in the data. If this apparent oscillation is not due to statistical or systematic effects, it could be due either to the presence of coherent inhomogeneities at low z or to oscillations of a quintessence scalar field.
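The binning exercise can be sketched as follows: fit a flat ΛCDM distance modulus to supernova-like points in a redshift bin by a grid search in Ω0m, and compare the binned value with the full-sample fit. This is a minimal sketch under stated assumptions: the fiducial H0, mock noiseless data, diagonal-error χ², and neglect of the absolute magnitude offset are all illustrative, not the paper's covariance-matrix analysis.

```python
import numpy as np

C_KM_S, H0 = 299792.458, 70.0  # assumed fiducial H0; the offset M is ignored here

def mu_lcdm(z, om, n_grid=400):
    """Distance modulus mu(z) for flat LambdaCDM (trapezoidal comoving integral)."""
    z = np.atleast_1d(np.asarray(z, float))
    zg = np.linspace(0.0, z.max(), n_grid)
    inv_E = 1.0 / np.sqrt(om * (1 + zg) ** 3 + (1.0 - om))
    dc_grid = np.concatenate(([0.0],
        np.cumsum(0.5 * (inv_E[1:] + inv_E[:-1]) * np.diff(zg)))) * C_KM_S / H0
    d_l = (1 + z) * np.interp(z, zg, dc_grid)  # luminosity distance in Mpc
    return 5 * np.log10(d_l * 1e5)             # 10 pc = 1e-5 Mpc

def best_fit_om(z, mu_obs, om_grid=np.linspace(0.05, 0.95, 181)):
    """Grid-search chi^2 fit of Omega_m for one (sub)sample, diagonal errors."""
    chi2 = [np.sum((mu_obs - mu_lcdm(z, om)) ** 2) for om in om_grid]
    return om_grid[int(np.argmin(chi2))]

# Mock full sample generated at Omega_m = 0.3, then refit in a low-z bin.
z = np.linspace(0.05, 0.8, 40)
mu_obs = mu_lcdm(z, 0.3)
om_full = best_fit_om(z, mu_obs)
om_low = best_fit_om(z[z < 0.4], mu_obs[z < 0.4])
```

With noisy data, repeating `best_fit_om` per bin and comparing the scatter of the binned values against Monte Carlo realizations reproduces the paper's test in miniature.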


2014 ◽  
Vol 575 ◽  
pp. 549-553 ◽  
Author(s):  
Rahadi Wirawan ◽  
Mitra Djamal ◽  
Abdul Waris ◽  
Gunawan Handayani ◽  
Hong Joo Kim

Incoherent gamma-ray scattering is a method that can be applied to fluid parameter characterization. The aim of the present work is to study the potential of incoherent gamma-ray scattering measurements for evaluating fluid density, based on a Monte Carlo approach. Increasing the density of a fluid results in a significant reduction in the intensity of the detected scattered gamma rays. In the gamma transmission mode, the slope of the simulated curve differs from the experimental result by about 0.02.
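The physics behind the transmission mode can be sketched with the Beer–Lambert law: transmitted intensity falls exponentially with density, so the density can be inverted from a measured transmission ratio. This is a minimal sketch; the mass attenuation coefficient, path length, and source intensity below are illustrative assumptions, not the paper's experimental setup.

```python
import math

def transmitted(I0, mu_m, rho, x):
    """Beer-Lambert transmission: I = I0 * exp(-mu_m * rho * x).
    mu_m: mass attenuation coefficient (cm^2/g), rho: density (g/cm^3), x: path (cm)."""
    return I0 * math.exp(-mu_m * rho * x)

def density_from_transmission(I, I0, mu_m, x):
    """Invert the Beer-Lambert law for the fluid density."""
    return -math.log(I / I0) / (mu_m * x)

# Illustrative values: ~0.0857 cm^2/g is roughly water's mass attenuation at 662 keV.
I0, mu_m, x = 1.0e6, 0.0857, 10.0
I = transmitted(I0, mu_m, rho=0.8, x=x)
rho_est = density_from_transmission(I, I0, mu_m, x)
```

In a Monte Carlo simulation the same relation emerges statistically from photon histories rather than from this closed-form inversion.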


2013 ◽  
Vol 22 (03) ◽  
pp. 1330004 ◽  
Author(s):  
TIMOTHY CLIFTON

We introduce the concept of back-reaction in relativistic cosmological modeling. Roughly speaking, this can be thought of as the difference between the large-scale behavior of an inhomogeneous cosmological solution of Einstein's equations, and a homogeneous and isotropic solution that is a best fit to either the average of observables or the dynamics in the inhomogeneous solution. This is sometimes paraphrased as "the effect that structure has on the large-scale evolution of the universe." Various approaches have been taken in the literature to try to understand back-reaction in cosmology. We provide a brief and critical summary of some of them, highlighting recent progress that has been made in each case.


2015 ◽  
Vol 11 (S319) ◽  
pp. 3-4
Author(s):  
Zsolt Bagoly ◽  
István I. Rácz ◽  
Lajos G. Balázs ◽  
L. Viktor Tóth ◽  
István Horváth

Abstract We studied the spatial distribution of starburst galaxies from the Millennium XXL database at z = 0.82. We examined the starburst distribution in the classical Millennium I simulation (De Lucia et al. 2006), using a semi-analytical model for the genesis of the galaxies. We simulated a starburst galaxy sample with a Markov chain Monte Carlo method. The connection between the homogeneity of large-scale structures and the distribution of starburst groups (Kofman & Shandarin 1998; Suhhonenko et al. 2011; Liivamägi et al. 2012; Park et al. 2012; Horváth et al. 2014, 2015) on a defined scale was also checked.


2010 ◽  
Vol 6 (S274) ◽  
pp. 175-177
Author(s):  
P. Procopio ◽  
A. De Rosa ◽  
C. Burigana ◽  
G. Umana ◽  
C. Trigilio

Abstract We propose a modeling study on the formation and evolution of the Circumstellar Envelopes (CSEs) of a sample of selected radio-loud objects, based on an innovative interaction between two codes widely used by the scientific community, but in different fields. CLOUDY (Ferland et al. 1998) is a widely used code to model the spectral energy distribution (SED) of objects characterized by clouds of gas heated and ionized by a central object. CosmoMC (Lewis & Bridle 2002), instead, is usually used for exploring cosmological parameter space. Here we investigate the exploitation of the sampling performance of the Markov chain Monte Carlo (MCMC) engine of CosmoMC to search for a best-fit model of the considered objects through the spectral synthesis capability of CLOUDY.


2021 ◽  
Vol 502 (3) ◽  
pp. 4009-4025
Author(s):  
Trystyn A M Berg ◽  
Michele Fumagalli ◽  
Valentina D’Odorico ◽  
Sara L Ellison ◽  
Sebastián López ◽  
...  

ABSTRACT We present the measured gas-phase metal column densities in 155 sub-damped Ly α systems (subDLAs) with the aim to investigate the contribution of subDLAs to the chemical evolution of the Universe. The sample was identified within the absorber-blind XQ-100 quasar spectroscopic survey over the redshift range 2.4 ≤ zabs ≤ 4.3. Using all available column densities of the ionic species investigated (mainly C iv, Si ii, Mg ii, Si iv, Al ii, Fe ii, C ii, and O i; in order of decreasing detection frequency), we estimate the ionization-corrected gas-phase metallicity of each system using Markov chain Monte Carlo techniques to explore a large grid of CLOUDY ionization models. Without accounting for ionization and dust depletion effects, we find that the H i-weighted gas-phase metallicity evolution of subDLAs is consistent with damped Ly α systems (DLAs). When ionization corrections are included, subDLAs are systematically more metal poor than DLAs (between ≈0.5σ and ≈3σ significance) by up to ≈1.0 dex over the redshift range 3 ≤ zabs ≤ 4.3. The correlation of gas-phase [Si/Fe] with metallicity in subDLAs appears to be consistent with that of DLAs, suggesting that the two classes of absorbers have a similar relative dust depletion pattern. As previously seen for Lyman limit systems, the gas-phase [C/O] in subDLAs remains consistently solar for all metallicities, indicating that both subDLAs and Lyman limit systems could trace carbon-rich ejecta, potentially in circumgalactic environments.


Author(s):  
Jing Ding ◽  
Yizhuang David Wang ◽  
Saqib Gulzar ◽  
Youngsoo Richard Kim ◽  
B. Shane Underwood

The simplified viscoelastic continuum damage model (S-VECD) has been widely accepted as a computationally efficient and rigorous mechanistic model to predict the fatigue resistance of asphalt concrete. It operates in a deterministic framework, but in actual practice there are multiple sources of uncertainty, such as specimen preparation and measurement errors, which need to be probabilistically characterized. In this study, a Bayesian inference-based Markov Chain Monte Carlo method is used to quantify the uncertainty in the S-VECD model. The dynamic modulus and cyclic fatigue test data from 32 specimens are used for parameter estimation and predictive envelope calculation of the dynamic modulus, damage characterization, and failure criterion models. These parameter distributions are then propagated to quantify the uncertainty in fatigue prediction. The predictive envelope for each model is further used to analyze the decrease in variance with the increase in the number of replicates. Finally, the proposed methodology is implemented to compare three asphalt concrete mixtures from standard testing. The major findings of this study are: (1) the parameters in the dynamic modulus and damage characterization models have relatively strong correlation, which indicates the necessity of Bayesian techniques; (2) the uncertainty of the damage characteristic curve for a single specimen propagated from parameter uncertainties of the dynamic modulus model is negligible compared to the difference in the replicates; (3) four replicates of the cyclic fatigue test are recommended considering the balance between the uncertainty of fatigue prediction and the testing efficiency; and (4) more replicates are needed to confidently detect the difference between different mixtures if their fatigue performance is close.
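The Bayesian step can be illustrated with a toy two-parameter version: a random-walk Metropolis sampler fits a power-law damage-type curve to noisy synthetic data, and a posterior predictive envelope is read off the chain. This is a minimal sketch under stated assumptions: the functional form C(S) = 1 - c1*S^c2, the noise level, and all numerical values are illustrative placeholders, not the calibrated S-VECD model.

```python
import numpy as np

def mh_2d(log_post, theta0, n_steps, step, seed=0):
    """Random-walk Metropolis in 2D; records the current state at every step."""
    rng = np.random.default_rng(seed)
    chain = np.empty((n_steps, 2))
    theta = np.asarray(theta0, float)
    lp = log_post(theta)
    for i in range(n_steps):
        prop = theta + step * rng.standard_normal(2)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain[i] = theta
    return chain

# Hypothetical damage-type curve C(S) = 1 - c1 * S**c2 with noisy synthetic data.
rng = np.random.default_rng(2)
S = np.linspace(1e3, 2e5, 40)
c1_true, c2_true, sigma = 5e-4, 0.55, 0.02
C_obs = 1 - c1_true * S**c2_true + sigma * rng.standard_normal(S.size)

def log_post(theta):
    c1, c2 = theta
    if c1 <= 0 or not (0 < c2 < 1):   # flat prior with physical bounds
        return -np.inf
    resid = C_obs - (1 - c1 * S**c2)
    return -0.5 * np.sum(resid**2) / sigma**2

chain = mh_2d(log_post, [1e-3, 0.5], 20000, step=np.array([5e-5, 0.01]))
# Posterior predictive envelope at one pseudo-strain level (burn-in discarded).
preds = 1 - chain[10000:, 0] * 1e5 ** chain[10000:, 1]
lo, hi = np.percentile(preds, [2.5, 97.5])
```

The strong c1–c2 correlation visible in such chains is the same phenomenon as finding (1) of the study: the parameters trade off along a ridge, which a point estimate alone would hide.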

