Intermittency and related issues in 16O-Ag/Br collision at 200A GeV/c

2010 ◽  
Vol 88 (8) ◽  
pp. 575-584 ◽  
Author(s):  
M. K. Ghosh ◽  
P. K. Haldar ◽  
S. K. Manna ◽  
A. Mukhopadhyay ◽  
G. Singh

In this paper we present some results on the nonstatistical fluctuations in the one-dimensional (1-d) density distribution of singly charged produced particles in the framework of the intermittency phenomenon. A set of nuclear emulsion data on 16O-Ag/Br interactions at an incident momentum of 200A GeV/c was analyzed in terms of different statistical methods that are related to the self-similar fractal properties of the particle density function. A comparison of the present experiment with a similar experiment induced by 32S nuclei, and also with a set of results simulated by the Lund Monte Carlo code FRITIOF, is presented. A similar comparison between this experiment and a simulated data set generated from pseudorandom numbers is also made. The analysis reveals the presence of weak intermittency in the 1-d phase-space distribution of the produced particles. The results also indicate the occurrence of a nonthermal phase transition during the emission of final-state hadrons. Our results on factorial correlators suggest that short-range correlations are present in the angular distribution of charged hadrons, whereas those on oscillatory moments show that such correlations are not restricted to only a few particles. In almost all cases, the simulated results fail to replicate their experimental counterparts.
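The intermittency analysis described above is conventionally based on scaled factorial moments F_q(M), which for a self-similar (intermittent) source grow as a power of the number of phase-space bins M. As a hedged illustration (not the authors' code), the sketch below computes F_2(M) with numpy on synthetic flat, uncorrelated events; the event sample, window, and bin counts are illustrative assumptions:

```python
import numpy as np

def factorial_moment(events, q, M, lo=-2.0, hi=2.0):
    """Scaled factorial moment F_q(M) for a list of events, each an array of
    pseudorapidity values inside the window [lo, hi)."""
    edges = np.linspace(lo, hi, M + 1)
    num = den = 0.0
    for eta in events:
        counts, _ = np.histogram(eta, bins=edges)   # bin multiplicities n_m
        ff = np.ones(M)
        for k in range(q):                          # falling factorial n(n-1)...(n-q+1)
            ff *= np.clip(counts - k, 0, None)
        num += ff.mean()
        den += (len(eta) / M) ** q                  # (mean multiplicity per bin)^q
    return num / den

# flat, uncorrelated toy events: no intermittency, so F_q(M) should stay ~1
rng = np.random.default_rng(0)
events = [rng.uniform(-2.0, 2.0, size=rng.poisson(150)) for _ in range(500)]
Ms = [2, 4, 8, 16, 32]
F2 = [factorial_moment(events, 2, M) for M in Ms]
# intermittency index phi_q: slope of ln F_q versus ln M (power-law growth)
phi2 = np.polyfit(np.log(Ms), np.log(F2), 1)[0]
```

For this uncorrelated uniform source the fitted slope φ₂ is consistent with zero; a significantly positive slope on real data is the signature of intermittent fluctuations.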

2021 ◽  
Vol 2021 (5) ◽  
Author(s):  
M. Ablikim ◽  
M. N. Achasov ◽  
P. Adlarson ◽  
S. Ahmed ◽  
...  

Abstract The decays D → K−π+π+π− and D → K−π+π0 are studied in a sample of quantum-correlated $$ D\overline{D} $$ pairs produced through the process e+e− → ψ(3770) → $$ D\overline{D} $$, exploiting a data set collected by the BESIII experiment that corresponds to an integrated luminosity of 2.93 fb−1. Here D indicates a quantum superposition of a D0 and a $$ {\overline{D}}^0 $$ meson. By reconstructing one neutral charm meson in a signal decay, and the other in the same or a different final state, observables are measured that contain information on the coherence factors and average strong-phase differences of each of the signal modes. These parameters are critical inputs for the measurement of the angle γ of the Unitarity Triangle in B− → DK− decays at the LHCb and Belle II experiments. The coherence factors are determined to be RK3π = $$ {0.52}_{-0.10}^{+0.12} $$ and $$ {R}_{K{\pi \pi}^0} $$ = 0.78 ± 0.04, with average strong-phase differences $$ {\delta}_D^{K3\pi }=\left({167}_{-19}^{+31}\right){}^{\circ} $$ and $$ {\delta}_D^{K{\pi \pi}^0}=\left({196}_{-15}^{+14}\right){}^{\circ} $$, where the uncertainties include both statistical and systematic contributions. The analysis is repeated in four bins of the phase space of the D → K−π+π+π− decay, yielding results that will allow for a more sensitive measurement of γ with this mode, to which the BESIII inputs will contribute an uncertainty of around 6°.


Author(s):  
M D MacNeil ◽  
J W Buchanan ◽  
M L Spangler ◽  
E Hay

Abstract The objective of this study was to evaluate the effects of various data structures on the genetic evaluation for the binary phenotype of reproductive success. The data were simulated based on an existing pedigree and an underlying fertility phenotype with a heritability of 0.10. A data set of complete observations was generated for all cows. This data set was then modified mimicking the culling of cows when they first failed to reproduce, cows having a missing observation at either their second or fifth opportunity to reproduce as if they had been selected as donors for embryo transfer, and censoring records following the sixth opportunity to reproduce as in a cull-for-age strategy. The data were analyzed using a third-order polynomial random regression model. The EBV of interest for each animal was the sum of the age-specific EBV over the first 10 observations (reproductive success at ages 2-11). Thus, the EBV might be interpreted as the genetic expectation of the number of calves produced when a female is given ten opportunities to calve. Culling open cows resulted in the EBV for 3-year-old cows being reduced from 8.27 ± 0.03 when open cows were retained to 7.60 ± 0.02 when they were culled. The magnitude of this effect decreased as cows grew older when they first failed to reproduce and were subsequently culled. Cows that did not fail over the 11 years of simulated data had an EBV of 9.43 ± 0.01 and 9.35 ± 0.01 based on analyses of the complete data and the data in which cows that failed to reproduce were culled, respectively. Cows that had a missing observation for their second record had a significantly reduced EBV, but the corresponding effect at the fifth record was negligible. The current study illustrates that culling and management decisions, and particularly those that impact the beginning of the trajectory of sustained reproductive success, can influence both the magnitude and accuracy of resulting EBV.


2021 ◽  
Vol 4 (1) ◽  
pp. 251524592095492
Author(s):  
Marco Del Giudice ◽  
Steven W. Gangestad

Decisions made by researchers while analyzing data (e.g., how to measure variables, how to handle outliers) are sometimes arbitrary, without an objective justification for choosing one alternative over another. Multiverse-style methods (e.g., specification curve, vibration of effects) estimate an effect across an entire set of possible specifications to expose the impact of hidden degrees of freedom and/or obtain robust, less biased estimates of the effect of interest. However, if specifications are not truly arbitrary, multiverse-style analyses can produce misleading results, potentially hiding meaningful effects within a mass of poorly justified alternatives. So far, a key question has received scant attention: How does one decide whether alternatives are arbitrary? We offer a framework and conceptual tools for doing so. We discuss three kinds of a priori nonequivalence among alternatives—measurement nonequivalence, effect nonequivalence, and power/precision nonequivalence. The criteria we review lead to three decision scenarios: Type E decisions (principled equivalence), Type N decisions (principled nonequivalence), and Type U decisions (uncertainty). In uncertain scenarios, multiverse-style analysis should be conducted in a deliberately exploratory fashion. The framework is discussed with reference to published examples and illustrated with the help of a simulated data set. Our framework will help researchers reap the benefits of multiverse-style methods while avoiding their pitfalls.
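The multiverse idea the authors build on — re-estimating one effect under every defensible combination of analytic choices — can be sketched in a few lines. The toy below is a hypothetical specification curve in numpy, not an example from the article: two arbitrary decisions (outlier cut-off, covariate inclusion) define six specifications, and the sorted effect estimates form the curve. All names, cut-offs, and the simulated data are illustrative assumptions:

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
n = 300
x = rng.normal(size=n)
z = rng.normal(size=n)                         # a hypothetical covariate
y = 0.5 * x + 0.3 * z + rng.normal(size=n)     # true effect of x is 0.5
y[::50] += 6.0                                 # a handful of gross outliers

def effect_of_x(outlier_cut, use_covariate):
    """Effect estimate under one specification (outlier rule x covariate set)."""
    keep = np.abs(y - y.mean()) < outlier_cut * y.std()
    cols = [np.ones(keep.sum()), x[keep]]
    if use_covariate:
        cols.append(z[keep])
    X = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(X, y[keep], rcond=None)
    return beta[1]                             # OLS slope on x

specs = list(itertools.product([2.0, 2.5, 3.0], [False, True]))
estimates = sorted(effect_of_x(c, cov) for c, cov in specs)
median_effect = estimates[len(estimates) // 2]
```

In the framework's terms, the two decisions here would first have to be classified (Type E, N, or U) before the six estimates could legitimately be pooled into one curve.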


2020 ◽  
Vol 2020 (12) ◽  
Author(s):  
Roberto Mondini ◽  
Ulrich Schubert ◽  
Ciaran Williams

Abstract In this paper we present a fully differential calculation of the contributions to the partial widths H → $$ b\overline{b} $$ and H → $$ c\overline{c} $$ that are sensitive to the top quark Yukawa coupling yt to order $$ {\alpha}_s^3 $$. These contributions first enter at order $$ {\alpha}_s^2 $$ through terms proportional to ytyq (q = b, c). At order $$ {\alpha}_s^3 $$, corrections to the mixed terms are present, as well as a new contribution proportional to $$ {y}_t^2 $$. Our results retain the mass of the final-state quarks throughout, while the top quark is integrated out, resulting in an effective field theory (EFT). Our results are implemented in a Monte Carlo code allowing for the application of arbitrary final-state selection cuts. As an example, we present differential distributions for observables in the Higgs boson rest frame using the Durham jet clustering algorithm. We find that the total impact of the top-induced (i.e. EFT) pieces is sensitive to the nature of the final-state cuts, particularly b-tagging and c-tagging requirements. For bottom quarks, the EFT pieces contribute to the total width (and differential distributions) at around the percent level. The impact is much larger for the H → $$ c\overline{c} $$ channel, with effects as large as 15%. We show, however, that their impact can be significantly reduced by the application of jet-tagging selection cuts.


2008 ◽  
Vol 20 (5) ◽  
pp. 1211-1238 ◽  
Author(s):  
Gaby Schneider

Oscillatory correlograms are widely used to study neuronal activity that shows a joint periodic rhythm. In most cases, the statistical analysis of cross-correlation histogram (CCH) features is based on the null model of independent processes, and the resulting conclusions about the underlying processes remain qualitative. Therefore, we propose a spike train model for synchronous oscillatory firing activity that directly links characteristics of the CCH to parameters of the underlying processes. The model focuses particularly on asymmetric central peaks, which differ in slope and width on the two sides. Asymmetric peaks can be associated with phase offsets in the (sub-)millisecond range. These spatiotemporal firing patterns can be highly consistent across units yet invisible in the underlying processes. The proposed model includes a single temporal parameter that accounts for this peak asymmetry. The model provides approaches for the analysis of oscillatory correlograms, taking into account dependencies and nonstationarities in the underlying processes. In particular, the auto- and cross-correlograms can be investigated in a joint analysis because they depend on the same spike train parameters. Particular temporal interactions, such as the degree to which different units synchronize in a common oscillatory rhythm, can also be investigated. The analysis is demonstrated by application to a simulated data set.
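The CCH at the center of this model is simply a histogram of spike-time lags between two units. As a minimal sketch (not the authors' model), the code below builds a CCH in numpy for two simulated units locked to a common 10 Hz rhythm with unit 2 delayed by 3 ms; the rhythm, jitter, and durations are illustrative assumptions:

```python
import numpy as np

def cross_correlogram(t1, t2, max_lag=0.05, bin_width=0.001):
    """CCH: histogram of lags (t2 spike time - t1 spike time), in seconds,
    over all spike pairs within +/- max_lag."""
    lags = []
    for s in t1:
        d = t2 - s
        lags.append(d[np.abs(d) <= max_lag])
    lags = np.concatenate(lags)
    edges = np.arange(-max_lag, max_lag + bin_width / 2, bin_width)
    counts, _ = np.histogram(lags, bins=edges)
    centers = edges[:-1] + bin_width / 2
    return centers, counts

# two units locked to a common 10 Hz rhythm; unit 2 lags unit 1 by 3 ms
rng = np.random.default_rng(2)
cycle_starts = np.arange(0.0, 100.0, 0.1)            # 10 Hz cycles over 100 s
t1 = np.sort(cycle_starts + rng.normal(0, 0.005, cycle_starts.size))
t2 = np.sort(cycle_starts + 0.003 + rng.normal(0, 0.005, cycle_starts.size))
centers, counts = cross_correlogram(t1, t2)
peak_lag = centers[np.argmax(counts)]                # expected near +3 ms
```

A phase offset between the units shows up as a displaced (and, with asymmetric jitter, asymmetric) central peak — exactly the CCH feature the proposed model parameterizes.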


2004 ◽  
Vol 2004 (8) ◽  
pp. 421-429 ◽  
Author(s):  
Souad Assoudou ◽  
Belkheir Essebbar

This note is concerned with Bayesian estimation of the transition probabilities of a binary Markov chain observed from heterogeneous individuals. The model is founded on Jeffreys' prior, which allows the transition probabilities to be correlated. The Bayesian estimator is approximated by means of Markov chain Monte Carlo (MCMC) techniques. The performance of the Bayesian estimates is illustrated by analyzing a small simulated data set.
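The paper's heterogeneous, correlated setting requires full MCMC, but the homogeneous single-chain building block can be sketched directly: with independent Jeffreys priors Beta(1/2, 1/2) on each row of the 2×2 transition matrix, the posterior given the transition counts is conjugate, so posterior draws are immediate (this would be one Gibbs step of a larger sampler). The chain length and true probabilities below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

# simulate a binary Markov chain with known transition probabilities
p01_true, p10_true = 0.3, 0.6            # P(0 -> 1), P(1 -> 0)
T = 5000
x = np.empty(T, dtype=int)
x[0] = 0
for t in range(1, T):
    p = p01_true if x[t - 1] == 0 else 1.0 - p10_true
    x[t] = rng.random() < p

# transition counts n[i, j] = #{t : x[t-1] = i, x[t] = j}
n = np.zeros((2, 2))
np.add.at(n, (x[:-1], x[1:]), 1)

# Jeffreys prior Beta(1/2, 1/2) on each row is conjugate: sample the posterior
draws_p01 = rng.beta(n[0, 1] + 0.5, n[0, 0] + 0.5, size=10_000)
draws_p10 = rng.beta(n[1, 0] + 0.5, n[1, 1] + 0.5, size=10_000)
post_mean_p01 = draws_p01.mean()
post_mean_p10 = draws_p10.mean()
```

With heterogeneous individuals and correlated transition probabilities, as in the note, this conjugacy is lost and the row parameters must instead be updated jointly within an MCMC scheme.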


2021 ◽  
Vol 29 ◽  
pp. 287-295
Author(s):  
Zhiming Zhou ◽  
Haihui Huang ◽  
Yong Liang

BACKGROUND: In genome research, it is particularly important to identify molecular biomarkers or signaling pathways related to phenotypes. The logistic regression model is a powerful discrimination method that offers a clear statistical interpretation and yields classification probabilities for the class labels. However, it cannot by itself perform biomarker selection. OBJECTIVE: The aim of this paper is to equip the model with an efficient gene-selection capability. METHODS: In this paper, we propose a new penalized logsum network-based regularization logistic regression model for gene selection and cancer classification. RESULTS: Experimental results on simulated data sets show that our method is effective in the analysis of high-dimensional data. For a large data set, the proposed method achieved 89.66% (training) and 90.02% (testing) AUC performances, which are, on average, 5.17% (training) and 4.49% (testing) better than mainstream methods. CONCLUSIONS: The proposed method can be considered a promising tool for gene selection and cancer classification of high-dimensional biological data.
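The paper's penalty also encodes network (pathway) structure, which is omitted here; as a hedged toy, the sketch below shows only the log-sum sparsity term, fitted by plain gradient descent on simulated data. The penalty weight λ, smoothing ε, selection threshold, and problem sizes are all illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 200, 50                                   # samples x "genes" (illustrative)
X = rng.normal(size=(n, p))
w_true = np.zeros(p)
w_true[:3] = [2.0, -1.5, 1.0]                    # three informative genes
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ w_true))).astype(float)

lam, eps, lr = 0.01, 0.1, 0.1                    # penalty weight, smoothing, step
w = np.zeros(p)
for _ in range(2000):
    mu = 1.0 / (1.0 + np.exp(-X @ w))            # predicted class probabilities
    grad = X.T @ (mu - y) / n                    # logistic-loss gradient
    grad += lam * np.sign(w) / (eps + np.abs(w)) # gradient of lam * sum log(eps + |w_j|)
    w -= lr * grad

selected = np.flatnonzero(np.abs(w) > 0.1)       # crude selection threshold
```

The log-sum term penalizes small coefficients much more sharply than the lasso (its effective weight near zero is λ/ε), which is what drives noise genes toward zero while leaving informative ones nearly unshrunk.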


Genetics ◽  
2001 ◽  
Vol 157 (3) ◽  
pp. 1369-1385 ◽  
Author(s):  
Z W Luo ◽  
C A Hackett ◽  
J E Bradshaw ◽  
J W McNicol ◽  
D Milbourne

Abstract This article presents methodology for the construction of a linkage map in an autotetraploid species, using either codominant or dominant molecular markers scored on two parents and their full-sib progeny. The steps of the analysis are as follows: identification of parental genotypes from the parental and offspring phenotypes; testing for independent segregation of markers; partition of markers into linkage groups using cluster analysis; maximum-likelihood estimation of the phase, recombination frequency, and LOD score for all pairs of markers in the same linkage group using the EM algorithm; ordering the markers and estimating distances between them; and reconstructing their linkage phases. The information from different marker configurations about the recombination frequency is examined and found to vary considerably, depending on the number of different alleles, the number of alleles shared by the parents, and the phase of the markers. The methods are applied to a simulated data set and to a small set of SSR and AFLP markers scored in a full-sib population of tetraploid potato.
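One ingredient of the pipeline above — maximum-likelihood estimation of recombination frequency with unknown phase via the EM algorithm — can be illustrated in miniature. The sketch below (a generic diploid toy, far simpler than the autotetraploid case in the article) treats the parental linkage phase as the latent variable: under one phase the recombinant count is k of n gametes, under the opposite phase it is n − k. The counts are illustrative assumptions:

```python
def em_recombination(k, n, iters=200):
    """EM estimate of the recombination fraction r between two markers when
    the parental linkage phase is unknown. Latent variable: the true phase."""
    r = 0.25                                   # initial guess, 0 < r < 0.5
    w = 0.5
    for _ in range(iters):
        # E-step: posterior probability of phase 1 (flat prior over phases)
        l1 = r ** k * (1.0 - r) ** (n - k)
        l2 = r ** (n - k) * (1.0 - r) ** k
        w = l1 / (l1 + l2)
        # M-step: expected recombinant fraction, averaging over the phase
        r = (w * k + (1.0 - w) * (n - k)) / n
        r = min(r, 0.4999)                     # stay in the linked range
    return r, w

r_hat, phase_prob = em_recombination(k=12, n=100)
```

With clearly linked markers the phase posterior collapses to near-certainty within a few iterations and r converges to the recombinant fraction under that phase; in the tetraploid setting the same E/M alternation runs over a much larger space of parental genotype configurations.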


2021 ◽  
Vol 70 (10) ◽  
Author(s):  
Kazuyoshi Gotoh ◽  
Makoto Miyoshi ◽  
I Putu Bayu Mayura ◽  
Koji Iio ◽  
Osamu Matsushita ◽  
...  

The options available for treating infections with carbapenemase-producing Enterobacteriaceae (CPE) are limited; with the increasing threat of these infections, new treatments are urgently needed. Biapenem (BIPM) is a carbapenem, and only limited data confirming its in vitro killing effect against CPE are available. In this study, we examined the minimum inhibitory concentrations (MICs) and minimum bactericidal concentrations (MBCs) of BIPM for 14 IMP-1-producing Enterobacteriaceae strains isolated from the Okayama region in Japan. The MICs against almost all the isolates were lower than 0.5 µg ml−1, indicating susceptibility to BIPM, while BIPM was confirmed to be only bacteriostatic against approximately half of the isolates. However, initial killing to a 99.9 % reduction was observed in seven out of eight strains in a time–kill assay. Despite the small data set, we conclude that the in vitro efficacy of BIPM suggests the drug could be a new therapeutic option against infections with IMP-producing CPE.


Author(s):  
Valentin Raileanu

The article briefly describes the history and fields of application of the theory of extreme values, including climatology. The data format, the Generalized Extreme Value (GEV) probability distributions used with the Block Maxima approach, the Generalized Pareto (GP) distributions used with the Peaks Over Threshold (POT) approach, and the analysis methods are presented. The distribution parameters are estimated using the Maximum Likelihood Estimation (MLE) method. The installation of the free R software, the minimum set of required commands, and the in2extRemes graphical package (GUI) are described. As an example, the results of a GEV analysis of a simulated data set in in2extRemes are presented.

