An entity relationship approach to the event and detector Monte Carlo simulation in high-energy physics

1992 ◽  
Vol 105 (4) ◽  
pp. 595-602
Author(s):  
L. Cifarelli ◽  
G. La Commare ◽  
M. Marino
Author(s):  
Daniele Andreotti ◽  
Armando Fella ◽  
Eleonora Luppi

The BaBar experiment has been taking data since 1999 to examine the violation of charge and parity (CP) symmetry in high energy physics. Event simulation is a compute-intensive task, owing to the complexity of the Monte Carlo simulation implemented on the GEANT engine. The input data for the simulation (stored in ROOT format) fall into two categories: conditions data, which describe the detector status when the data were recorded, and background trigger data, which provide the noise signals necessary for a realistic simulation. In this chapter, the grid approach is applied to the BaBar production framework using the INFN-GRID network.
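The overlay of recorded background triggers on simulated signal events can be sketched as a toy in Python. The event representation and the `mix_event` helper below are hypothetical illustrations of the mixing idea, not BaBar's actual production framework:

```python
import random

def mix_event(signal_hits, background_triggers, rng):
    """Overlay one randomly chosen background trigger on a signal event.

    signal_hits: list of detector hits from the Monte Carlo signal.
    background_triggers: pool of recorded noise events (lists of hits).
    Returns the combined hit list that would feed reconstruction.
    """
    background = rng.choice(background_triggers)
    return signal_hits + background

rng = random.Random(42)
signal = [("tracker", 1.2), ("tracker", 3.4), ("calorimeter", 0.7)]
backgrounds = [[("tracker", 0.1)], [("calorimeter", 0.2), ("tracker", 0.5)]]
event = mix_event(signal, backgrounds, rng)
print(len(event))  # signal hits plus the overlaid background hits
```

Sampling the background pool independently per event is what makes the noise conditions statistically realistic across a production run.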


2014 ◽  
Vol 2014 ◽  
pp. 1-13 ◽  
Author(s):  
Florin Pop

Modern physics rests on both theoretical analysis and experimental validation. Complex scenarios such as subatomic dimensions, high energies, and very low absolute temperatures are frontiers for many theoretical models. Simulation with stable numerical methods is an excellent instrument for high-accuracy analysis, experimental validation, and visualization. High performance computing makes it possible to run such simulations at large scale and in parallel, but the volume of data generated by these experiments creates a new challenge for Big Data Science. This paper presents existing computational methods for high energy physics (HEP), analyzed from two perspectives: numerical methods and high performance computing. The computational methods presented are Monte Carlo methods and simulations of HEP processes, Markovian Monte Carlo, unfolding methods in particle physics, kernel estimation in HEP, and Random Matrix Theory as used in the analysis of particle spectra. All of these methods produce data-intensive applications, which introduce new challenges and requirements for ICT systems architecture, programming paradigms, and storage capabilities.
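As a minimal illustration of the Markovian Monte Carlo methods surveyed above, the sketch below implements a plain Metropolis sampler for a standard normal target. The step size, sample counts, and burn-in fraction are arbitrary choices for the illustration, not values taken from the paper:

```python
import math
import random

def metropolis_normal(n_samples, step=1.0, seed=0):
    """Sample a standard normal density with the Metropolis algorithm."""
    rng = random.Random(seed)
    x = 0.0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.uniform(-step, step)
        # Accept with probability min(1, pi(x')/pi(x)) for pi(x) = exp(-x^2/2);
        # the symmetric uniform proposal cancels in the ratio.
        if rng.random() < math.exp((x * x - proposal * proposal) / 2.0):
            x = proposal
        samples.append(x)
    return samples

chain = metropolis_normal(50_000)
burned = chain[5_000:]  # discard burn-in before estimating moments
mean = sum(burned) / len(burned)
var = sum((s - mean) ** 2 for s in burned) / len(burned)
print(round(mean, 2), round(var, 2))  # should approach 0 and 1
```

The same accept/reject kernel generalizes to the HEP use cases the paper discusses: only the target density changes.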


Author(s):  
Manuel Alejandro Segura ◽  
Julian Salamanca ◽  
Edwin Munevar

Specialized documentation, conceived on a pedagogical basis to provide scientific and technological training to teachers and researchers who are starting out in the analysis of high energy physics (HEP) experiments, is scarce. The lack of such material prolongs young scientists' learning process and raises the cost of experimental research. In this paper we present the Monte Carlo technique applied to simulating the threshold energy for producing the final-state particles of a specific two-body process (A + B → C + D), as a pedagogical environment in which to face an experimental analysis both computationally and conceptually. The active/interactive teaching-learning process presented here is expected to serve as an educational resource that reduces young scientists' learning curve and saves time and costs in HEP scientific research.
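For concreteness, the fixed-target threshold condition for A + B → C + D (target B at rest) requires the invariant mass √s to reach m_C + m_D, which gives a minimum total beam energy E_A = ((m_C + m_D)² − m_A² − m_B²) / (2 m_B). A short sketch, using π⁻ p → K⁰ Λ as an assumed worked example (the specific process is our choice for illustration, not necessarily the one used in the paper):

```python
import math

def threshold_beam_energy(m_a, m_b, m_c, m_d):
    """Minimum total lab energy of beam A (target B at rest) for A + B -> C + D.

    Derived from s = m_a^2 + m_b^2 + 2*m_b*E_a and the threshold
    condition sqrt(s) = m_c + m_d.  All masses and energies in GeV.
    """
    return ((m_c + m_d) ** 2 - m_a ** 2 - m_b ** 2) / (2.0 * m_b)

# pi- p -> K0 Lambda (associated strangeness production), PDG masses in GeV
m_pi, m_p, m_k0, m_lam = 0.13957, 0.93827, 0.49761, 1.11568
e_beam = threshold_beam_energy(m_pi, m_p, m_k0, m_lam)
p_beam = math.sqrt(e_beam ** 2 - m_pi ** 2)  # corresponding beam momentum
print(round(e_beam, 3), round(p_beam, 3))  # about 0.907 GeV, 0.897 GeV/c
```

A Monte Carlo treatment would then sample beam energies around this analytic value and count events that open the C + D channel.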


Author(s):  
Jack Singal ◽  
J. Brian Langton ◽  
Rafe Schindler

We discuss a novel use of the Geant4 simulation toolkit to model molecular transport in a vacuum environment, in the molecular flow regime. The Geant4 toolkit was originally developed by the high energy physics community to simulate the interactions of elementary particles within complex detector systems. Here its capabilities are utilized to model molecular vacuum transport in geometries where other techniques are impractical. The techniques are verified with an application representing a simple vacuum geometry that has been studied previously both analytically and by basic Monte Carlo simulation. We discuss the use of an application with a very complicated geometry, that of the Large Synoptic Survey Telescope camera cryostat, to determine probabilities of transport of contaminant molecules to optical surfaces where control of contamination is crucial.
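The "simple vacuum geometry studied previously both analytically and by basic Monte Carlo simulation" is typically a short cylindrical duct, whose molecular-flow transmission probability (the Clausing factor) can be estimated by test-particle Monte Carlo. The sketch below is our own minimal stand-in for such a calculation, not Geant4 code: molecules enter one end with a cosine-law angular distribution, travel in straight lines, and re-emit diffusely at every wall collision, as is standard in the molecular flow regime.

```python
import math
import random

def cosine_direction(normal, rng):
    """Sample a unit direction from the Lambert cosine law about `normal`."""
    nx, ny, nz = normal
    # Pick a helper axis not parallel to the normal, then build a frame.
    ax, ay, az = (0.0, 0.0, 1.0) if abs(nz) < 0.9 else (1.0, 0.0, 0.0)
    t1 = (ay * nz - az * ny, az * nx - ax * nz, ax * ny - ay * nx)
    norm = math.sqrt(sum(c * c for c in t1))
    t1 = tuple(c / norm for c in t1)
    t2 = (ny * t1[2] - nz * t1[1], nz * t1[0] - nx * t1[2], nx * t1[1] - ny * t1[0])
    u, phi = rng.random(), 2.0 * math.pi * rng.random()
    ct, st = math.sqrt(u), math.sqrt(1.0 - u)  # cos(theta) = sqrt(u)
    return tuple(st * math.cos(phi) * t1[i] + st * math.sin(phi) * t2[i]
                 + ct * normal[i] for i in range(3))

def transmission_probability(length, radius=1.0, n=20000, seed=1):
    """Monte Carlo estimate of the Clausing transmission factor of a tube."""
    rng = random.Random(seed)
    passed = 0
    for _ in range(n):
        # Uniform entry point on the inlet disk (z = 0), cosine-law direction.
        r = radius * math.sqrt(rng.random())
        a = 2.0 * math.pi * rng.random()
        x, y, z = r * math.cos(a), r * math.sin(a), 0.0
        d = cosine_direction((0.0, 0.0, 1.0), rng)
        while True:
            # Distance to the cylinder wall x^2 + y^2 = radius^2.
            qa = d[0] * d[0] + d[1] * d[1]
            qb = 2.0 * (x * d[0] + y * d[1])
            qc = x * x + y * y - radius * radius
            t_wall = math.inf
            if qa > 1e-12:
                disc = max(qb * qb - 4.0 * qa * qc, 0.0)
                t = (-qb + math.sqrt(disc)) / (2.0 * qa)
                if t > 1e-9:
                    t_wall = t
            # Distance to the end planes z = 0 (bounce back out) or z = length.
            t_end = math.inf
            if d[2] > 1e-12:
                t_end = (length - z) / d[2]
            elif d[2] < -1e-12:
                t_end = -z / d[2]
            if t_end <= t_wall:
                z += t_end * d[2]
                passed += 1 if z > length / 2.0 else 0  # exited far end?
                break
            # Wall collision: move there and re-emit diffusely inward.
            x, y, z = x + t_wall * d[0], y + t_wall * d[1], z + t_wall * d[2]
            d = cosine_direction((-x / radius, -y / radius, 0.0), rng)
    return passed / n

p = transmission_probability(length=1.0)
print(round(p, 3))  # Clausing's analytic value for L/R = 1 is about 0.672
```

Geant4's advantage over such hand-rolled transport is precisely the complicated-geometry case: the cryostat's surfaces, materials, and boundaries are described once and the same tracking machinery applies.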


2005 ◽  
Vol 20 (16) ◽  
pp. 3880-3882 ◽  
Author(s):  
DANIEL WICKE

The DØ experiment faces many challenges in providing access to large datasets for physicists on four continents. The strategy for solving these problems on worldwide distributed computing clusters is presented. Since the beginning of Run II of the Tevatron (March 2001), all Monte Carlo simulations for the experiment have been produced at remote systems. For data analysis, a system of regional analysis centers (RACs) was established to supply the associated institutes with data. This structure, which is similar to the tiered structure foreseen for the LHC, was used in Fall 2003 to reprocess all DØ data with a much improved version of the reconstruction software. This makes DØ the first running experiment to have implemented and operated all important computing tasks of a high energy physics experiment on worldwide distributed systems.

