Verification of CENDL-3.2 Nuclear Data on VENUS-3 Shielding Benchmark by ARES Transport Code

2021, Vol 2021, pp. 1-13
Author(s): Jiaju Hu, Bin Zhang, Zhiwei Zong, Cong Liu, Yixue Chen

The recently released CENDL-3.2 nuclear data library is regarded as an important achievement of nuclear data research in China. To verify the applicability of the library to PWR shielding calculations and to analyze the influence of multigroup cross-section parameters on those calculations, the ARES-MACXS module is used to process a MATXS-format multigroup library based on CENDL-3.2 into multigroup working cross sections for PWR shielding calculations. The VENUS-3 experimental facility has a clear and complete geometry; it is often used to test the ability of advanced transport methods to calculate the fast neutron flux at the reactor pressure vessel (RPV) and to evaluate the accuracy of cross-section libraries. Different cross-section parameters are chosen for ARES to calculate the VENUS-3 benchmark, and the equivalent neutron fluxes of the 58Ni(n,p)58Co, 115In(n,n′)115mIn, and 27Al(n,α)24Na detectors are computed from the data provided in the benchmark report. The numerical results demonstrate that almost all relative deviations between the calculated and experimental results are within 20%, which satisfies the requirements of shielding calculations; CENDL-3.2 is therefore suitable for PWR shielding calculations. Comparison of the results obtained with different cross-section parameters indicates that multigroup cross-section parameters have a large effect on the transport results.
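The calculated-to-experimental (C/E) comparison described above can be sketched as follows; the flux values are illustrative placeholders, not data from the VENUS-3 benchmark report:

```python
# Minimal sketch of the calculated-vs-experimental comparison described
# above. The equivalent-flux values below are invented for illustration,
# NOT taken from the VENUS-3 benchmark report.

def relative_deviation(calculated, experimental):
    """Relative deviation of a calculated equivalent flux from experiment."""
    return (calculated - experimental) / experimental

# hypothetical detector readings: (calculated, experimental) equivalent fluxes
detectors = {
    "58Ni(n,p)58Co": (1.05e10, 1.00e10),
    "115In(n,n')115mIn": (0.92e10, 1.00e10),
    "27Al(n,a)24Na": (1.18e9, 1.00e9),
}

for name, (calc, exp) in detectors.items():
    dev = relative_deviation(calc, exp)
    within = abs(dev) <= 0.20  # the 20% acceptance band quoted above
    print(f"{name}: deviation = {dev:+.1%}, within 20%: {within}")
```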

2020, Vol 225, pp. 03009
Author(s): P. Haroková, M. Lovecký

One of the objectives of reactor dosimetry is to determine the activity of irradiated dosimeters placed on the reactor pressure vessel surface and to calculate the neutron flux at their position. The uncertainty of the calculation depends mainly on the choice of nuclear data library, in particular the cross sections used for neutron transport and the cross sections used as the response function for neutron activation. A number of libraries already exist and are still used in some applications, and new nuclear data libraries have recently been released. In this paper, we investigate the impact of the cross-section libraries on the activity of niobium, one of the popular materials used as a neutron fluence monitor. For this purpose, an MCNP6 model of a VVER-1000 was built, and the results of 14 commonly used cross-section libraries were compared. The possibility of using the IRDFF library in activation calculations was also considered. The results show good agreement among the new libraries, with the exception of the most recent, ENDF/B-VIII.0, which should be further validated.
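As a rough illustration of the point activation estimate underlying such a monitor comparison, a minimal sketch using the standard buildup expression A = N·σ·φ·(1 − e^(−λt)); all numerical values are assumed for illustration and are not taken from the paper:

```python
import math

# Sketch of a simple point activation estimate for a fluence monitor:
# A = N * sigma * phi * (1 - exp(-lambda * t)).
# All numerical values below are invented for illustration.

def activity(n_atoms, sigma_cm2, flux, half_life_s, irradiation_s):
    """Activity (Bq) of an activation monitor at the end of irradiation."""
    lam = math.log(2.0) / half_life_s      # decay constant (1/s)
    rate = n_atoms * sigma_cm2 * flux      # reactions per second
    return rate * (1.0 - math.exp(-lam * irradiation_s))

# hypothetical niobium monitor: 1e20 atoms, 1 mb spectrum-averaged
# cross section, 1e10 n/cm^2/s flux, ~16-year half-life, 1-year irradiation
a = activity(1e20, 1e-27, 1e10, 16.1 * 3.156e7, 3.156e7)
print(f"monitor activity at shutdown: {a:.3e} Bq")
```

The comparison in the paper amounts to repeating such a calculation with the reaction rate taken from each cross-section library in turn.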


2018, Vol 4, pp. 10
Author(s): Guillaume Ritter, Romain Eschbach, Richard Girieud, Maxime Soulard

CESAR is a French acronym for “simplified depletion applied to reprocessing”. The current version, 5.3, is the product of a 30-year cooperation with ORANO, co-owner of the code with CEA. The code can characterize several types of nuclear fuel assemblies, from the most common PWR power plants to unusual gas-cooled, graphite-moderated legacy research facilities. Each fuel type can also cover numerous composition ranges, such as UOX, MOX, LEU, or HEU. Such versatility comes from a broad catalog of cross-section libraries, each corresponding to a specific reactor and fuel-matrix design. CESAR goes beyond fuel characterization and can also evaluate the activation of structural materials. The cross-section libraries are generated with the most refined assembly- or core-level transport calculation schemes (CEA APOLLO2 or ERANOS), based on the European JEFF3.1.1 nuclear data base. Each new CESAR self-shielded cross-section library benefits from the most recent CEA recommendations for deterministic physics options. The resulting cross sections are organized as functions of burnup and initial fuel enrichment, which allows this costly process to be condensed into a series of Legendre polynomials. The final outcome is a fast, accurate, and compact CESAR cross-section library. Each library is fully validated, against a stochastic transport code (CEA TRIPOLI-4) when needed and against a reference depletion code (CEA DARWIN). Using CESAR does not require any of the neutron-physics expertise embedded in cross-section library generation. It is based on high-quality nuclear data (JEFF3.1.1 for ∼400 isotopes) and includes up-to-date Bateman equation solving algorithms, yet defining a CESAR computation case can be very straightforward. Most results are only three steps away from any beginner's ambition: initial composition, in-core depletion, and pool-decay scenario.
On top of this simple architecture, CESAR includes a portable graphical user interface that can be broadly deployed in R&D or industrial facilities. Aging facilities currently face decommissioning and dismantling issues. This path to the end of the nuclear fuel cycle requires a careful assessment of source terms in the fuel, core structures, and all parts of a facility that must be disposed of under “industrial nuclear” constraints. In that perspective, several CESAR cross-section libraries were constructed for early CEA research and testing reactors (RTRs). The aim of this paper is to describe how CESAR operates and how it can help these facilities with waste disposal, nuclear materials transport, or basic safety cases. The test case is based on the PHEBUS facility located at CEA − Cadarache.
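The burnup-condensation idea described above (tabulated cross sections compressed into a few Legendre coefficients) can be illustrated with a small sketch; the cross-section trend and burnup grid are invented for illustration and are not CESAR data:

```python
import numpy as np
from numpy.polynomial import legendre

# Sketch of the condensation step described above: a one-group cross
# section tabulated against burnup is compressed into a handful of
# Legendre coefficients. The cross-section shape below is invented.

burnup = np.linspace(0.0, 60.0, 25)            # GWd/t, assumed grid
sigma = 45.0 + 6.0 * np.exp(-burnup / 20.0)    # barns, invented trend

# map burnup onto [-1, 1], the natural domain of Legendre polynomials
x = 2.0 * burnup / burnup[-1] - 1.0

coeffs = legendre.legfit(x, sigma, deg=4)      # compact stored form
sigma_rebuilt = legendre.legval(x, coeffs)     # cheap evaluation at run time

max_err = np.max(np.abs(sigma_rebuilt - sigma))
print(f"{coeffs.size} coefficients replace {sigma.size} points, "
      f"max error {max_err:.2e} barns")
```

A smooth burnup dependence is what makes such a low-order polynomial representation both compact and accurate.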


2020, Vol 239, pp. 09001
Author(s): Zhigang Ge, Ruirui Xu, Haicheng Wu, Yue Zhang, Guochang Chen, et al.

A new version of the Chinese Evaluated Nuclear Data Library, CENDL-3.2, has been completed through the joint efforts of the CENDL working group. The library is constructed with the general purpose of providing high-quality nuclear data for modern nuclear science and engineering. In total, 272 nuclides from light to heavy are covered in CENDL-3.2, and the data for 134 nuclides are new or updated evaluations in the energy region from 10⁻⁵ eV to 20 MeV. The data for most of the key nuclides in nuclear applications, such as U, Pu, Th, and Fe, have been revised and improved, and various evaluation techniques have been developed to produce nuclear data of good quality. Moreover, model-dependent covariance data for the main reaction cross sections are added for 70 fission-product nuclides. To assess the accuracy of CENDL-3.2 in applications, the data have been tested against the criticality and shielding benchmarks collected in ENDITS-1.0.


2021, Vol 247, pp. 15003
Author(s): G. Valocchi, P. Archier, J. Tommasi

In this paper, we present a sensitivity analysis of beta effective to nuclear data for the UM17x17 experiment performed in the EOLE reactor. The work is carried out using the APOLLO3® platform. For the flux calculation, the standard two-step (lattice/core) approach is used. The delayed nuclear data are processed so that they can be used directly in the core calculation without going through the lattice step. We use the JEFF-3.1.1 nuclear data library for cross sections and delayed data. The calculation of k-effective and beta effective is validated against a TRIPOLI4® calculation, while the main sensitivities are validated against direct calculations. Finally, uncertainty propagation is performed using the COMAC-V2.0 covariance library.


2020, Vol 29 (08), pp. 2050052
Author(s): Dashty T. Akrawy, Ali H. Ahmed, E. Tel, A. Aydin, L. Sihver

An empirical formula to calculate the (n, p) reaction cross sections for 14.5 MeV neutrons for 183 target nuclei in the range [Formula: see text] is presented. Evaluated cross-section data from the TENDL nuclear data library were used to test and benchmark the formula. In this new formula, the nonelastic cross-section term is replaced by the atomic number Z, while the asymmetry-parameter-dependent exponential term is retained. The calculated results are compared with seven previously published formulae; the new formula agrees significantly better with the measured values than the earlier ones.


1999, Vol 71 (12), pp. 2309-2315
Author(s): N. E. Holden

The Westcott g-factors, which allow the user to determine reaction rates for nuclear reactions taking place at various temperatures, have been calculated using data from the evaluated nuclear data library ENDF/B-VI. The nuclides chosen have g-factors significantly different from unity and therefore yield reaction rates different from those of nuclides whose neutron-capture cross section varies as the reciprocal of the neutron velocity (1/v). Values are presented as a function of temperature up to 673.16 K (400 °C).
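For context (the definition is standard in the Westcott convention, though not spelled out in the abstract), g(T) is the ratio of the Maxwellian-averaged reaction rate to that of an ideal 1/v absorber with the same 2200 m/s cross section σ₀:

```latex
g(T) = \frac{1}{\sigma_0 v_0}\int_0^{\infty} \sigma(v)\, v\, p(v,T)\, \mathrm{d}v ,
\qquad
p(v,T) = \frac{4}{\sqrt{\pi}}\,\frac{v^{2}}{v_T^{3}}\, e^{-v^{2}/v_T^{2}} ,
\qquad
v_T = \sqrt{\frac{2kT}{m}} ,
```

where p(v,T) is the normalized Maxwellian speed distribution and v₀ = 2200 m/s. For a strict 1/v absorber, σ(v)·v = σ₀·v₀, so g(T) = 1 at all temperatures; only departures from 1/v behavior produce the g ≠ 1 values tabulated in the paper.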


2020, Vol 239, pp. 22008
Author(s): Eliot Party, Xavier Doligez, Philippe Dessagne, Maëlle Kerveno, Greg Henning

This paper shows how the Total Monte Carlo (TMC) method and Perturbation Theory (PT) can be applied to quantify the uncertainty due to nuclear data in static reactor calculations of integral parameters such as keff and βeff. The work focuses on thorium-fueled reactors and aims to rank the uncertainty contributions of different cross sections in criticality calculations. The consistency of the two methods is first studied. The cross-section sets used for the TMC method are processed to build the corresponding correlation matrices. These matrices are then combined with the sensitivity coefficients obtained from PT to produce global uncertainties, which are compared with those calculated by the TMC method. The good agreement between the two allows us to use correlation matrices from the state-of-the-art nuclear data library (JEFF-3.3), which provide insight into the uncertainty on keff and βeff for thorium-fueled Pressurized Water Reactors. Finally, the maximum cross-section uncertainties compatible with a target uncertainty on the integral parameters are estimated. It is shown that a strong reduction of the current uncertainty is needed, so new measurements and evaluations have to be performed.
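The combination of PT sensitivities with a covariance matrix described above is the classical "sandwich rule", var(k) = Sᵀ C S. A minimal sketch, with all sensitivities, standard deviations, and correlations invented for illustration (they are not JEFF-3.3 values):

```python
import numpy as np

# Sketch of the sandwich-rule combination described above:
# var(k) = S^T C S, where S holds sensitivity coefficients from
# perturbation theory and C is a covariance matrix (here built from
# standard deviations and a correlation matrix). All numbers invented.

# hypothetical relative sensitivities of keff to three cross sections
S = np.array([0.30, -0.15, 0.05])

# hypothetical relative standard deviations and correlation matrix
std = np.array([0.02, 0.05, 0.10])
corr = np.array([[1.0,  0.2,  0.0],
                 [0.2,  1.0, -0.1],
                 [0.0, -0.1,  1.0]])
C = np.outer(std, std) * corr      # covariance = std dyad x correlations

var_k = S @ C @ S                  # sandwich rule
print(f"relative uncertainty on keff: {np.sqrt(var_k):.4%}")
```

Working backwards through the same expression, with a target on var(k), is how maximum admissible cross-section uncertainties can be estimated.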

