Radioactive Source Localisation via Projective Linear Reconstruction

Sensors ◽  
2021 ◽  
Vol 21 (3) ◽  
pp. 807
Author(s):  
Samuel R. White ◽  
Kieran T. Wood ◽  
Peter G. Martin ◽  
Dean T. Connor ◽  
Thomas B. Scott ◽  
...  

Radiation mapping, through the detection of ionising gamma-ray emissions, is an important technique used across the nuclear industry to characterise environments over a range of length scales. In complex scenarios, the precise localisation and activity of radiological sources becomes difficult to determine due to the inability to directly image gamma photon emissions. This is a result of the potentially unknown number of sources combined with uncertainties associated with the source-detector separation, causing an apparent ‘blurring’ of the as-detected radiation field relative to the true distribution. Accurate delimitation of distinct sources is important for decommissioning, waste processing, and homeland security. Therefore, methods for estimating the precise, ‘true’ solution from radiation mapping measurements are required. Herein is presented a computational method of enhanced radiological source localisation from scanning survey measurements conducted with a robotic arm. The procedure uses an experimentally derived Detector Response Function (DRF) to perform a randomised-Kaczmarz deconvolution from robotically acquired radiation field measurements. The performance of the process is assessed on radiation maps obtained from a series of emulated waste processing scenarios. The results demonstrate that a Projective Linear Reconstruction (PLR) algorithm can successfully locate a series of point sources to within 2 cm of the true locations, corresponding to resolution enhancements of between 5× and 10×.
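The PLR algorithm's implementation details are beyond this abstract, but the randomised-Kaczmarz deconvolution it builds on can be sketched. The following is a minimal illustration, with a narrow 1-D Gaussian kernel standing in for the experimentally derived DRF; the grid size, kernel width and iteration count are arbitrary choices for the toy problem, not values from the paper.

```python
import numpy as np

def randomized_kaczmarz(A, b, iterations=20000, seed=0):
    """Solve A x ~= b by randomised Kaczmarz row projections.

    Rows are sampled with probability proportional to their squared
    norm (the Strohmer-Vershynin sampling scheme).
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms = np.einsum("ij,ij->i", A, A)
    x = np.zeros(n)
    for i in rng.choice(m, size=iterations, p=row_norms / row_norms.sum()):
        # Project the current iterate onto the hyperplane A[i] @ x = b[i].
        x += (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x

# Toy deconvolution: blur two point sources with a narrow Gaussian
# "detector response" and recover their positions from the blurred field.
n = 50
grid = np.arange(n)
A = np.exp(-0.5 * ((grid[:, None] - grid[None, :]) / 0.8) ** 2)  # stand-in DRF
x_true = np.zeros(n)
x_true[[12, 37]] = 1.0  # two point sources
b = A @ x_true
x_est = randomized_kaczmarz(A, b)
print(sorted(np.argsort(x_est)[-2:]))  # bins of the two strongest peaks
```

Each step projects the iterate onto the constraint defined by one randomly chosen measurement, which is what makes the method attractive for large scanning-survey datasets.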

1994 ◽  
Vol 38 (01) ◽  
pp. 42-51
Author(s):  
Kwang June Bai ◽  
Jae Hoon Han

An application is described of the localized finite-element method to a steady nonlinear free-surface flow past a submerged two-dimensional hydrofoil at an arbitrary angle of attack. Earlier investigations with the linear free-surface boundary condition have shown some disagreement between the computed results and the experimental measurements for the cases of shallow submergence. The aim of this paper is to investigate the effect of the nonlinear free-surface condition for the cases where the linear results show disagreement with the experimental measurements. The computational method of solution is the localized finite-element method based on classical Hamilton's principle. In the present study, a notable step is introduced in the matching procedure between the fully nonlinear and the linear subdomains. The numerical results of wave resistance, lift force, and circulation strength are presented. The computed pressure distributions on the hydrofoil and wave profiles are shown and compared with the experimental measurements and also with the linear computational results. The present computed results show better agreement with the experimental results. In some cases, however, a difficulty in the convergence of the iterative solution procedure was experienced. This difficulty may be due to limits on the range of existence of the true solution in the potential-flow formulation.


1964 ◽  
Vol 14 (6) ◽  
pp. 603-605
Author(s):  
O. I. Leipunskii ◽  
L. R. Kimel' ◽  
A. M. Panchenko

Author(s):  
Philippe Z Yao ◽  
Jason Dexter ◽  
Alexander Y Chen ◽  
Benjamin R Ryan ◽  
George N Wong

We use the public code ebhlight to carry out 3D radiative general relativistic magnetohydrodynamics (GRMHD) simulations of accretion on to the supermassive black hole in M87. The simulations self-consistently evolve a frequency-dependent Monte Carlo description of the radiation field produced by the accretion flow. We explore two limits of accumulated magnetic flux at the black hole (SANE and MAD), each coupled to several sub-grid prescriptions for electron heating that are motivated by models of turbulence and magnetic reconnection. We present convergence studies for the radiation field and study its properties. We find that the near-horizon photon energy density is an order of magnitude higher than is predicted by simple isotropic estimates from the observed luminosity. The radially dependent photon momentum distribution is anisotropic and can be modeled by a set of point-sources near the equatorial plane. We draw properties of the radiation and magnetic field from the simulation and feed them into an analytic model of gap acceleration to estimate the very high energy (VHE) gamma-ray luminosity from the magnetized jet funnel, assuming that a gap is able to form. We find luminosities of $\rm \sim 10^{41} \, erg \, s^{-1}$ for MAD models and $\rm \sim 2\times 10^{40} \, erg \, s^{-1}$ for SANE models, which are comparable to measurements of M87’s VHE flares. The time-dependence seen in our calculations is insufficient to explain the flaring behaviour. Our results provide a step towards bridging theoretical models of near-horizon properties seen in black hole images with the VHE activity of M87.


Author(s):  
Iran Hassanzadeh ◽  
Armin Allahverdy ◽  
Okhtay Jahanbakhsh ◽  
Alireza Khorrami Moghaddam

Purpose: In some gamma spectroscopy experiments neutrons may also be present, so, depending on the experimental conditions, the recorded gamma spectrum can be influenced by the presence of neutrons. Materials and Methods: In this study, a NaI(Tl) (63 mm × 63 mm) detector is used to investigate the effects of fast neutrons on the spectrum of gamma photons. The radiation source used in these experiments is made up of two point sources: an AmBe (50 mCi) neutron source and a 137Cs (10 mCi) gamma source. Results: Results were determined through both measurements and Monte Carlo simulation (MCNPX) under two different experimental conditions and were compared. When the detector is placed at an angle to the source, gamma photon energy peaks resulting from inelastic interactions of the fast neutrons with the detector materials and surrounding materials in the energy range of 0.1–0.9 MeV are clearly visible in the main gamma spectrum. These results can be used to optimize industrial tomography experiments carried out with NaI(Tl) scintillators. Conclusion: The results also show that the detection of fast neutrons with a NaI(Tl) scintillator is possible, albeit with low efficiency.


Sensors ◽  
2021 ◽  
Vol 21 (17) ◽  
pp. 5732
Author(s):  
Graeme Turkington ◽  
Kelum A. A. Gamage ◽  
James Graham

The in-situ characterisation of strontium-90 contamination of groundwater at nuclear decommissioning sites would represent a novel and cost-saving technology for the nuclear industry. However, beta particles are emitted over a continuous spectrum, and it is difficult to identify radionuclides due to the overlap of their spectra and the lack of characteristic features. This can be resolved by using predictive modelling to perform a maximum-likelihood estimation of the radionuclides present in a beta spectrum obtained with a semiconductor detector. This is achieved using linear least-squares regression, relating experimental data to simulated detector response data. In this case, by simulating a groundwater borehole scenario and the deployment of a cadmium telluride detector within it, it is demonstrated that it is possible to identify the presence of 90Sr, 90Y, 137Cs and 235U decay. It is determined that the optimal thickness of the CdTe detector for this technique is in the range of 0.1 to 1 mm. The influence of suspended solids in the groundwater is also investigated. The average and maximum concentrations of suspended particles found at Sellafield do not significantly degrade the results. It is found that applying the linear regression over two energy windows improves the estimate of 90Sr activity in a mixed groundwater source. These results provide validation for the ability of in-situ detectors to determine the activity of 90Sr in groundwater in a timely and cost-effective manner.
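The fitting step described above can be sketched as least-squares unmixing of a measured spectrum against per-nuclide basis spectra. In this illustration the bases are crude synthetic beta shapes rather than simulated CdTe detector responses, only three of the four reported nuclides are included, and the activity values are invented for the toy problem.

```python
import numpy as np

rng = np.random.default_rng(1)

# Crude synthetic beta spectra (columns of the basis), one per nuclide,
# using approximate endpoint energies; real work would use simulated
# detector response data instead.
energies = np.linspace(0.0, 2.3, 200)  # MeV

def beta_shape(e_max):
    s = np.where(energies < e_max, (e_max - energies) * energies, 0.0)
    return s / s.sum()

basis = np.column_stack([beta_shape(0.546),   # 90Sr
                         beta_shape(2.280),   # 90Y
                         beta_shape(1.176)])  # 137Cs

true_activity = np.array([3.0, 3.0, 1.5])    # invented relative activities
measured = basis @ true_activity + rng.normal(0.0, 1e-4, len(energies))

# Least-squares estimate of the per-nuclide activities from the mixture.
est, *_ = np.linalg.lstsq(basis, measured, rcond=None)
print(np.round(est, 2))
```

The same fit restricted to selected energy windows is what the abstract reports as improving the 90Sr estimate, since the windows can be chosen to separate the overlapping continua.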


Author(s):  
James McKinney ◽  
Melanie Brownridge

The NDA has a responsibility to ensure decommissioning activities are sufficiently technically underpinned and that appropriate Research and Development (R&D) is carried out. The NDA funds R&D either indirectly via the Site Licence Companies (SLCs) or directly. The main component of directly funded R&D is the NDA Direct Research Portfolio (DRP). The DRP is split into four framework areas:
• University Interactions;
• Waste Processing;
• Material Characterisation;
• Actinide and Strategic Nuclear Materials.
These four framework areas were competed through an Official Journal of the European Union (OJEU) process in 2008. Although all four areas involve waste management, Waste Processing and Material Characterisation specifically deal with Higher Activity Waste (HAW) management issues. The Waste Processing area was awarded to three groups: (i) the National Nuclear Laboratory (NNL), (ii) a consortium led by Hyder Consulting Ltd, and (iii) a consortium led by UKAEA Ltd. The Material Characterisation area was awarded to three groups: (i) NNL, (ii) Serco, and (iii) a consortium led by UKAEA Ltd. The initial work in Waste Processing and Material Characterisation was centred on establishing a forward research programme to address the generic needs of the UK civil nuclear industry and the NDA strategic drivers for waste management and land quality. This has been achieved by the four main framework contractors from the Waste Processing and Material Characterisation areas working together with the NDA to identify the key research themes and begin the development of the NDA’s HAW Management Research Programme. The process also involves active engagement with both industry and regulators via the Nuclear Waste Research Forum (NWRF).
The NDA’s HAW Management Research Programme includes a number of themes:
• Optimisation of Interim Store Operation and Design;
• Alternative Waste Encapsulants;
• Waste Package Integrity;
• Alternative Waste Treatment Methods;
• Alternative Storage and Disposal Options;
• Integrated Waste Management Solutions;
• Materials Characterisation.
The NDA, with additional support from its framework contractors and the Nuclear Waste Research Forum, is now developing a more detailed scope for each research theme and prioritising the research projects to ensure alignment with its strategic development programme.


Sensors ◽  
2018 ◽  
Vol 18 (11) ◽  
pp. 3784 ◽  
Author(s):  
Wenrui Gao ◽  
Weidong Wang ◽  
Hongbiao Zhu ◽  
Guofu Huang ◽  
Dongmei Wu ◽  
...  

This paper addresses a detection problem in which sparse measurements are used to estimate source parameters in a mixed multi-modal radiation field. Owing to limited dimensional scalability and their unimodal character, most existing algorithms fail to detect multiple point sources gathered in narrow regions, especially with no prior knowledge of intensity or source number. The proposed Peak Suppressed Particle Filter (PSPF) method combines a multi-layer particle filter, mean-shift clustering and a peak-suppression correction to address the major challenges faced by existing algorithms. First, the algorithm achieves sequential estimation of multiple point sources in a cross-mixed radiation field by particle filtering with intensity-peak suppression, whereas existing algorithms can identify only single or spatially separated point sources. Second, the number of radioactive sources can be determined in a non-parametric manner, since invalid particle swarms disperse automatically; in contrast, existing algorithms either require prior information or rely on expensive statistical estimation and comparison. Additionally, to improve prediction stability and convergence, a distance-correction module and a configuration-maintenance mechanism are developed to sustain the stability of the multimodal predictions. Finally, simulations and physical experiments covering different noise levels, the non-parametric property, processing time and large-scale estimation validate the effectiveness and robustness of the PSPF algorithm.
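As a point of reference for the baseline that PSPF extends, a minimal bootstrap particle filter for a single point source can be sketched as follows. The inverse-square count model, field dimensions, particle count and the assumption of a known source intensity are all simplifications invented for this illustration, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical single point source (position in metres) and known intensity.
src = np.array([6.0, 3.0])
intensity = 100.0

def expected_counts(detector_xy, source_xy, inten):
    # Inverse-square count model; the +0.25 offset keeps rates finite.
    d2 = np.sum((detector_xy - source_xy) ** 2, axis=-1) + 0.25
    return inten / d2

# Sparse survey: a coarse 5 x 5 grid of detector positions, Poisson noise.
dets = np.array([[x, y] for x in range(0, 10, 2) for y in range(0, 10, 2)], float)
counts = rng.poisson(expected_counts(dets, src, intensity))

# Bootstrap particle filter: candidate positions weighted by the Poisson
# log-likelihood of the observed counts, then resampled with jitter.
particles = rng.uniform(0.0, 10.0, size=(5000, 2))
for _ in range(10):
    lam = expected_counts(dets[None, :, :], particles[:, None, :], intensity)
    log_w = np.sum(counts * np.log(lam) - lam, axis=1)
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    particles = particles[idx] + rng.normal(0.0, 0.1, particles.shape)

print(np.round(particles.mean(axis=0), 1))  # posterior mean, near the source
```

PSPF's contribution lies precisely where this sketch breaks down: multiple closely spaced sources, unknown intensities and an unknown source count.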


2001 ◽  
Vol 10 (02) ◽  
pp. 245-259 ◽  
Author(s):  
D. I. NOVIKOV ◽  
P. NASELSKY ◽  
H. E. JØRGENSEN ◽  
P. R. CHRISTENSEN ◽  
I. D. NOVIKOV ◽  
...  

We propose a power filter G_p for linear reconstruction of the CMB signal from one-dimensional scans of observational maps. This G_p filter preserves the power spectrum of the CMB signal, in contrast to the Wiener filter, which diminishes the power spectrum of the reconstructed CMB signal. We demonstrate how peak statistics and a cluster analysis can be used to estimate the probability of the presence of a CMB signal in observational records. The efficiency of the G_p filter is demonstrated on a toy model of an observational record consisting of a CMB signal and noise in the form of foreground point sources.
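The abstract does not give the form of the power filter, but a standard power-preserving choice is the square root of the Wiener filter, and the defining property can be checked per Fourier mode. The spectra below are illustrative toy shapes assumed for this sketch, not CMB spectra.

```python
import numpy as np

# Illustrative toy spectra (not CMB data): a red signal spectrum P_s and
# a flat noise spectrum P_n, per Fourier mode k.
k = np.linspace(0.001, 1.0, 1000)
P_s = 1.0 / (1.0 + (k / 0.05) ** 2)
P_n = np.full_like(P_s, 0.02)

wiener = P_s / (P_s + P_n)        # classical Wiener filter
G_p = np.sqrt(P_s / (P_s + P_n))  # assumed power-preserving filter

# Expected power of the filtered data d = s + n, per mode:
out_wiener = wiener ** 2 * (P_s + P_n)  # = W * P_s, biased low at noisy modes
out_Gp = G_p ** 2 * (P_s + P_n)         # = P_s exactly

print(np.allclose(out_Gp, P_s))  # True: signal power spectrum is preserved
print(np.all(out_wiener < P_s))  # True: Wiener filter suppresses the power
```

Because |G_p|² (P_s + P_n) = P_s, the filtered record keeps the signal's power spectrum, at the cost of a larger mean-square reconstruction error than the Wiener filter.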


2001 ◽  
Vol 2001 (2) ◽  
pp. 1203-1207 ◽  
Author(s):  
Charlie Henry ◽  
Paulene O. Roberts

For a 6-month period between May and September 1999, the staff and crew of the National Oceanic and Atmospheric Administration (NOAA) research ship R/V Ferrel augmented their activities to support a baseline fluorometry study for NOAA's Office of Response and Restoration (OR&R). During the study period, scientific data were collected at more than 50 stations in the Atlantic Ocean and Gulf of Mexico. The study was designed, in part, to assess the Special Monitoring of Applied Response Technologies (SMART) program. SMART is a joint U.S. Coast Guard (USCG), U.S. Environmental Protection Agency (EPA), Centers for Disease Control and Prevention (CDC), and NOAA monitoring program that provides near real-time feedback to the Unified Command during dispersant applications to mitigate marine oil spills. Using fluorometers to accurately measure dispersed oil concentrations is not a trivial task. Detector response values vary because of oil composition and oil weathering; dispersed oil is not in true solution; and natural waters contribute matrix effects and background fluorescence. It was the latter two that were investigated. Because seawater is a complex mixture of dissolved chemicals, particulates, and living plants and animals, archiving samples for future analysis relative to background fluorescence is problematic. Each contributes to background fluorescence and matrix effects when dispersed oil is present in the sample. Since water samples change with storage, only near real-time analyses are valid; therefore, using an at-sea laboratory like the NOAA's Ferrel was essential to the study design. Results suggest that the adverse fluorometry effects observed for open-ocean environments were within typical quality-objective goals. Therefore, the range of seawater composition changes observed in this investigation had very little effect on the ability to detect dispersed oil and meet the SMART mission objectives.

