Evaluation of Cosmic Ray Rejection Algorithms on Single-Shot Exposures

2005 ◽  
Vol 22 (3) ◽  
pp. 249-256 ◽  
Author(s):  
Catherine L. Farage ◽  
Kevin A. Pimbblet

Abstract. To maximise data output from single-shot astronomical images, the rejection of cosmic rays is important. We present the results of a benchmark trial comparing various cosmic ray rejection algorithms. The trial assesses the relative performance and characteristics of each process in cosmic ray detection, its rate of false detections of real objects, and the quality of image cleaning and reconstruction. The cosmic ray rejection algorithms developed by Rhoads (2000, PASP, 112, 703), van Dokkum (2001, PASP, 113, 1420), and Pych (2004, PASP, 116, 148), and the IRAF task XZAP by Dickinson, are tested using both simulated and real data. We find that detection efficiency is independent of the density of cosmic rays in an image and is more strongly affected by the density of real objects in the field. As expected, spurious detections and alterations to real data during cleaning also increase significantly at high object densities. Rhoads' linear filtering method gives the best performance in detecting cosmic ray events; however, the popular van Dokkum algorithm exhibits the best overall performance in terms of detection and cleaning combined.
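The benchmark quantities named in the abstract, detection efficiency and the rate of false detections on real objects, reduce to simple counts against a known cosmic-ray mask. A minimal sketch; the function and mask names are illustrative, not the paper's actual procedure:

```python
import numpy as np

def rejection_scores(detected: np.ndarray, true_cr: np.ndarray, objects: np.ndarray):
    """Score a cosmic-ray mask against ground truth.

    detected : boolean mask returned by a rejection algorithm
    true_cr  : boolean mask of pixels actually hit by cosmic rays
    objects  : boolean mask of pixels belonging to real astronomical objects
    """
    hits = np.logical_and(detected, true_cr).sum()
    efficiency = hits / true_cr.sum()                   # fraction of CR pixels found
    spurious = np.logical_and(detected, objects).sum()  # real-object pixels flagged
    false_rate = spurious / objects.sum()
    return efficiency, false_rate

# toy example: a 4-pixel image strip
detected = np.array([True, True, False, False])
true_cr  = np.array([True, False, False, False])
objects  = np.array([False, True, True, False])
eff, fr = rejection_scores(detected, true_cr, objects)
```

With simulated data, the true cosmic-ray mask is known exactly, which is why the paper can measure both quantities directly.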

2021 ◽  
Author(s):  
Kseniia Golubenko ◽  
Eugene Rozanov ◽  
Gennady Kovaltsov ◽  
Ari-Pekka Leppänen ◽  
Timofei Sukhodolov ◽  
...  

Abstract. The short-lived cosmogenic isotope 7Be, produced by cosmic rays in the atmosphere, is often used as a tracer of atmospheric dynamics. Previously, modelling of beryllium transport in the atmosphere relied on simplified box models or air-mass back-tracing codes. Although the ability of full atmospheric-dynamics models to trace beryllium was demonstrated earlier, no ready-to-use model of this kind has been available. Here we present the chemistry-climate model SOCOL-AERv2-BEv1, which traces isotopes of beryllium in the atmosphere. The SOCOL (SOlar Climate Ozone Links) model has been extended with modules for the production, transport, and deposition of beryllium. Production was modelled for both galactic and solar cosmic rays using the CRAC (Cosmic-Ray induced Atmospheric Cascade) model, and the radioactive decay of 7Be was treated explicitly. Beryllium transport was modelled without additional gravitational settling, since the background aerosol particles are small. An interactive deposition scheme including both wet and dry deposition was applied. The simulation, fully nudged to meteorological fields, covered the period 2003–2008 after a spin-up over 1996–2002. Modelled 7Be concentrations in near-ground air were compared with weekly measurements at four nearly antipodal high-latitude sites, two in the Northern Hemisphere (Finland and Canada) and two in the Southern Hemisphere (Chile and Kerguelen Island). The modelled absolute levels agree with the measurements within error bars, implying that production, decay, and deposition are correctly reproduced by the model. The model also reproduces the temporal variability of 7Be concentrations on annual and sub-annual scales, including the annual cycle that dominates the Northern Hemisphere data. We also modelled the production and transport of 7Be during the major solar energetic-particle event of 20 January 2005. In summary, a new full 3D time-dependent model of beryllium atmospheric production, transport, and deposition, based on SOCOL-AERv2, has been developed; comparison with measured 7Be concentrations in near-ground air validates the model.
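The explicit radioactive decay mentioned above is a plain exponential with the laboratory half-life of 7Be, about 53.2 days. A minimal sketch of this term, not the SOCOL implementation:

```python
import math

BE7_HALF_LIFE_DAYS = 53.22                        # laboratory half-life of 7Be
DECAY_CONST = math.log(2) / BE7_HALF_LIFE_DAYS    # decay constant, per day

def decay(n0: float, days: float) -> float:
    """Fraction of a 7Be tracer parcel surviving after `days` of transport."""
    return n0 * math.exp(-DECAY_CONST * days)

# after one half-life, half of the tracer remains
remaining = decay(1.0, BE7_HALF_LIFE_DAYS)
```

In a transport model this term is applied to the tracer field at every time step alongside advection and deposition.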


2020 ◽  
Vol 634 ◽  
pp. A48
Author(s):  
M. Paillassa ◽  
E. Bertin ◽  
H. Bouy

In this work, we propose two convolutional neural network classifiers for detecting contaminants in astronomical images. Once trained, our classifiers can identify various contaminants, such as cosmic rays, hot and bad pixels, persistence effects, satellite or plane trails, residual fringe patterns, nebulous features, saturated pixels, diffraction spikes, and tracking errors. They cover a broad range of ambient conditions, such as seeing, image sampling, detector type, optics, and stellar density. The first classifier, MAXIMASK, performs semantic segmentation and generates a bad pixel map for each contaminant, based on the probability that each pixel belongs to a given contaminant class. The second classifier, MAXITRACK, classifies entire images and mosaics by computing the probability that the focal plane is affected by tracking errors. We gathered training and testing data from real observations taken with various modern charge-coupled devices and near-infrared cameras, augmented with image simulations. We quantify the performance of both classifiers and show that MAXIMASK achieves state-of-the-art performance in identifying cosmic ray hits. Thanks to a built-in Bayesian update mechanism, both classifiers can be tuned to meet specific science goals in various observational contexts.
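The built-in Bayesian update the authors mention can be pictured as a standard prior-shift correction of a classifier's output probability. The formula below is an assumption about the general idea, not MaxiMask's actual mechanism:

```python
def reweight_posterior(p: float, train_prior: float, new_prior: float) -> float:
    """Adjust a classifier posterior `p`, obtained under class prior
    `train_prior`, to reflect a different expected contaminant prior
    `new_prior` (standard Bayes prior-shift correction)."""
    num = p * new_prior / train_prior
    den = num + (1.0 - p) * (1.0 - new_prior) / (1.0 - train_prior)
    return num / den

# an unchanged prior leaves the posterior untouched
same = reweight_posterior(0.7, train_prior=0.5, new_prior=0.5)
# declaring the contaminant class rarer pulls the posterior down
rarer = reweight_posterior(0.7, train_prior=0.5, new_prior=0.1)
```

This is how a single trained model can be "tuned" to different observational contexts: the network is unchanged, and only the assumed class frequencies are updated.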


2019 ◽  
Vol 73 (9) ◽  
pp. 1019-1027 ◽  
Author(s):  
Kyle Uckert ◽  
Rohit Bhartia ◽  
John Michel

Cosmic rays can degrade Raman hyperspectral images by introducing high-intensity noise into spectra, obfuscating the results of downstream analyses. We describe a novel method for detecting cosmic rays in deep-ultraviolet Raman hyperspectral data sets, adapted from existing cosmic ray removal methods for astronomical images. The method identifies cosmic rays as outliers in the distribution of intensity values within each wavelength channel. In some cases the algorithm fails to identify cosmic rays, namely in data sets with high inter-spectral variance, uncorrected baseline drift, or few spectra. For spatially uncorrelated hyperspectral data sets, however, it identifies cosmic rays more effectively than other cosmic ray rejection methods, and it could be employed in commercial and robotic Raman systems to identify cosmic rays semi-autonomously.
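The core idea, flagging cosmic rays as outliers in the per-channel intensity distribution, can be sketched with a robust median/MAD threshold. The threshold value and function name are illustrative, not the authors' exact algorithm:

```python
import numpy as np

def flag_cosmic_rays(cube: np.ndarray, k: float = 8.0) -> np.ndarray:
    """Flag intensity outliers per wavelength channel.

    cube : (n_spectra, n_channels) array of intensities
    k    : threshold in robust (MAD-based) sigma units
    Returns a boolean mask of suspected cosmic-ray hits.
    """
    med = np.median(cube, axis=0)                  # per-channel median
    mad = np.median(np.abs(cube - med), axis=0)    # per-channel MAD
    sigma = 1.4826 * mad + 1e-12                   # robust sigma; avoid /0
    return (cube - med) / sigma > k                # one-sided: spikes only

# five flat spectra with one injected spike
cube = np.ones((5, 4))
cube[2, 1] = 100.0
mask = flag_cosmic_rays(cube)
```

Because the statistics are computed across spectra rather than along each spectrum, the method assumes spatially uncorrelated data, matching the failure modes (high inter-spectral variance, few spectra) noted in the abstract.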


1996 ◽  
Vol 33 (9) ◽  
pp. 101-108 ◽  
Author(s):  
Agnès Saget ◽  
Ghassan Chebbo ◽  
Jean-Luc Bertrand-Krajewski

The first flush phenomenon in urban wet weather discharges is currently a controversial subject. Scientists disagree about whether it is real and about its influence on the sizing of treatment works. These disagreements mainly result from the lack of a clear definition of the phenomenon. The objective of this article is first to provide a simple and clear definition of the first flush, and then to apply it to real data to determine how frequently the phenomenon occurs. The data come from the French database on the quality of urban wet weather discharges: 80 events from 7 basins with separate sewers and 117 events from 7 basins with combined sewers. The main result is that the first flush phenomenon is very rare, in any case too rare to serve as the basis of a treatment strategy against pollution generated by urban wet weather discharges.
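A simple quantitative definition of the kind the authors call for compares the cumulative pollutant-mass curve with the cumulative runoff-volume curve over an event. The 30%/80% thresholds below are illustrative, not necessarily the paper's:

```python
import numpy as np

def is_first_flush(flow, conc, vol_frac=0.3, mass_frac=0.8) -> bool:
    """Check a first-flush criterion on a single rain event.

    flow : runoff volumes per time step
    conc : pollutant concentrations per time step
    Returns True if at least `mass_frac` of the total pollutant mass
    is carried by the first `vol_frac` of the total runoff volume.
    """
    flow = np.asarray(flow, dtype=float)
    mass = flow * np.asarray(conc, dtype=float)
    cum_v = np.cumsum(flow) / flow.sum()
    cum_m = np.cumsum(mass) / mass.sum()
    # mass fraction delivered once vol_frac of the volume has passed
    delivered = np.interp(vol_frac, cum_v, cum_m)
    return bool(delivered >= mass_frac)

# strong first flush: nearly all mass arrives in the first step
strong = is_first_flush([1, 1, 1, 1], [100, 1, 1, 1])
# uniform event: no first flush
uniform = is_first_flush([1, 1, 1, 1], [10, 10, 10, 10])
```

Applying such a criterion event by event across a database is what yields an appearance frequency for the phenomenon.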


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Linghui Liang ◽  
Feng Qi ◽  
Yifei Cheng ◽  
Lei Zhang ◽  
Dongliang Cao ◽  
...  

Abstract. To analyze the clinical characteristics of patients with negative biparametric magnetic resonance imaging (bpMRI) who did not need prostate biopsies (PBs), we studied 1,012 male patients who underwent PBs at the First Affiliated Hospital of Nanjing Medical University from March 2018 to November 2019, of whom 225 had a negative prebiopsy bpMRI (defined as a Prostate Imaging Reporting and Data System (PI-RADS 2.1) score of less than 3). The detection rate of clinically significant prostate cancer (CSPCa, defined as Gleason score > 6) was assessed according to age, digital rectal examination (DRE), prostate volume (PV) on bpMRI, prostate-specific antigen (PSA), and PSA density (PSAD). Univariate and multivariable logistic regression analyses were used to identify predictors of absent CSPCa on PB, where absent CSPCa comprises clinically insignificant prostate cancer (CIPCa) and benign results. The detection rates of prostate cancer (PCa) and CSPCa were 27.11% and 16.44%, respectively. Compared with the CIPCa and benign groups, patients diagnosed with CSPCa were older (P < 0.001) and had suspicious DRE (P < 0.001), smaller PV (P < 0.001), higher PSA (P = 0.008), and higher PSAD (P < 0.001). PSAD < 0.15 ng/ml/cm3 (P = 0.004) and suspicious DRE (P < 0.001) were independent predictors of absent CSPCa on PB. The negative predictive value of bpMRI for detection of CSPCa on PB increased with decreasing PSAD, mainly in biopsy-naive patients (P < 0.001) but not in patients with a prior negative PB. 25.33% of the men combined negative bpMRI, PSAD < 0.15 ng/ml/cm3, and biopsy-naive status, and none of them had CSPCa on PB. The incidence of CSPCa was 1.59%, 0%, and 16.67%, respectively, in patients with negative bpMRI and PSAD < 0.15 ng/ml/cm3; in biopsy-naive patients with negative bpMRI and PSAD < 0.15 ng/ml/cm3; and in patients with negative bpMRI, PSAD < 0.15 ng/ml/cm3, and a prior negative PB. 
We found that some patients with negative bpMRI, younger age, no suspicious DRE, and PSAD < 0.15 ng/ml/cm3 may safely avoid PB. Conversely, PB should be considered despite a negative bpMRI in patients with greater age, clearly suspicious DRE, markedly increased PSA, a markedly small PV on MRI, and PSAD > 0.15 ng/ml/cm3.
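The negative-predictive-value figures in the abstract come from simple counts over biopsy outcomes. A sketch with purely illustrative numbers, not the study's data:

```python
def negative_predictive_value(true_negative: int, false_negative: int) -> float:
    """NPV: fraction of negative tests (here, negative bpMRI) in which
    clinically significant cancer is truly absent on biopsy."""
    return true_negative / (true_negative + false_negative)

# illustrative counts only: 98 negative bpMRIs confirmed benign/insignificant
# on biopsy, 2 that nonetheless harboured significant cancer
npv = negative_predictive_value(true_negative=98, false_negative=2)
```

Stratifying such counts by PSAD and biopsy history is what produces the subgroup incidences (1.59%, 0%, 16.67%) reported above.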


2021 ◽  
Vol 15 (4) ◽  
pp. 1-20
Author(s):  
Georg Steinbuss ◽  
Klemens Böhm

Benchmarking unsupervised outlier detection is difficult. Outliers are rare, and existing benchmark data contains outliers with varied and unknown characteristics. Fully synthetic data usually consists of outliers and regular instances with clear characteristics and thus, in principle, allows for a more meaningful evaluation of detection methods. Nonetheless, there have been only a few attempts to include synthetic data in benchmarks for outlier detection. This might be due to the imprecise notion of outliers or to the difficulty of arriving at a good coverage of different domains with synthetic data. In this work, we propose a generic process for generating datasets for such benchmarking. The core idea is to reconstruct regular instances from existing real-world benchmark data while generating outliers so that they exhibit insightful characteristics. We describe three instantiations of this generic process that generate outliers with specific characteristics, such as local outliers. To validate the process, we perform a benchmark with state-of-the-art detection methods and carry out experiments to study the quality of the data reconstructed in this way. Besides showcasing the workflow, this confirms the usefulness of the proposed process; in particular, the regular instances it yields are close to those from real data. Summing up, we propose and validate a new and practical process for benchmarking unsupervised outlier detection.
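The core idea, resampling regular instances from a fit to real data while generating outliers with controlled characteristics, can be sketched for Gaussian-distributed data. This illustrates the general approach only, not the authors' exact instantiations:

```python
import numpy as np

def make_benchmark(real: np.ndarray, n_outliers: int, scale: float = 3.0, seed: int = 0):
    """Build a synthetic outlier-detection benchmark from real data.

    Regular instances are resampled from a Gaussian fitted to `real`;
    outliers are drawn the same way but pushed away from the mean by
    `scale`, giving them a known, controllable character.
    """
    rng = np.random.default_rng(seed)
    mu = real.mean(axis=0)
    cov = np.cov(real, rowvar=False)
    regular = rng.multivariate_normal(mu, cov, size=len(real))
    base = rng.multivariate_normal(mu, cov, size=n_outliers)
    outliers = mu + scale * (base - mu)   # inflate the deviation from the mean
    return regular, outliers

real = np.random.default_rng(1).normal(size=(200, 2))
regular, outliers = make_benchmark(real, n_outliers=10)
```

Because both the regular-instance distribution and the outlier-generation rule are known, detection methods can be scored against unambiguous ground truth, which is exactly what real benchmark data lacks.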


1970 ◽  
Vol 39 ◽  
pp. 168-183
Author(s):  
E. N. Parker

The topic of this presentation is the origin and dynamical behavior of the magnetic field and cosmic-ray gas in the disk of the Galaxy. In the space available I can do no more than mention the ideas that have been developed, with but little explanation and discussion. To make up for this inadequacy I have tried to give a complete list of references in the written text, so that the interested reader can pursue the points in depth (in particular see the review articles Parker, 1968a, 1969a, 1970). My purpose here is twofold: to outline for you the calculations and ideas that have developed thus far, and to indicate the uncertainties that remain. The basic ideas are sound, I think, but when we come to the details there are many theoretical alternatives that have yet to be explored and much that is not yet made clear by observations.


2019 ◽  
Vol 5 (9) ◽  
pp. eaax3793 ◽  
Author(s):  
Q. An ◽  
R. Asfandiyarov ◽  
P. Azzarello ◽  
P. Bernardini ◽  
...  

The precise measurement of the spectrum of protons, the most abundant component of the cosmic radiation, is necessary to understand the sources and acceleration of cosmic rays in the Milky Way. This work reports the measurement of the cosmic ray proton flux at kinetic energies from 40 GeV to 100 TeV, using two and a half years of data recorded by the DArk Matter Particle Explorer (DAMPE). This is the first time an experiment has directly measured cosmic ray protons up to ~100 TeV with high statistics. The measured spectrum confirms the spectral hardening at ~300 GeV found by previous experiments and reveals a softening at ~13.6 TeV, with the spectral index changing from ~2.60 to ~2.85. Our result suggests the existence of a new spectral feature of cosmic rays at energies below the so-called knee and sheds new light on the origin of Galactic cosmic rays.
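A spectrum with a break like the reported softening is commonly modelled as a smoothly broken power law. The parameterisation below is a generic sketch using the reported break energy and indices, not DAMPE's actual fit function:

```python
def broken_power_law(E, phi0, E_b, g1, g2, s=5.0):
    """Smoothly broken power law flux model.

    E    : energy (same units as E_b; GeV here)
    phi0 : normalisation
    E_b  : break energy
    g1   : spectral index below the break
    g2   : spectral index above the break
    s    : smoothness of the break (larger = sharper transition)
    """
    return phi0 * E ** (-g1) * (1.0 + (E / E_b) ** s) ** ((g1 - g2) / s)

# illustrative evaluation around the ~13.6 TeV softening,
# with the reported indices ~2.60 -> ~2.85 (energies in GeV)
f1 = broken_power_law(1.0e6, 1.0, 13.6e3, 2.60, 2.85)
f2 = broken_power_law(2.0e6, 1.0, 13.6e3, 2.60, 2.85)
```

Well above the break the flux ratio between doubled energies approaches 2^(-g2), i.e. the local spectral index tends to the high-energy value 2.85.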


2004 ◽  
Vol 218 ◽  
pp. 57-64
Author(s):  
Jacco Vink

The two main aspects of supernova remnant research addressed in this review are: I. What is our understanding of the progenitors of the observed remnants, and what have we learned from these remnants about supernova nucleosynthesis? II. Supernova remnants are probably the major source of cosmic rays. What are the recent advances in the observational aspects of cosmic ray acceleration in supernova remnants?


2019 ◽  
Vol 210 ◽  
pp. 02001
Author(s):  
Sergey Ostapchenko

The differences between contemporary Monte Carlo generators of high energy hadronic interactions are discussed and their impact on the interpretation of experimental data on ultra-high energy cosmic rays (UHECRs) is studied. Key directions for further model improvements are outlined. The prospect for a coherent interpretation of the data in terms of the UHECR composition is investigated.

