Detection Performance of Or-ing Device with Pre- and Post-Averaging: Part I - Random Signal

1999 ◽  
Author(s):  
Albert H. Nuttall

2020 ◽  
Vol 2020 (48) ◽  
pp. 17-24
Author(s):  
I.M. Javorskyj ◽  
R.M. Yuzefovych ◽  
P.R. Kurapov ◽  
...  

The correlation and spectral properties of a multicomponent narrowband periodically non-stationary random signal (PNRS) and of its Hilbert transform are considered. It is shown that a multicomponent narrowband PNRS differs from a monocomponent signal; the difference is caused by correlation between the quadratures of the different carrier harmonics. These features of the analytic signal must be taken into account when the Hilbert transform is used for the analysis of real time series.
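
The analytic-signal construction that the abstract refers to can be sketched numerically. The signal, frequencies, and sampling rate below are illustrative assumptions, not taken from the paper; SciPy's FFT-based `hilbert` supplies the Hilbert transform.

```python
import numpy as np
from scipy.signal import hilbert

# Sample a simple amplitude-modulated tone: a stand-in for one
# narrowband component of a periodically non-stationary signal.
fs = 1000.0                       # sampling rate, Hz (illustrative)
t = np.arange(0.0, 1.0, 1.0 / fs)
carrier = 7.0                     # carrier frequency, Hz (illustrative)
modulation = 1.0                  # modulation frequency, Hz (illustrative)
x = (1.0 + 0.5 * np.cos(2 * np.pi * modulation * t)) * np.cos(2 * np.pi * carrier * t)

# The analytic signal x_a = x + j*H{x} yields envelope and instantaneous phase.
x_a = hilbert(x)
envelope = np.abs(x_a)
phase = np.unwrap(np.angle(x_a))

# For a monocomponent narrowband signal the envelope tracks the modulation;
# with several carrier harmonics the quadratures of different harmonics
# correlate, and this simple reading of the envelope breaks down.
print(envelope.min(), envelope.max())
```

For this exactly periodic test signal the recovered envelope matches the modulation term `1 + 0.5*cos(2*pi*modulation*t)`; the multicomponent case discussed in the paper is precisely where such a clean separation fails.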


Author(s):  
Inna D. Ponomareva ◽  
Alexey V. Karpenko ◽  
Gennadiy V. Tsepkov

1999 ◽  
Vol 53 (9-10) ◽  
pp. 1-10
Author(s):  
V. A. Omel'chenko ◽  
V. V. Balabanov ◽  
B. M. Bezruk ◽  
Yu. N. Goloborod'ko

2020 ◽  
Vol 2020 (10) ◽  
pp. 310-1-310-7
Author(s):  
Khalid Omer ◽  
Luca Caucci ◽  
Meredith Kupinski

This work reports on convolutional neural network (CNN) performance on an image texture classification task as a function of linear image processing and the number of training images. Detection performance of single- and multi-layer CNNs (sCNN/mCNN) is compared to that of optimal observers. Performance is quantified by the area under the receiver operating characteristic (ROC) curve, the AUC: AUC = 1.0 corresponds to perfect detection and AUC = 0.5 to guessing. The Ideal Observer (IO) maximizes AUC but is prohibitive in practice because it depends on high-dimensional image likelihoods. IO performance is invariant to any full-rank, invertible linear image processing. This work demonstrates the existence of full-rank, invertible linear transforms that can degrade both sCNN and mCNN performance even in the limit of large quantities of training data. A subsequent invertible linear transform changes the images’ correlation structure again and can improve the AUC. Stationary textures sampled from zero-mean, unequal-covariance Gaussian distributions allow closed-form analytic expressions for the IO and for optimal linear compression. Linear compression is a mitigation technique for high-dimension, low-sample-size (HDLSS) applications. By definition, compression strictly decreases or maintains IO detection performance. For small quantities of training data, linear image compression prior to the sCNN architecture can increase the AUC from 0.56 to 0.93. Results indicate an optimal compression ratio for CNNs that depends on task difficulty, compression method, and the number of training images.
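
The AUC figure of merit used throughout the abstract can be illustrated with a minimal sketch. The two score distributions below are hypothetical stand-ins for an observer's outputs on the two texture classes, not the paper's data; the estimator is the standard Mann-Whitney form of the empirical AUC.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative decision-variable samples for the two classes
# (e.g., signal-absent vs. signal-present textures).
scores_absent = rng.normal(0.0, 1.0, 5000)
scores_present = rng.normal(1.0, 1.0, 5000)

def auc(neg, pos):
    """Empirical AUC: probability that a 'present' score exceeds an
    'absent' score (Mann-Whitney statistic, equal to the area under
    the ROC curve), with ties counted as half."""
    neg = np.asarray(neg)
    pos = np.asarray(pos)
    greater = (pos[:, None] > neg[None, :]).mean()
    ties = (pos[:, None] == neg[None, :]).mean()
    return greater + 0.5 * ties

# For two unit-variance Gaussians whose means differ by d, the analytic
# AUC is Phi(d / sqrt(2)); d = 1 gives roughly 0.76.
print(auc(scores_absent, scores_present))
```

A perfect observer (all "present" scores above all "absent" scores) returns 1.0, and identically distributed scores return about 0.5, matching the endpoints quoted in the abstract.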


2009 ◽  
Vol 2128 (1) ◽  
pp. 161-172
Author(s):  
Dan Middleton ◽  
Ryan Longmire ◽  
Darcy M. Bullock ◽  
James R. Sturdevant

2014 ◽  
Vol 35 (12) ◽  
pp. 2795-2801
Author(s):  
Jun You ◽  
Xian-rong Wan ◽  
Zi-ping Gong ◽  
Feng Cheng ◽  
Heng-yu Ke

1985 ◽  
Vol 50 (11) ◽  
pp. 2545-2557
Author(s):  
Pavel Hasal ◽  
Vladimír Kudrna ◽  
Jitka Vyhlídková

The paper presents a theoretical analysis of a continuous-flow mixer with the so-called gamma distribution of fluid residence times, used as a linear filter smoothing undesirable fluctuations of input properties. A relation is derived expressing the degree of smoothing of the signal passing through the system as a function of the statistical parameters of this signal and of the gamma distribution of fluid residence times in the mixer. Analysis of this relation leads to conclusions concerning the prediction of the operation of smoothing mixers and the design of their basic parameters.
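
The filtering view in the abstract can be sketched by discretizing the mixer as a linear filter whose impulse response is the gamma residence-time density. The gamma parameters, time grid, and input signal below are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.stats import gamma as gamma_dist

# Discretized sketch: the mixer as a linear filter whose impulse
# response is the gamma residence-time density.
dt = 0.01
t = np.arange(0.0, 20.0, dt)
shape = 3.0                  # illustrative gamma shape parameter
mean_residence = 2.0         # illustrative mean residence time
h = gamma_dist.pdf(t, a=shape, scale=mean_residence / shape)
h *= dt                      # discrete filter weights; sum(h) ~= 1 (unit gain)

rng = np.random.default_rng(1)
n = 20000
x = 5.0 + rng.normal(0.0, 1.0, n)   # inlet property with random fluctuations
y = np.convolve(x, h)[:n]           # smoothed outlet property

# The mixer passes the mean value but attenuates the fluctuations;
# the output-to-input variance ratio quantifies the degree of smoothing.
# (Skip the start-up transient while the filter fills.)
print(np.var(x[2000:]), np.var(y[2000:]))
```

For an uncorrelated input, the variance reduction is governed by the sum of squared filter weights, which is how a relation of the kind derived in the paper links the smoothing degree to the gamma-distribution parameters.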

