Image-quality degradation and retrieval errors introduced by registration and interpolation of multispectral digital images

1996
Author(s):
Bradley G. Henderson
Christoph C. Borel
James P. Theiler
Barham W. Smith
2020
Vol 30 (1)
pp. 240-257
Author(s):
Akula Suneetha
E. Srinivasa Reddy

Abstract In the data collection phase, digital images are captured using sensors that are often contaminated by noise (an undesired random signal). Enhancing image quality and reducing noise is a central task in digital image processing, and effective denoising should preserve image edges while smoothing flat regions. Several adaptive filters (median filter, Gaussian filter, fuzzy filter, etc.) have been used to improve the smoothness of digital images, but these filters fail to preserve image edges while removing noise. In this paper, a modified fuzzy set filter is proposed to eliminate noise and restore the digital image. In a conventional fuzzy set filter, sixteen fuzzy rules are generated to find the noisy pixels in the digital image. In the modified fuzzy set filter, a set of twenty-four fuzzy rules is generated using four additional pixel locations for determining the noisy pixels. The additional eight fuzzy rules ease the process of deciding whether an image pixel requires averaging or not. The input digital images were collected from an underwater fish photography dataset. The efficiency of the modified fuzzy set filter was evaluated under varying degrees of Gaussian noise (0.01, 0.03, and 0.1 noise levels). For performance evaluation, Structural Similarity (SSIM), Mean Structural Similarity (MSSIM), Mean Square Error (MSE), Normalized Mean Square Error (NMSE), Universal Image Quality Index (UIQI), Peak Signal-to-Noise Ratio (PSNR), and Visual Information Fidelity (VIF) were used. The experimental results showed that the modified fuzzy set filter improved PSNR by up to 2-3 dB, MSSIM by up to 0.03-0.12, and NMSE by up to 0.1-0.38 compared with traditional filtering techniques.
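The evaluation pipeline described above (inject Gaussian noise, filter, score with MSE/PSNR) can be sketched as follows. This is an illustrative stand-in, not the paper's filter: it uses a plain 3x3 median filter, a synthetic ramp image, and an assumed noise variance of 0.01 on a [0, 1] intensity scale.

```python
# Illustrative denoising-evaluation sketch (not the modified fuzzy set
# filter): add Gaussian noise, apply a 3x3 median filter, score with
# MSE and PSNR as in the paper's evaluation protocol.
import numpy as np

def add_gaussian_noise(img, var, rng):
    """Add zero-mean Gaussian noise of the given variance (img in [0, 1])."""
    noisy = img + rng.normal(0.0, np.sqrt(var), img.shape)
    return np.clip(noisy, 0.0, 1.0)

def median_filter_3x3(img):
    """Plain 3x3 median filter via edge padding (no SciPy dependency)."""
    padded = np.pad(img, 1, mode="edge")
    stack = [padded[di:di + img.shape[0], dj:dj + img.shape[1]]
             for di in range(3) for dj in range(3)]
    return np.median(np.stack(stack), axis=0)

def mse(a, b):
    """Mean squared error between two images."""
    return float(np.mean((a - b) ** 2))

def psnr(a, b, peak=1.0):
    """Peak signal-to-noise ratio in dB."""
    return 10.0 * np.log10(peak ** 2 / mse(a, b))

rng = np.random.default_rng(0)
clean = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))   # smooth ramp image
noisy = add_gaussian_noise(clean, var=0.01, rng=rng)  # 0.01 noise level
denoised = median_filter_3x3(noisy)
print(f"noisy PSNR:    {psnr(clean, noisy):.2f} dB")
print(f"denoised PSNR: {psnr(clean, denoised):.2f} dB")
```

On this smooth test image the median filter raises PSNR by several dB; the paper's point is that such filters pay for this smoothing with edge blur, which its fuzzy-rule pixel classification is designed to avoid.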


2021
Vol 7 (7)
pp. 112
Author(s):
Domonkos Varga

The goal of no-reference image quality assessment (NR-IQA) is to evaluate the perceptual quality of digital images without access to their distortion-free, pristine counterparts. NR-IQA is an important part of multimedia signal processing, since digital images can undergo a wide variety of distortions during storage, compression, and transmission. In this paper, we propose a novel architecture that extracts deep features from the input image at multiple scales to improve the effectiveness of feature extraction for NR-IQA using convolutional neural networks. Specifically, the proposed method extracts deep activations for local patches at multiple scales and maps them onto perceptual quality scores with the help of trained Gaussian process regressors. Extensive experiments demonstrate that the introduced algorithm performs favorably against state-of-the-art methods on three large benchmark datasets with authentic distortions (LIVE In the Wild, KonIQ-10k, and SPAQ).
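The multi-scale patch extraction step can be illustrated on its own, without the CNN or the Gaussian process regressor. The sketch below is an assumption about the general idea only (patch size, stride, scale factors, and the average-pool downsampler are all made up for the example), not the paper's architecture.

```python
# Hedged sketch of multi-scale local-patch extraction: build downsampled
# copies of the image and slide a fixed-size window over each. The real
# NR-IQA method would feed such patches to a CNN and regress quality
# scores; here we stop at patch extraction.
import numpy as np

def downsample(img, factor):
    """Naive average-pool downsampling by an integer factor."""
    h = img.shape[0] // factor * factor
    w = img.shape[1] // factor * factor
    img = img[:h, :w]
    return img.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def extract_patches(img, size, stride):
    """Slide a size x size window over the image with the given stride."""
    patches = []
    for i in range(0, img.shape[0] - size + 1, stride):
        for j in range(0, img.shape[1] - size + 1, stride):
            patches.append(img[i:i + size, j:j + size])
    return np.stack(patches)

def multiscale_patches(img, size=8, stride=8, factors=(1, 2, 4)):
    """Patches of the image at each scale, keyed by downsampling factor."""
    return {f: extract_patches(downsample(img, f), size, stride) for f in factors}

img = np.random.default_rng(1).random((64, 64))
scales = multiscale_patches(img)
for f, p in scales.items():
    print(f"factor {f}: {p.shape[0]} patches of shape {p.shape[1:]}")
```

Because each scale sees the same window size on a smaller image, coarser scales capture progressively larger spatial context per patch, which is the motivation for multi-scale feature extraction.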


2019
Author(s):
Sabrina Asteriti
Valeria Ricci
Lorenzo Cangiano

Abstract Tissue clearing techniques are undergoing a renaissance motivated by the need to image fluorescence deep in biological samples without physical sectioning. Optical transparency is achieved by equilibrating tissues with high refractive index (RI) solutions, which require expensive optimized objectives to avoid aberrations. One may thus need to assess whether an available objective is suitable for a specific clearing solution, or the impact on imaging of small mismatches between cleared sample and objective design RIs. We derived closed form approximations for image quality degradation versus RI mismatch and other parameters available to the microscopist. We validated them with computed (and experimentally confirmed) aberrated point spread functions, and by imaging fluorescent neurons in high RI solutions. Crucially, we propose two simple numerical criteria to establish: (i) the degradation in image quality (brightness and resolution) from optimal conditions of any clearing solution/objective combination; (ii) which objective, among several, achieves the highest resolution in a given immersion medium. These criteria apply directly to the widefield fluorescent microscope but are also closely relevant to more advanced microscopes.
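The paper's own closed-form criteria are not reproduced here, but a standard textbook tool in the same spirit, the Maréchal approximation, shows how an aberration budget (such as one induced by an RI mismatch) maps onto image quality. The wavelength and RMS error values below are assumptions for illustration only.

```python
# Not the paper's derivation: the classical Marechal approximation,
# which relates RMS wavefront error sigma to the Strehl ratio
# S ~= exp(-(2*pi*sigma/lambda)^2), a common brightness/resolution
# figure of merit for small aberrations.
import math

def strehl_marechal(rms_wavefront_error, wavelength):
    """Strehl ratio from RMS wavefront error (same units as wavelength)."""
    phase_rms = 2.0 * math.pi * rms_wavefront_error / wavelength
    return math.exp(-phase_rms ** 2)

# The classical "diffraction limited" criterion sigma = lambda/14
# corresponds to a Strehl ratio of about 0.8.
wavelength_nm = 520.0  # assumed green fluorescence emission
for sigma_nm in (wavelength_nm / 28, wavelength_nm / 14, wavelength_nm / 7):
    s = strehl_marechal(sigma_nm, wavelength_nm)
    print(f"RMS wavefront error {sigma_nm:5.1f} nm -> Strehl ratio {s:.2f}")
```

This rule of thumb illustrates why even modest RI mismatches matter at depth: the induced wavefront error grows with imaging depth, so the Strehl ratio degrades rapidly once the error approaches a quarter wave.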


2018
Vol 57 (11)
pp. 2851
Author(s):
Jueqin Qiu
Haisong Xu
Zhengnan Ye
Changyu Diao

Author(s):
Giulio Fanti
Roberto Basso

The problem of exposure-time optimization for digital images acquired by a vibrating tripod-camera system is examined in this paper, and an initial analysis is presented. The different noise sources, concerning both the camera's acquisition sensor and external vibrations, were studied and quantified in specific cases. Digital image quality is then discussed in terms of the modulation transfer function (MTF) evaluated at the 50% level in order to define the optimum ranges of exposure times.
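The MTF-at-50% criterion mentioned above can be computed numerically from a line-spread function (LSF). The sketch below uses a Gaussian LSF as a stand-in for the combined sensor and vibration blur; the blur width, sample spacing, and units are assumptions, not the paper's measurements.

```python
# Illustrative MTF50 computation: take the FFT magnitude of a line-spread
# function as the system MTF and locate the spatial frequency where it
# falls to 50% of its zero-frequency value.
import numpy as np

def mtf_from_lsf(lsf, dx):
    """Normalized MTF (|FFT| of the LSF) and its frequency axis."""
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]                      # normalize so MTF(0) = 1
    freqs = np.fft.rfftfreq(len(lsf), d=dx)
    return freqs, mtf

def mtf50(freqs, mtf):
    """First frequency where the MTF crosses 0.5 (linear interpolation)."""
    below = np.argmax(mtf < 0.5)       # index of first sample under 0.5
    f0, f1 = freqs[below - 1], freqs[below]
    m0, m1 = mtf[below - 1], mtf[below]
    return f0 + (0.5 - m0) * (f1 - f0) / (m1 - m0)

dx = 0.01                              # sample spacing, assumed mm
x = np.arange(-5.0, 5.0, dx)
sigma = 0.05                           # blur width in mm, assumed
lsf = np.exp(-x**2 / (2 * sigma**2))   # Gaussian stand-in for total blur
freqs, mtf = mtf_from_lsf(lsf, dx)
print(f"MTF50 = {mtf50(freqs, mtf):.2f} cycles/mm")
```

A longer exposure averages more vibration displacement into the LSF, widening it and pushing the MTF50 down, which is the trade-off the paper optimizes against sensor noise at short exposures.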


2000
Vol 29 (1)
Author(s):
Robin Dale

The goal of this project report, sponsored by The National Endowment for the Humanities, Division of Preservation and Access, is “to offer some guidance to libraries, archives, and museums in their efforts to convert photographic collections to digital form.” To date, there are no standards for measuring the quality of digital images created from photographs. Therefore, this report is primarily concerned with developing tools to measure image quality. Other technical and managerial issues related to digital imaging projects in general are also addressed.


2000
Vol 30 (12)
pp. 1999-2005
Author(s):
Sylvia R Englund
Joseph J O'Brien
David B Clark

This study presents the results of a comparison of digital and film hemispherical photography as means of characterizing forest light environments and canopy openness. We also compared hemispherical photography to spherical densiometry. Our results showed that the loss of resolution that occurred when digital images were processed for computer analysis did not affect estimates of unweighted openness. Weighted openness and total site factor estimates were significantly higher in digital images than in film photos. The differences between the two techniques might be a result of underexposure of the film images or of differences in lens optical quality and field of view. We found that densiometer measurements increased significantly in consistency with user practice and were correlated with total site factor and weighted-openness estimates derived from hemispherical photography. Digital photography was effective and both more convenient and less expensive than film photography, but until the differences we observed are better explained, we recommend caution when comparisons are made between the two techniques. We also concluded that spherical densiometers effectively characterize forest light environments.
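The unweighted-openness measure discussed above can be sketched as a simple pixel count: threshold the hemispherical image into sky versus canopy and report the sky fraction inside the circular image area. The threshold value and the synthetic test image are assumptions, not the study's processing pipeline.

```python
# Simple sketch of unweighted canopy openness from a thresholded
# hemispherical photograph: sky fraction inside the circular field.
import numpy as np

def unweighted_openness(img, threshold=0.5):
    """Fraction of sky pixels (brightness > threshold) inside the circle."""
    h, w = img.shape
    yy, xx = np.mgrid[:h, :w]
    cy, cx, r = (h - 1) / 2, (w - 1) / 2, min(h, w) / 2
    inside = (yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2
    sky = img > threshold
    return float((sky & inside).sum() / inside.sum())

# Synthetic image: bright "sky" everywhere except a dark canopy quadrant.
img = np.full((100, 100), 0.9)
img[:50, :50] = 0.1                    # canopy occupies one quadrant
print(f"openness = {unweighted_openness(img):.2f}")
```

This count is "unweighted" because every pixel contributes equally; the weighted openness and total site factor the study also reports additionally weight sky regions by zenith angle and solar track, which is where the digital/film differences appeared.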


1983
Vol 27 (1)
pp. 41-45
Author(s):
Robert J. Beaton

Fourteen image quality metrics were evaluated for hard-copy and soft-copy displays of digital images degraded by various levels of noise and blur. All quality metrics were formulated to include the displayed modulation spectrum of the image. The statistical analyses suggested that several of the metrics correlated strongly with performance and thus support the proposed utility of image-dependent quality measures. An MTFA-type metric was shown to correlate most strongly with the average performance scores across noise and blur conditions.
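An MTFA-style metric of the kind the study found most predictive can be sketched as the area between the system MTF and a visual detection-threshold curve, accumulated where the MTF exceeds the threshold. The example curves below are assumptions for illustration, not the study's display data.

```python
# Hedged sketch of an MTFA-type (modulation transfer function area)
# metric: integrate the excess of the MTF over a detection-threshold
# curve. A sharper system keeps its MTF above threshold over a wider
# frequency band and so scores a larger area.
import numpy as np

def mtfa(freqs, mtf, threshold):
    """Trapezoidal area of max(MTF - threshold, 0) over frequency."""
    excess = np.clip(mtf - threshold, 0.0, None)
    return float(np.sum((excess[1:] + excess[:-1]) * 0.5 * np.diff(freqs)))

freqs = np.linspace(0.0, 60.0, 601)    # cycles/degree, assumed
sharp = np.exp(-freqs / 20.0)          # less blurred system MTF (toy)
blurry = np.exp(-freqs / 8.0)          # more blurred system MTF (toy)
threshold = 0.02 + 0.004 * freqs       # toy detection-threshold curve
print(f"MTFA (sharp):  {mtfa(freqs, sharp, threshold):.2f}")
print(f"MTFA (blurry): {mtfa(freqs, blurry, threshold):.2f}")
```

Because the metric folds the displayed modulation spectrum and an observer threshold into one number, it is image- and display-dependent, which matches the study's argument for image-dependent quality measures.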

