image quality measure
Recently Published Documents


TOTAL DOCUMENTS

71
(FIVE YEARS 0)

H-INDEX

11
(FIVE YEARS 0)

2020 ◽  
Vol 13 (6) ◽  
pp. 460-471
Author(s):  
Ahmed Hashim ◽  
Hazim Daway ◽  
Hana kareem ◽  
...  

Haze degrades image quality, so the quality of hazy images must be estimated. In this paper, we introduce a new no-reference method for measuring the quality of hazy images based on color saturation: we compute the probability of the saturation component. This work also includes a subjective study measuring image quality through human perception. The proposed method is compared with other measures, namely entropy, the Naturalness Image Quality Evaluator (NIQE), Haze Distribution Map based Haze Assessment (HDMHA), and a no-reference assessment based on Transmission Component Estimation (TCE). This is done by calculating the correlation coefficient between each no-reference measure and the subjective scores; the results show that the proposed method achieves high correlation coefficients: Pearson 0.8923, Kendall 0.7170, and Spearman 0.8960. The image database used in this work consists of 70 hazy images captured with a special device designed to capture hazy scenes. The experiment on the haze database is consistent with the subjective experiment.
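The abstract's pipeline (a saturation-derived objective score, validated by correlating it with subjective ratings) can be sketched as follows. This is an illustrative toy, not the paper's exact measure: the saturation statistic here is a plain mean rather than the paper's probability model, and the score/rating arrays are hypothetical data.

```python
import numpy as np

def saturation_probability(rgb):
    """Toy saturation score: mean HSV-style saturation in [0, 1].
    Hazy images tend to have low color saturation, so a lower score
    suggests heavier haze.  (Illustrative; not the paper's exact measure.)"""
    rgb = rgb.astype(float)
    mx = rgb.max(axis=-1)
    mn = rgb.min(axis=-1)
    return np.where(mx > 0, (mx - mn) / np.maximum(mx, 1e-8), 0.0).mean()

def pearson(x, y):
    """Pearson linear correlation coefficient."""
    return np.corrcoef(x, y)[0, 1]

def spearman(x, y):
    """Spearman rank correlation: Pearson on the rank-transformed data."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    return pearson(rx, ry)

# Hypothetical objective scores and subjective (human) ratings
objective = np.array([0.81, 0.62, 0.55, 0.40, 0.33, 0.21])
subjective = np.array([4.7, 4.1, 3.6, 2.9, 2.4, 1.8])
```

A high Pearson/Spearman value between the two arrays is what the paper reports (0.8923 and 0.8960 respectively) as evidence that the objective measure tracks human perception.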


Author(s):  
Akhilesh Verma ◽  
Vijay Kumar Gupta ◽  
Savita Goel

Background: In recent years, fingerprint presentation attack detection (FPAD) proposals have appeared in a variety of forms. A closed-set approach uses a pattern-classification technique that best suits a specific context and goal; an open-set approach works in a wider context and is relatively robust to new fabrication materials and independent of sensor type. In both cases results were promising but not very generalizable, because unseen conditions did not fit the methods used. Clearly, the two key challenges for an FPAD system, sensor interoperability and robustness to new fabrication materials, have not been addressed to date.
Objective: To address these challenges, a liveness detection model is proposed that uses live samples, a transient liveness factor, and a one-class CNN.
Methods: In our architecture, liveness is predicted by a fusion rule: score-level fusion of two decisions. First, 'n' high-quality live samples are trained for quality. We have observed that fingerprint liveness information is transitory in nature, and variation across different live samples is natural; thus each live sample carries 'transient liveness' (TL) information. We use a no-reference (NR) image quality measure (IQM) as the transient value corresponding to each live sample, and a consensus agreement is collectively reached over transient values to flag adversarial input. In addition, live samples at the server are trained, with augmented inputs, on a one-class classifier to detect outliers. The fusion rule then combines the consensus agreement with the appropriately characterized negative cases (outliers) to predict liveness.
Results: Our approach uses only 30 high-quality live samples, out of the 90 images available in the dataset, to reduce learning time. We used the Time Series images from the LivDet 2015 competition, which contain 90 live images and 45 spoof images per person, made from Bodydouble, Ecoflex, and Playdoh. The fusion rule achieves 100% accuracy in recognizing live samples as live.
Conclusion: We have presented an architecture for a liveness server that extracts and updates the transient liveness factor. The work described here is a significant step toward a generalized and reproducible process, with a view toward the universal scheme needed today. The proposed TLF approach rests on a solid presumption: because it incorporates a wider scope and context, it should address dataset heterogeneity. Similar results on other datasets are under validation. Implementation seems difficult now, but it offers several advantages when carried out during the transformative process.
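The score-level fusion described in the Methods section can be sketched as below. All thresholds, the distance-to-centroid stand-in for the one-class CNN, and the k-sigma consensus rule are assumptions made for illustration; the paper's actual components are an NR-IQM and a trained one-class CNN.

```python
import numpy as np

def consensus_score(iqm_value, live_iqms, k=2.0):
    """Consensus agreement on the transient (IQM) value: accept the probe
    (score 1.0) if it lies within k standard deviations of the trained
    live-sample IQM distribution, else reject (0.0).  Hypothetical rule."""
    mu, sigma = np.mean(live_iqms), np.std(live_iqms)
    return 1.0 if abs(iqm_value - mu) <= k * sigma else 0.0

def one_class_score(feature, live_features, radius=1.5):
    """Toy one-class decision: inlier if the probe feature lies within
    `radius` of the live-feature centroid (a stand-in for the one-class
    CNN trained on augmented live samples)."""
    centroid = np.mean(live_features, axis=0)
    return 1.0 if np.linalg.norm(feature - centroid) <= radius else 0.0

def fuse(consensus, one_class, w=0.5, threshold=0.75):
    """Score-level fusion: weighted sum of the two decisions; with the
    defaults, both decisions must agree for a 'live' verdict."""
    return w * consensus + (1 - w) * one_class >= threshold
```

With the default threshold of 0.75 and equal weights, the fused verdict is "live" only when both the consensus check and the one-class check accept the probe.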


2020 ◽  
Vol 167 ◽  
pp. 404-414
Author(s):  
Besma Sadou ◽  
Atidel Lahoulou ◽  
Toufik Bouden

2018 ◽  
Vol 37 (2) ◽  
pp. 105 ◽  
Author(s):  
Kanjar De ◽  
Masilamani V

Over the years, image quality assessment has been one of the active areas of research in image processing. Distortion in images can be caused by various sources such as noise, blur, transmission-channel errors, and compression artifacts. Distortions can occur during image acquisition (blur, noise), image compression (ringing and blocking artifacts), or transmission. A single image can be distorted by multiple sources, and assessing the quality of such images is an extremely challenging task. The human visual system can easily judge image quality in such cases, but for a computer algorithm the task of quality assessment is very difficult. In this paper, we propose a new no-reference image quality assessment method for images corrupted by more than one type of distortion. The proposed technique is compared with the best-known framework for quality assessment of multiply distorted images, and with standard state-of-the-art full-reference and no-reference image quality assessment techniques.
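The multiply-distorted setting the abstract describes can be reproduced in a few lines: blur an image, then add noise, and observe how a simple no-reference cue responds. The Laplacian-variance sharpness cue below is a common illustrative statistic, not the paper's proposed method.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def laplacian_variance(img):
    """No-reference sharpness cue: variance of a 5-point discrete Laplacian.
    Lower values suggest blur; one of several cues a multi-distortion
    NR-IQA model could combine (illustrative, not the paper's method)."""
    lap = (-4.0 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return lap.var()

rng = np.random.default_rng(0)
sharp = rng.random((64, 64))

# Multiply-distorted version: 3x3 box blur followed by additive Gaussian
# noise, mimicking an acquisition-then-transmission distortion chain
kernel = np.ones((3, 3)) / 9.0
blurred = (sliding_window_view(sharp, (3, 3)) * kernel).sum(axis=(-1, -2))
distorted = blurred + rng.normal(0.0, 0.01, blurred.shape)
```

A single cue like this is exactly why the multiply-distorted case is hard: the blur lowers the Laplacian variance while the added noise raises it, so the two distortions partially mask each other.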


2017 ◽  
Vol 7 (6) ◽  
pp. 2277-2281 ◽  
Author(s):  
H. T. R. Kurmasha ◽  
A. F. H. Alharan ◽  
C. S. Der ◽  
N. H. Azami

An edge-based image quality measure (IQM) for assessing histogram equalization (HE)-based contrast enhancement techniques has been proposed that outperforms the Absolute Mean Brightness Error (AMBE) and entropy, the IQMs most commonly used to evaluate HE-based techniques, as well as two prominent fidelity-based IQMs, the Multi-Scale Structural Similarity (MSSIM) and Information Fidelity Criterion (IFC) measures. The statistical evaluation results show that the edge-based IQM, which was designed to detect noise-artifact distortion, achieves a Pearson Correlation Coefficient (PCC) > 0.86, while the others correlate poorly or only fairly with human opinion under Human Visual Perception (HVP). Based on HVP, this paper proposes an enhancement to the classic edge-based IQM that takes into account brightness-saturation distortion, the most prominent distortion in HE-based contrast enhancement techniques. The enhanced measure is tested and found to correlate significantly well (PCC > 0.87, Spearman rank-order correlation coefficient (SROCC) > 0.92, Root Mean Squared Error (RMSE) < 0.1054, and Outlier Ratio (OR) = 0%).
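The two ingredients the abstract combines, an edge-based quality cue and a brightness-saturation (clipping) penalty, can be sketched as below. The combination and its `alpha` weighting are hypothetical; the paper's exact formulation is not reproduced here.

```python
import numpy as np

def saturation_ratio(img, low=0.02, high=0.98):
    """Fraction of pixels clipped near black or white: a proxy for the
    brightness-saturation distortion that aggressive HE can introduce.
    `img` is a grayscale array in [0, 1]; thresholds are assumptions."""
    return np.mean((img <= low) | (img >= high))

def edge_strength(img):
    """Mean gradient magnitude as a crude edge-based quality cue."""
    gy, gx = np.gradient(img)
    return np.mean(np.hypot(gx, gy))

def edge_iqm(enhanced, alpha=1.0):
    """Toy combined score: edge strength penalized by clipping.  A larger
    score is better; `alpha` trades off the two terms (hypothetical)."""
    return edge_strength(enhanced) - alpha * saturation_ratio(enhanced)
```

On an over-stretched image, the clipping penalty dominates and the combined score drops, which is the behavior the proposed enhancement is designed to capture.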


2017 ◽  
Vol 2017 (12) ◽  
pp. 52-58 ◽  
Author(s):  
Arie Shaus ◽  
Shira Faigenbaum-Golovin ◽  
Barak Sober ◽  
Eli Turkel
