A model grounded in natural scene statistics predicts human performance with both natural and artificial stimuli

2018 ◽  
Vol 18 (10) ◽  
pp. 340
Author(s):  
Benjamin Chin ◽  
Johannes Burge

2019 ◽
Author(s):  
Seha Kim ◽  
Johannes Burge

Abstract: Visual systems estimate the three-dimensional (3D) structure of scenes from information in two-dimensional (2D) retinal images. Visual systems use multiple sources of information to improve the accuracy of these estimates, including statistical knowledge of the probable spatial arrangements of natural scenes. Here, we examine how 3D surface tilts are spatially related in real-world scenes, and show that humans pool information across space when estimating surface tilt in accordance with these spatial relationships. We develop a hierarchical model of surface tilt estimation that is grounded in the statistics of tilt in natural scenes and images. The model computes a global tilt estimate by pooling local tilt estimates within an adaptive spatial neighborhood. The spatial neighborhood in which local estimates are pooled changes according to the value of the local estimate at a target location. The hierarchical model provides more accurate estimates of ground-truth tilt in natural scenes and provides a better account of human performance than the local estimates alone. Taken together, the results imply that the human visual system pools information about surface tilt across space in accordance with natural scene statistics.
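The adaptive pooling step described in the abstract can be sketched minimally in Python. The mapping from a local tilt estimate to a neighborhood size (`radius_fn` below) is a hypothetical stand-in for the statistics-derived neighborhoods the model actually uses; tilt is treated as a circular variable so that averaging respects wrap-around.

```python
import numpy as np

def circular_mean_deg(angles_deg):
    """Mean of angles in degrees, respecting wrap-around at 360."""
    rad = np.deg2rad(angles_deg)
    return np.rad2deg(np.arctan2(np.sin(rad).mean(), np.cos(rad).mean())) % 360

def pooled_tilt_estimate(local_tilt, row, col, radius_fn):
    """Global tilt estimate at (row, col): pool local estimates within a
    square neighborhood whose size depends on the local estimate at the
    target location, per the hierarchical scheme described above."""
    r = radius_fn(local_tilt[row, col])  # adaptive neighborhood size
    r0, r1 = max(0, row - r), min(local_tilt.shape[0], row + r + 1)
    c0, c1 = max(0, col - r), min(local_tilt.shape[1], col + r + 1)
    return circular_mean_deg(local_tilt[r0:r1, c0:c1].ravel())

def radius_fn(tilt_deg):
    """Hypothetical rule for illustration only: pool more widely when the
    local estimate is near a cardinal tilt (0 or 90 degrees)."""
    return 3 if min(tilt_deg % 90, 90 - tilt_deg % 90) < 10 else 1
```

In the full model, the neighborhood for each local tilt value would be derived from measured natural-scene tilt statistics rather than hand-set as here.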


2014 ◽  
Vol 23 (1) ◽  
pp. 450-465 ◽  
Author(s):  
Kwanghyun Lee ◽  
Anush Krishna Moorthy ◽  
Sanghoon Lee ◽  
Alan Conrad Bovik

2009 ◽  
Vol 26 (1) ◽  
pp. 35-49 ◽  
Author(s):  
THORSTEN HANSEN ◽  
KARL R. GEGENFURTNER

Abstract: Form vision is traditionally regarded as processing primarily achromatic information. Previous investigations into the statistics of color and luminance in natural scenes have claimed that luminance and chromatic edges are not independent of each other and that any chromatic edge most likely occurs together with a luminance edge of similar strength. Here we computed the joint statistics of luminance and chromatic edges in over 700 calibrated color images from natural scenes. We found that isoluminant edges exist in natural scenes and were not rarer than pure luminance edges. Most edges combined luminance and chromatic information but to varying degrees such that luminance and chromatic edges were statistically independent of each other. Independence increased along successive stages of visual processing from cones via postreceptoral color-opponent channels to edges. The results show that chromatic edge contrast is an independent source of information that can be linearly combined with other cues for the proper segmentation of objects in natural and artificial vision systems. Color vision may have evolved in response to the natural scene statistics to gain access to this independent information.
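The joint-statistics analysis amounts to asking whether the joint distribution of luminance and chromatic edge strengths factorizes into the product of its marginals. A minimal sketch follows; gradient magnitude as the edge-strength measure and mutual information as the independence measure are illustrative choices, not necessarily the authors' exact pipeline.

```python
import numpy as np

def edge_strength(channel):
    """Gradient magnitude as a simple edge-strength measure."""
    gy, gx = np.gradient(channel.astype(float))
    return np.hypot(gx, gy)

def joint_edge_histogram(lum, chrom, bins=8):
    """Joint probability table of luminance and chromatic edge strengths.
    Under statistical independence this table factorizes into its marginals."""
    joint, _, _ = np.histogram2d(edge_strength(lum).ravel(),
                                 edge_strength(chrom).ravel(), bins=bins)
    return joint / joint.sum()

def mutual_information(joint):
    """Mutual information (bits) of a joint probability table; it is zero
    exactly when the two edge maps are statistically independent."""
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())
```

For independent luminance and chromatic channels the mutual information of the joint edge histogram stays near zero, whereas correlated channels drive it up toward the marginal entropy.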


Image quality estimation is an area that demands considerable attention from researchers. Many recent image quality assessment algorithms rely on computing fixed measurements from the image or on comparison with the original pristine image. Here, we propose extracting a set of specific features from the image and processing them to obtain an objective quality score. This work elaborates a detailed inspection of the behaviour of these highly specific image features, extracted through a low-complexity mathematical procedure from collections of good-quality and low-quality natural scene statistics images available in the LIVE dataset. Our results are compared with subjective opinion scores and shown to be accurate. The results are presented both statistically and graphically so that the nature of image quality can be grasped readily. The proposed feature set is thus shown to be sufficient for assessing the quantitative quality of any natural image.
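The abstract does not specify the feature set, so the sketch below uses a standard natural-scene-statistics feature as a stand-in: mean-subtracted contrast-normalized (MSCN) coefficients, popularized by BRISQUE. Their distribution for pristine natural images is close to Gaussian and deviates under distortion, so summary statistics of the MSCN field serve as quality features.

```python
import numpy as np

def mscn(image, ksize=7, sigma=7 / 6, eps=1.0):
    """Mean-subtracted contrast-normalized (MSCN) coefficients, a common
    natural-scene-statistics feature (illustrative, not necessarily the
    feature set used in the work above)."""
    ax = np.arange(ksize) - ksize // 2
    w = np.exp(-ax**2 / (2 * sigma**2))
    w /= w.sum()

    def blur(im):
        # Separable Gaussian filtering: rows first, then columns.
        tmp = np.apply_along_axis(lambda v: np.convolve(v, w, mode="same"), 1, im)
        return np.apply_along_axis(lambda v: np.convolve(v, w, mode="same"), 0, tmp)

    mu = blur(image)
    var = blur(image**2) - mu**2          # may dip slightly below 0 numerically
    sigma_map = np.sqrt(np.clip(var, 0, None))
    return (image - mu) / (sigma_map + eps)

def nss_features(image):
    """Summary statistics of the MSCN field, usable as quality features."""
    m = mscn(image.astype(float))
    mu, var = m.mean(), m.var()
    return {"mean": mu, "var": var,
            "skew": ((m - mu)**3).mean() / (var**1.5 + 1e-12),
            "kurt": ((m - mu)**4).mean() / (var**2 + 1e-12)}
```

A regressor trained on such features against the LIVE subjective scores would then map the feature vector to an objective quality score.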


2015 ◽  
Vol 15 (12) ◽  
pp. 1287
Author(s):  
Emily Cooper ◽  
Anthony Norcia
