Mechanisms and Meaning of de Vries–Rose Adaptation
‘Square-root’ or ‘de Vries–Rose’ light adaptation is observed over a substantial luminance range in human foveal vision. The classical interpretation is that a detector (presumably in the brain) discriminates the neural signal evoked by the stimulus from the neural noise evoked by quantum fluctuations. It is known, however, that the retina may adjust its gain in inverse proportion to the square root of mean luminance, as observed, e.g., in cat retinal ganglion cells under scotopic or mesopic adaptation. This kind of gain change is approximated even by the primary visual cells, the rods and cones, in at least some vertebrate species, up to luminances producing 10³–10⁴ photoisomerisations per photoreceptor cell per second. Is square-root adaptation in fact mainly an expression of an inverse-square-root gain in retinal cells? We investigated the roles of gain and noise in human foveal detection of 0.25 deg incremental spots presented for 50 ms on 5 deg steady backgrounds ranging from −0.25 to 2.35 log td, by measuring the effects of pixel noise added to the stimulus and background. The results were consistent with the hypothesis that square-root adaptation mainly reflects gain changes, the signal being detected against a constant level of neural noise. They were not consistent with the idea that signals proportional to stimulus intensity are detected against a noise that increases in proportion to quantum fluctuations. Thus they do not support a simple interpretation of the de Vries–Rose law. Still, an inverse-square-root retinal gain may in an evolutionary sense be seen as an adaptation to quantum fluctuations, in view of its functional consequences: (1) output noise stays constant, independent of luminance level; (2) light signals of constant statistical significance are encoded by visual responses of constant size.
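The two functional consequences named at the end can be illustrated with a toy Poisson simulation (not from the study itself; the luminance values, trial counts, and the criterion d′ = 2 are arbitrary illustrative choices): when the photon catch fluctuates with Poisson statistics (SD = √L) and is scaled by a gain proportional to 1/√L, the scaled output noise is the same at every mean level, and an increment of fixed statistical significance evokes a scaled response of fixed size.

```python
import numpy as np

rng = np.random.default_rng(0)

def gain(L):
    """Hypothetical retinal gain, inversely proportional to sqrt(mean luminance)."""
    return 1.0 / np.sqrt(L)

def output_noise_sd(L, n_trials=200_000):
    """SD of the gain-scaled photon catch for a Poisson input with mean L."""
    catch = rng.poisson(L, size=n_trials)   # quantum fluctuations: SD = sqrt(L)
    return float(np.std(catch * gain(L)))   # scaled by 1/sqrt(L): SD stays near 1

for L in (100.0, 1_000.0, 10_000.0):
    # (1) output noise is ~constant (close to 1) regardless of luminance level
    print(f"L = {L:7.0f}  output noise SD = {output_noise_sd(L):.3f}")
    # (2) an increment of fixed significance, dL = 2*sqrt(L) (d' = 2),
    #     evokes a gain-scaled response of constant size (= 2)
    dL = 2.0 * np.sqrt(L)
    print(f"             response to d'=2 increment = {dL * gain(L):.3f}")
```

The point of the sketch is only the scaling relation: because the noise SD grows as √L while the gain shrinks as 1/√L, their product is level-independent, which is exactly why signals of constant statistical significance map onto responses of constant size.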