Incorporating neuronal fatigue in deep neural networks captures dynamics of adaptation in neurophysiology and perception
Abstract
Adaptation is a fundamental property of the visual system that molds how an object is processed and perceived in its temporal context. It is unknown whether adaptation requires a circuit-level implementation or whether it emerges from neuronally intrinsic biophysical processes. Here we combined neurophysiological recordings, psychophysics, and deep convolutional neural network computational models to test the hypothesis that a neuronally intrinsic, biophysically plausible fatigue mechanism is sufficient to account for the hallmark properties of adaptation. The proposed model captured neural signatures of adaptation, including repetition suppression and novelty detection. At the behavioral level, the proposed model was consistent with perceptual aftereffects. Furthermore, adapting to prevailing but irrelevant inputs improved object recognition, and the adaptation computations could be learned in a network trained to maximize recognition performance. These results show that an intrinsic fatigue mechanism can account for key neurophysiological and perceptual properties and enhance visual processing by incorporating temporal context.
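To make the idea concrete, below is a minimal sketch of one way a neuronally intrinsic fatigue mechanism could be added to a network unit: each unit carries a suppression state that grows with its own recent output and decays exponentially, and this state is subtracted from the drive before the nonlinearity. The function name `fatigue_relu` and the parameter values `alpha` and `beta` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def fatigue_relu(inputs, alpha=0.5, beta=0.7):
    """ReLU unit with an intrinsic fatigue (suppression) state.

    The state s decays exponentially (beta) and grows in proportion
    to the unit's own recent output (alpha); it is subtracted from
    the drive before the nonlinearity. Parameters are illustrative.

    inputs: array of shape (T, n_units), one drive per time step.
    Returns responses of the same shape.
    """
    s = np.zeros(inputs.shape[1])      # suppression state, starts relaxed
    responses = []
    for x in inputs:
        r = np.maximum(x - s, 0.0)     # fatigued ReLU response
        s = beta * s + alpha * r       # state tracks the unit's own activity
        responses.append(r)
    return np.array(responses)

# Repetition suppression: the same drive presented repeatedly
# evokes a progressively weaker response.
drive = np.ones((5, 1))
r = fatigue_relu(drive)
```

Because the suppression state is private to each unit, repetition suppression and a recovered response to a novel (stronger or different) input fall out of the unit dynamics alone, with no circuit-level machinery, which is the hypothesis the abstract describes.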