Target modality determines eye-head coordination in nonhuman primates: implications for gaze control
We studied eye-head coordination in nonhuman primates with acoustic targets after finding that they are unable to make accurate saccadic eye movements to targets of this type with the head restrained. Three male macaque monkeys, experienced in localizing sounds for rewards by directing their gaze to the perceived location of the source, served as subjects. Visual targets were used as controls. The experimental sessions were configured to minimize the chances that the subject could predict the modality of the target, as well as its location and time of presentation. The data show that eye and head movements are coordinated differently to generate gaze shifts to acoustic targets. Chiefly, the head invariably started to move before the eye and contributed more to the gaze shift. These differences were most striking for gaze shifts of <20–25° in amplitude, to which the head contributes very little or not at all when the target is visual. Thus acoustic and visual targets trigger gaze shifts with different eye-head coordination. This, coupled with anatomical evidence implicating the superior colliculus as the link between auditory spatial processing and the motor system, suggests that separate gaze-control signals are likely generated within this midbrain structure.