Coupling of saccade plans to endogenous attention during urgent choices

eLife. 2024 Nov 4;13:RP97883. doi: 10.7554/eLife.97883.

Abstract

The neural mechanisms that willfully direct attention to specific locations in space are closely related to those for generating targeting eye movements (saccades). However, the degree to which the voluntary deployment of attention to a location necessarily activates a corresponding saccade plan remains unclear. One problem is that attention and saccades are both automatically driven by salient sensory events; another is that the underlying processes unfold within just tens of milliseconds. Here, we use an urgent task design to resolve the evolution of a visuomotor choice on a moment-by-moment basis while independently controlling the endogenous (goal-driven) and exogenous (salience-driven) contributions to performance. Human participants saw a peripheral cue and, depending on its color, either looked at it (prosaccade) or looked at a diametrically opposite, uninformative non-cue (antisaccade). Varying the luminance of the stimuli allowed the exogenous contributions to be cleanly dissociated from the endogenous process guiding the choice over time. According to the measured time courses, generating a correct antisaccade requires about 30 ms more processing time than generating a correct prosaccade based on the same perceptual signal. The results indicate that saccade plans elaborated during fixation are biased toward the location where attention is endogenously deployed, but the coupling is weak and can be willfully overridden very rapidly.
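
The abstract's moment-by-moment analysis rests on relating choice accuracy to the time available to process the cue before the saccade is triggered. As a rough illustration only, the sketch below computes such an accuracy-versus-processing-time curve from hypothetical trial data; the function name (tachometric_curve), the bin sizes, the trial-count threshold, and the simulated 100 ms transition are all illustrative assumptions, not values taken from the study.

```python
import numpy as np

def tachometric_curve(rt_ms, gap_ms, correct, bin_width=20, bin_step=5):
    """Fraction of correct choices as a function of raw processing time.

    rt_ms   : reaction time on each trial (ms), measured from the go signal
    gap_ms  : delay between the go signal and cue onset on each trial (ms)
    correct : boolean array, True if the saccade went to the required target

    Raw processing time is the time actually available to view the cue
    before the saccade is committed: rPT = RT - gap.
    """
    rpt = np.asarray(rt_ms, dtype=float) - np.asarray(gap_ms, dtype=float)
    correct = np.asarray(correct, dtype=bool)

    centers = np.arange(rpt.min(), rpt.max() + bin_step, bin_step)
    accuracy = np.full(centers.shape, np.nan)
    for i, c in enumerate(centers):
        in_bin = np.abs(rpt - c) <= bin_width / 2
        if in_bin.sum() >= 20:  # require enough trials per bin (assumed cutoff)
            accuracy[i] = correct[in_bin].mean()
    return centers, accuracy

# Hypothetical usage with simulated trials (not the study's data):
rng = np.random.default_rng(0)
n = 5000
gap = rng.integers(0, 250, n)               # urgent-task gap durations (ms)
rt = gap + rng.normal(220, 60, n).clip(90)  # reaction times after the go signal
rpt = rt - gap
p_correct = np.where(rpt < 100, 0.5, 0.9)   # guesses with little cue-viewing time
correct = rng.random(n) < p_correct
centers, acc = tachometric_curve(rt, gap, correct)
```

In a curve of this kind, trials with short processing times yield chance performance (guesses), and accuracy rises once enough of the cue has been seen; comparing where the rise occurs for prosaccades versus antisaccades is one way a processing-time difference such as the reported ~30 ms could be read out.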

Keywords: antisaccades; capture; decision making; human; mental chronometry; neuroscience; salience; visual attention.

Plain language summary

You are attending a talk at a conference, eyes straight ahead and fixed on the speaker… yet you may in fact also be covertly monitoring your phone, hoping for a long-awaited message to flash on the screen. This ability to focus on something without directly looking at it is called spatial attention. It plays an essential role in everyday tasks, such as spotting keys on a cluttered desk or noticing when a traffic light changes.

Overlapping brain circuits control spatial attention and eye movements, creating tight links between the two processes. For example, shifting your gaze towards a specific location automatically leads you to pay at least partial attention to what unfolds at that spot. Whether the reverse is true, however, is less clear. In other words: when we are paying attention to something without looking at it, is our brain set to move our eyes towards that location?

To explore this question, Goldstein et al. designed a visual task that allowed them to track human participants’ attention and eye movements moment by moment, and to unpick the various factors affecting these processes. The volunteers fixed their gaze on the center of a screen, knowing that they also needed to pay attention to a certain location at the periphery where a cue was set to appear. The color of the cue determined whether the participants then needed to shift their gaze towards or away from it – for example, they were instructed to look directly at a green cue but away from a magenta one.

Analyses of the participants’ responses showed that they needed about 30 milliseconds less time to program an eye movement toward the cue – that is, to shift their gaze towards the location that they were already covertly monitoring. This difference in processing time suggests that eye movements are biased towards the location on which attention is directed, but that this preference can still be overridden quickly.

By refining our understanding of the mechanisms underpinning attention, the findings by Goldstein et al. may help us better understand conditions like attention deficit hyperactivity disorder, where the brain struggles to engage and disengage with stimuli effectively.

MeSH terms

  • Adult
  • Attention* / physiology
  • Choice Behavior* / physiology
  • Cues
  • Female
  • Humans
  • Male
  • Photic Stimulation
  • Saccades* / physiology
  • Visual Perception / physiology
  • Young Adult