Summary: A new study shows that our brain’s attention system first prepares broadly, then zooms in on specific details within fractions of a second. Using EEG and machine learning, researchers tracked how people focused on either the color or movement of dots before they appeared.
They found that general features were registered in about 240 milliseconds, while specific details took closer to 400 milliseconds. This layered process reveals how the brain organizes attention and could shed light on disorders where attention is disrupted.
Key Facts
- Fast Sequence: Brain detects general categories first, then specifics within milliseconds.
- EEG + AI: Machine learning separated brain activity for general vs. specific features.
- Clinical Potential: Could inform understanding of ADHD, autism, and other attention disorders.
Source: UC Davis
How we focus our attention before we even see an object matters. For example, when we look for something moving in the sky, our expectations are very different depending on whether the object is a bird flying past or a baseball coming straight at us.
But it’s unclear whether our brain’s attention focuses first on a broad characteristic of the anticipated object, such as movement, or a specific feature — such as the direction of movement up or down.
Researchers from the Center for Mind and Brain at the University of California, Davis, addressed this by analyzing electrical brain activity with machine-learning methods while human volunteers prepared to see colored dots moving on a screen. The study found that the brain’s attention focus starts with a broad category, then narrows down to the specific feature of interest.
The study was published Aug. 19 in The Journal of Neuroscience.
“Our study tells us that our brains first prepare to focus attention by activating neurons representing the broad category of the anticipated object and then quickly sharpens that focus,” said George R. Mangun, a Distinguished Professor of psychology and neurology and co-director of the UC Davis Center for Mind and Brain.
“This means that the brain’s attention mechanisms are organized in a hierarchy such that it prepares for perceiving a stimulus by narrowing the focus of our attention over time.”
Clocking brain activity
Researchers combined electroencephalogram, or EEG, data with eye tracking and machine learning to study “anticipatory attention,” which is attention that enables a person to prepare to perceive upcoming sensory events. The EEG data reflects the brain’s electrical activity down to the millisecond using electrodes worn on the scalp.
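The kind of analysis described above can be sketched very loosely in code. The following is a toy illustration on simulated data, not the authors' pipeline: the sampling rate, channel count, classifier (logistic regression), and alpha-power features are all assumptions for demonstration, with only the alpha band (8–12 Hz) taken from the published study.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
fs = 250                                  # sampling rate in Hz (assumed)
n_trials, n_channels, n_samples = 200, 32, 250

# Simulated single-trial EEG: noise plus a condition-dependent 10 Hz (alpha) component
t = np.arange(n_samples) / fs
labels = rng.integers(0, 2, n_trials)     # 0 = attend color, 1 = attend motion
eeg = rng.standard_normal((n_trials, n_channels, n_samples))
alpha_wave = np.sin(2 * np.pi * 10 * t)
topography = rng.standard_normal(n_channels)   # how the alpha signal maps onto channels
for i in range(n_trials):
    # The attended condition modulates alpha amplitude across the scalp
    eeg[i] += (0.5 + labels[i]) * np.outer(topography, alpha_wave)

# Band-pass filter each trial to the alpha band (8-12 Hz)
b, a = butter(4, [8, 12], btype="bandpass", fs=fs)
alpha_eeg = filtfilt(b, a, eeg, axis=-1)

# Feature per trial: alpha power in each channel; then decode the attended condition
power = (alpha_eeg ** 2).mean(axis=-1)
acc = cross_val_score(LogisticRegression(max_iter=1000), power, labels, cv=5).mean()
print(f"decoding accuracy: {acc:.2f}")
```

If the decoder performs above chance (0.5), the scalp pattern of alpha power carries information about what the participant is preparing to attend to, which is the core logic of this kind of multivariate decoding.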
The study took place in 2024 using 25 participants between 19 and 39 years of age.
The research team measured how long it took the brain to get ready to pay attention to colored dots moving on the screen. The goal was to learn whether the brain’s attention first prepared for a broad feature category of an object of interest, such as color or movement, before attention could be narrowed to a specific feature, such as a specific color or direction of movement.
The timer began with a blank screen when researchers cued participants to look for only a blue or green dot, or for a dot moving up or down. The timer stopped when the dots appeared.
“When attention is directed to the color of the moving dots, it suppresses attention to the direction of motion, and vice versa,” said Sreenivasan Meyyappan, an assistant project scientist at the Center for Mind and Brain and the study’s lead author.
“This broad focus of attention is then narrowed further to suppress the irrelevant colors as well, supporting processing of the specific color or motion of interest.”
The machine-learning software separated brain activity for each of the general and specific features. It revealed a difference measured in milliseconds. A millisecond is a thousandth of a second.
Anticipatory attention to the dot’s general category — color or direction of movement — took 240 milliseconds on average to establish in the brain. Attention to the dot’s specific feature — blue or green, or up or down — took longer, averaging 400 milliseconds.
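The onset comparison described above can be illustrated with a small sketch. The accuracy curves below are synthetic and the threshold is arbitrary; the only part drawn from the study is the idea of finding when each decoding time course first rises above chance, with the broad dimension leading the specific attribute.

```python
import numpy as np

# Hypothetical decoding-accuracy time courses (chance = 0.5), sampled every 10 ms.
# Shaped to mimic the reported pattern: the broad dimension (color vs. motion)
# becomes decodable earlier than the specific attribute (e.g., blue vs. green).
times_ms = np.arange(0, 1000, 10)

def accuracy_curve(rise_ms):
    # Smooth rise from chance toward 0.7, centered on rise_ms (illustrative shape)
    return 0.5 + 0.2 / (1 + np.exp(-(times_ms - rise_ms) / 30))

dimension_acc = accuracy_curve(240)   # broad category (color vs. motion)
attribute_acc = accuracy_curve(400)   # specific feature (blue vs. green, up vs. down)

def onset(accuracy, threshold=0.55):
    """First time point at which decoding accuracy exceeds the threshold."""
    above = np.flatnonzero(accuracy > threshold)
    return int(times_ms[above[0]]) if above.size else None

print(onset(dimension_acc), onset(attribute_acc))
```

With these synthetic curves, the dimension-level onset precedes the attribute-level onset by the same 160 ms gap reported in the study; in real data, the curves would be noisy and the onset estimate would need statistical safeguards (e.g., requiring a sustained run of above-chance time points).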
“The control systems involved in attention are broadly tuning the brain first, and then narrowing it down,” said Mangun. “It’s like a pilot flying a plane toward Europe and then, toward the end, zooming in on Rotterdam and not Berlin.”
Building a more complete picture of how the brain works can provide important insights related to brain health, said Mangun. For example, future research might find that people with disordered attention, such as those with attention-deficit hyperactivity disorder or autism, experience delays in narrowing the focus of attention.
“Understanding more about how the brain focuses its attention would tell us what parts of the system are not operating properly and might lead to different perceptual or behavioral symptoms down the line, and therefore different treatment approaches,” said Mangun.
Mingzhou Ding, Distinguished Professor and J. Crayton Pruitt Family Professor of biomedical engineering at the University of Florida, is an additional co-author on this study.
Funding: The research was supported by the National Institutes of Health and the National Science Foundation.
About this attention and visual neuroscience research news
Author: Karen Nikos
Source: UC Davis
Contact: Karen Nikos – UC Davis
Image: The image is credited to Neuroscience News
Original Research: Closed access.
“Hierarchical Organization of Human Visual Feature Attention Control” by George R. Mangun et al. The Journal of Neuroscience
Abstract
Hierarchical Organization of Human Visual Feature Attention Control
Attention can be deployed in advance of visual stimuli based on features such as color or direction of motion.
This anticipatory feature-based attention involves top-down neural control signals from the frontoparietal network that bias visual cortex to enhance attended information and suppress distraction. For example, anticipatory attention control can enable effective selection based on stimulus color while ignoring distracting information about stimulus motion.
Anticipatory attention can also be focused more narrowly, for example, to select specific colors or motion directions that define task-relevant aspects of the stimuli. One important question that remains open is whether anticipatory attention control first biases broad feature dimensions such as color versus motion before biasing the specific feature attributes (e.g., blue vs. green).
To investigate this, we recorded EEG activity during a task where human participants of either sex were cued to either attend to a motion direction (up or down) or a color (blue or green) on a trial-by-trial basis.
Applying multivariate decoding approaches to the EEG alpha band activity (8-12 Hz) during attention control (cue-target interval), we observed significant decoding for both the attended dimensions (motion vs. color) and specific feature attributes (up vs. down; blue vs. green). Importantly, the temporal onset of the dimension-level biasing (motion vs. color) preceded that of the attribute-level biasing (up vs. down as well as blue vs. green).
These findings demonstrate that the top-down control of feature-based attention proceeds in a hierarchical fashion, first biasing the broad feature dimension, and then narrowing to the specific feature attribute.