AI co-pilot boosts noninvasive brain-computer interface by interpreting user intent

Using the AI-BCI system, a participant successfully completed the “pick-and-place” task moving four blocks with the assistance of AI and a robotic arm. Credit: Johannes Lee, Jonathan Kao, Neural Engineering and Computation Lab/UCLA

UCLA engineers have developed a wearable, noninvasive brain-computer interface system that utilizes artificial intelligence as a co-pilot to help infer user intent and complete tasks by moving a robotic arm or a computer cursor.

Published in Nature Machine Intelligence, the study shows that the interface achieves a new level of performance among noninvasive brain-computer interface, or BCI, systems. The advance could lead to a range of technologies that help people with limited physical capabilities, such as those with paralysis or neurological conditions, handle and move objects more easily and precisely.

The team developed custom algorithms to decode electroencephalography, or EEG—a method of recording the brain’s electrical activity—and extract signals that reflect movement intentions. They paired the decoded signals with a camera-based artificial intelligence platform that interprets user direction and intent in real time. The system allows individuals to complete tasks significantly faster than without AI assistance.
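 
The paper describes the AI as a copilot that shares control with the EEG decoder. As a rough illustration of how such shared autonomy might combine the two signals, here is a minimal Python sketch: the blending weight, the velocity and target names, and the linear blending rule are illustrative assumptions, not the authors' actual decoder or copilot.

```python
import numpy as np

def shared_autonomy_step(cursor_pos, eeg_velocity, ai_target, alpha=0.5, gain=1.0):
    """Blend the user's decoded EEG velocity with an AI copilot suggestion
    that points toward the copilot's inferred target.

    alpha = 0.0 -> pure EEG control; alpha = 1.0 -> pure AI control.
    All names and the convex-combination rule are assumptions for
    illustration, not the system described in the study.
    """
    # Direction the copilot would move: straight toward its inferred target.
    to_target = ai_target - cursor_pos
    dist = np.linalg.norm(to_target)
    ai_velocity = gain * to_target / dist if dist > 1e-6 else np.zeros_like(to_target)

    # Convex combination of user intent and copilot assistance.
    blended = (1.0 - alpha) * eeg_velocity + alpha * ai_velocity
    return cursor_pos + blended  # next cursor position

# Example: one control step with made-up values.
pos = np.array([0.0, 0.0])
eeg_vel = np.array([0.8, 0.1])   # velocity decoded from EEG (hypothetical)
target = np.array([5.0, 5.0])    # target inferred by the vision-based copilot (hypothetical)
print(shared_autonomy_step(pos, eeg_vel, target, alpha=0.5))
```

In a scheme like this, the weight controls how much assistance the copilot provides, which is one simple way to realize the "shared autonomy" the researchers describe.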

“By using artificial intelligence to complement brain-computer interface systems, we’re aiming for much less risky and invasive avenues,” said study leader Jonathan Kao, an associate professor of electrical and computer engineering at the UCLA Samueli School of Engineering.

“Ultimately, we want to develop AI-BCI systems that offer shared autonomy, allowing people with movement disorders, such as paralysis or ALS, to regain some independence for everyday tasks.”

State-of-the-art, surgically implanted BCI devices can translate brain signals into commands, but the benefits they currently offer are outweighed by the risks and costs associated with neurosurgery to implant them. More than two decades after they were first demonstrated, such devices are still limited to small pilot clinical trials.

Meanwhile, wearable and other external BCIs detect brain signals less reliably, which has limited their performance.

To address these limitations, the researchers tested their new noninvasive AI-assisted BCI with four participants—three without motor impairments and a fourth who was paralyzed from the waist down.

Participants wore a head cap to record EEG, and the researchers used custom decoder algorithms to translate these brain signals into movements of a computer cursor and robotic arm. Simultaneously, an AI system with a built-in camera observed the decoded movements and helped participants complete two tasks.

A paralyzed participant neurally controls a robotic arm to pick and place blocks. The neural control is noninvasive (no surgery) and is assisted by AI. Credit: Johannes Lee, Jonathan Kao, Neural Engineering and Computation Lab/UCLA

In the first task, participants were instructed to move a cursor on a computer screen to hit eight targets, holding the cursor on each for at least half a second. In the second task, they directed a robotic arm to move four blocks on a table from their original spots to designated positions.
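 
The hold requirement in the cursor task amounts to checking that the cursor stays inside a target for at least half a second. Here is a minimal sketch of such a dwell check; the target radius, sampling interval, and function names are assumptions for illustration, not parameters reported in the study.

```python
import numpy as np

def target_acquired(cursor_trace, target_center, target_radius=0.5,
                    dt=0.02, hold_time=0.5):
    """Return True if the cursor stayed within the target for at least
    `hold_time` consecutive seconds. `cursor_trace` is an (N, 2) array of
    positions sampled every `dt` seconds; all parameters are illustrative."""
    needed = int(round(hold_time / dt))  # consecutive in-target samples required
    streak = 0
    for pos in cursor_trace:
        if np.linalg.norm(pos - target_center) <= target_radius:
            streak += 1
            if streak >= needed:
                return True
        else:
            streak = 0
    return False
```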

All participants completed both tasks significantly faster with AI assistance. Notably, the paralyzed participant completed the robotic arm task in about six-and-a-half minutes with AI assistance, whereas without it, he was unable to complete the task.

The BCI deciphered electrical brain signals that encoded the participants’ intended actions. Using a computer vision system, the custom-built AI inferred the users’ intent—not their eye movements—to guide the cursor and position the blocks.

“Next steps for AI-BCI systems could include the development of more advanced co-pilots that move robotic arms with more speed and precision, and offer a deft touch that adapts to the object the user wants to grasp,” said co-lead author Johannes Lee, a UCLA electrical and computer engineering doctoral candidate advised by Kao.

“And adding in larger-scale training data could also help the AI collaborate on more complex tasks, as well as improve EEG decoding itself.”

The paper’s authors are all members of Kao’s Neural Engineering and Computation Lab. A member of the UCLA Brain Research Institute, Kao also holds faculty appointments in the Computer Science Department and the Interdepartmental Ph.D. Program in Neuroscience.

More information:
Brain–computer interface control with artificial intelligence copilots, Nature Machine Intelligence (2025). DOI: 10.1038/s42256-025-01090-y

Provided by
University of California, Los Angeles

