Mice have been taught to guide a computer cursor with brain activity

Researchers at the Sainsbury Wellcome Centre, University College London, have designed a brain-machine interface (BMI) that allows mice to learn to move a cursor to a target using only their brain activity.

The study sheds light on how the brain represents objects under its causal control. When a mouse operated the cursor, brain activity in its higher visual cortex was goal-directed and contained information about the animal’s intent.

This finding could eventually be applied to improve BMI design. Although BMIs have been in development for decades, they still have limitations: they are often invasive or time-consuming to learn.

“Brain-machine interfaces are devices that allow a person or animal to control a computer with their mind,” said Dr. Kelly Clancy, first author of the study, published in Neuron. “In humans, that could be controlling a robotic arm to pick up a cup of water, or moving a cursor on a computer screen to type a message with the mind. In animals, we use these devices as models to understand how to improve BMIs.”

The director of the Sainsbury Wellcome Centre, Professor Tom Mrsic-Flogel, added: “Currently, BMIs tend to be difficult for humans to use, and learning to operate a robotic arm, for example, takes a long time.”

“Once we understand the neural circuits supporting how intentional control is learned, which this work is beginning to shed light on, we hope it will become easier for people to use BMIs.”

Studying how causally controlled objects are represented in the brain has proven challenging, chiefly because of the difficulty of distinguishing active control from passive observation. With a BMI, the subject does not physically move, so there are no confounding motor signals and a cleaner comparison can be made.

In this study, the researchers used wide-field brain imaging. This technique allowed them to view the entire dorsal surface of the cortex while the animal used the device – learning to steer the cursor to a target to earn a reward – and to pinpoint the areas involved in the process. They found that visual cortical areas were engaged, including the parietal cortex, an area of the brain associated with intention in humans.
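The article does not describe the decoder the researchers used, but the closed-loop task it describes – brain activity is read out, a decoder turns it into cursor movement, and the trial ends with a reward when the cursor reaches the target – can be sketched in Python. Everything below is a hypothetical stand-in: the linear decoder, the simulated "goal-directed" firing rates, and the reward rule are illustrative assumptions, not the study's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

N_NEURONS = 20  # assumed population size (illustrative)
# Fixed linear readout mapping 20 firing rates to 2-D cursor velocity (assumption).
DECODER = rng.normal(size=(2, N_NEURONS)) * 0.05

def run_trial(target, max_steps=500, tol=0.5):
    """Drive a 2-D cursor with decoded 'neural' activity until it reaches the target."""
    cursor = np.zeros(2)
    for step in range(max_steps):
        # Simulated goal-directed activity: neurons whose readout points toward
        # the target fire more. This stands in for the learned intent signal.
        desired = target - cursor
        rates = DECODER.T @ desired + rng.normal(scale=0.1, size=N_NEURONS)
        cursor += DECODER @ rates  # decoder output moves the cursor
        if np.linalg.norm(cursor - target) < tol:
            return True, step + 1  # target reached: reward delivered
    return False, max_steps  # trial timed out

reached, steps = run_trial(target=np.array([5.0, -3.0]))
print(reached, steps)
```

The key property this illustrates is the closed loop: the animal (here, the simulated activity) only has to produce patterns that the decoder converts into movement toward the goal, with the cursor's position fed back as sensory input on every step.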

“Researchers have been studying the parietal cortex in humans for a long time. However, we didn’t necessarily expect this area to pop up in our unbiased screen of the mouse brain,” Clancy said. “There seems to be something special about the parietal cortex: it sits between the sensory and motor areas of the brain and may act as a relay between them.”

The learning task in this mouse model – discovering how brain activity maps onto sensory feedback – is analogous to the way humans learn to interact with the world. Our brains build models of how objects behave and act on them accordingly. Understanding how these rules are generated and updated in the brain could aid the development of human BMIs.

In 2018, a neurotechnology platform that uses machine learning to translate brain activity into control signals was named Innovation of the Year at the IET Innovation Awards. The ‘NeuroConcise’ technology can be embedded and hidden in standard headwear, providing the user with a high degree of interaction without physical movement.
