Brains can tune their navigation system without landmarks

Johns Hopkins research sheds new light on how mammals track their position and orientation while moving, revealing that visual motion cues alone allow the brain to adjust and recalibrate its internal map even in the absence of stable visual landmarks. The results appear in Nature Neuroscience.

“When you move through space, you have a lot of competing sensory information telling you where you are and how fast you are going, and your brain has to make sense of that,” said study co-leader Noah Cowan, professor of mechanical engineering at the Whiting School of Engineering and director of the Locomotion in Mechanical and Biological Systems (LIMBS) Laboratory. “Our study results demonstrate that, surprisingly, the brain can perform this continuous recalibration without having obvious external landmarks to tell us our position. The brain can adjust its internal sense of speed through its spatial map from clues solely from optic flow: the visual motion patterns that individuals perceive as they move through space.”

Cowan collaborated on the project with James Knierim, professor of neuroscience at the Krieger School of Arts and Sciences’ Zanvyl Krieger Mind/Brain Institute and the Kavli Neuroscience Discovery Institute at Johns Hopkins.

The researchers knew that, for example, when an individual walks through a tunnel covered in markings, their brain detects the speed at which the markings appear to move past, helping them estimate the distance traveled and their relative position in space. They set out to determine whether changing the speed of the markings passing the walker, or removing them entirely, would significantly affect the brain’s response.

“We wanted to get at the mechanisms of how our brain computes ‘distance traveled’ from only velocity information,” Cowan said. “Neurons in our hippocampus ‘light up’ like the blue GPS dot on your phone. We hypothesized that the relationship between optic flow and ‘updating the blue dot’ could be recalibrated in VR—and we found that it can.”
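
As a concrete illustration of that path-integration idea, here is a minimal Python sketch in which a position estimate is built by integrating speed inferred from optic flow, scaled by an internal gain that can be recalibrated. The function name, gain parameter, and numbers are illustrative assumptions, not the study’s actual model.

```python
# Minimal, illustrative sketch of path integration from optic flow.
# The "gain" stands in for the brain's internal mapping from optic-flow
# speed to speed through the cognitive map; the study shows this mapping
# can be recalibrated even without landmarks.

def path_integrate(optic_flow_speeds, dt=0.1, gain=1.0):
    """Integrate optic-flow speed readings (cm/s) into a running
    position estimate, returning the estimate at each time step."""
    position = 0.0
    trajectory = []
    for speed in optic_flow_speeds:
        position += gain * speed * dt  # perceived velocity times time step
        trajectory.append(position)
    return trajectory

# Example: a constant true speed of 10 cm/s for 5 seconds.
estimates = path_integrate([10.0] * 50, dt=0.1, gain=1.0)
print(f"estimated distance traveled: {estimates[-1]:.1f} cm")  # ~50 cm
```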

Knierim explained that the researchers aimed to determine whether they could reliably control a laboratory rat’s sense of location on its cognitive map by artificially changing the amount of optic flow it received in a virtual reality system.

“We found that, using control theory principles, we could control the cognitive map precisely using optic flow cues alone, thus demonstrating that this long-hypothesized input really was used by the rat’s path integration system,” Knierim said.
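
One way to picture that control-theory framing is a simple feedback loop in which the animal’s internal position estimate is the controlled variable and the experimenter-chosen optic-flow speed is the control input. The sketch below uses hypothetical names, a plain proportional controller, and made-up numbers; it is not the study’s actual controller.

```python
# Hedged sketch: proportional feedback steering a path-integrated position
# estimate toward a target purely by choosing the optic-flow speed shown
# to the animal (all values illustrative).

dt = 0.1             # seconds per update
estimate = 0.0       # internal "blue dot" position estimate, in cm
target = 100.0       # where the experimenter wants the estimate to end up
k_p = 0.5            # proportional feedback strength

for _ in range(200):                     # 20 seconds of closed-loop control
    error = target - estimate
    commanded_flow = k_p * error         # optic-flow speed presented, cm/s
    estimate += commanded_flow * dt      # path integration of perceived speed
print(f"estimate after 20 s: {estimate:.1f} cm (target {target:.0f} cm)")
```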

The team constructed a virtual reality dome and projected illuminated stripes onto its walls. Rats were enticed by drops of chocolate milk to walk around the dome, and the stripes served as a subconscious cue to the rodents’ speed and general location in space. When the team set the stripes rotating opposite to the rats’ own motion each time they took a step, the animals’ hippocampal responses indicated that they thought they were moving twice as fast, and their sense of location was skewed. When the stripes were later turned off, the researchers found that the rats still perceived themselves as moving faster than they actually were.
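
The lingering miscalibration can be pictured as an internal gain that slowly adapts toward the optic flow the animal experiences per step and then stays put once the stripes disappear. The update rule, rate, and numbers below are illustrative assumptions, not the study’s fitted model.

```python
# Hypothetical sketch of the recalibration after-effect described above.
# While the stripes counter-rotate, each step yields roughly twice the
# expected optic flow, so the internal gain drifts toward 2; when the
# stripes are switched off, the adapted gain persists.

def recalibrate(gain, observed_ratio, rate=0.05):
    """Nudge the internal gain toward the observed flow-per-step ratio."""
    return gain + rate * (observed_ratio - gain)

gain = 1.0
for _ in range(200):                      # stripes on: flow appears doubled
    gain = recalibrate(gain, observed_ratio=2.0)
print(f"gain after stripe manipulation: {gain:.2f}")      # close to 2.0

true_speed = 10.0                         # cm/s, stripes now turned off
perceived_speed = gain * true_speed
print(f"perceived speed without landmarks: {perceived_speed:.1f} cm/s")
```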

Cowan said that it is already known that mammals’ brains use landmarks’ positions relative to each other to determine location and calibrate approximate speed. What was not known was if a mammal’s brain would recalibrate its speed through its mental map in the absence of any landmarks.

“How your brain does that recalibration in the absence of landmarks, and the fact that it does that at all, was not previously known, and we show that in this research,” he said.

The study results provide valuable insight into two key areas. First, they shed light on the functioning of the mammalian hippocampus, a brain region involved in Alzheimer’s disease and other dementias; second, they answer a long-standing question about the basic biology of how animals navigate the world.

“Because the navigation system is so intimately tied to the brain’s memory system, we hope that understanding how it creates these cognitive maps will provide insight into how memory becomes weaker during aging and during dementia,” Knierim said.

The results also have implications for robotics: Cowan noted that the finding could inform the development of AI and machine learning algorithms designed to integrate visual information with representations of space, ultimately paving the way for embodied cognitive systems.

Study co-lead authors were Manu Madhav, now at the University of British Columbia, and Ravi Jayakumar, a postdoc in the Knierim and Cowan laboratories.
