Robotic Rat to the Rescue
24 January 2012 7:02 pm
What do you get when you combine a monkey's brain with the whiskers of a rat? A robotic rodent that can sense its environment almost as well as the real thing. The new rat-bot could lead to the development of robots that can feel their way through earthquake rubble and could provide clues to how live rats analyze sensory information from their whiskers.
Although recent research has helped scientists understand what information whiskers send to the brains of rodents, deciphering how rats and mice interpret that sensory information has been trickier. Previous models assumed that rodents observed whisker movement patterns and vibrations over a set duration and that their brains then made a decision, based on the data as a whole, about the most likely surface the whiskers were touching. If the overall data best matched the known pattern for a hard vinyl floor, for example, the rat would conclude that was the surface it was on. But multiple studies have found that robots built on this model of reasoning were only 50% to 80% accurate at guessing the floor beneath them after 0.4 seconds of exposure.
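The fixed-window approach can be sketched roughly as follows. This is a minimal illustration, not the actual models used in those studies: the function names, the use of a simple mean feature, and the nearest-template rule are all assumptions made for the example.

```python
# Illustrative sketch of a fixed-window classifier: collect whisker data
# for a set duration, summarize it with a single feature (here, the mean),
# then pick the stored surface template that best matches that summary.

def classify_fixed_window(window, templates):
    """Pick the surface whose template value is closest to the window mean.

    window    -- list of sensor readings covering the whole fixed duration
    templates -- dict mapping surface name to its expected mean reading
    """
    mean = sum(window) / len(window)
    return min(templates, key=lambda surface: abs(templates[surface] - mean))
```

Because the decision waits for the whole window and compresses it into one comparison, noisy readings early or late in the window can easily tip the match toward the wrong surface.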
Computational neuroscientist Nathan Lepora of the University of Sheffield in the United Kingdom and his team thought that a model of information processing recently discovered in monkeys might help the robots make better judgments about floor type. The primates don't use a single piece of evidence to decide what they're seeing; rather, their brains rely on an accumulation of data. When monkeys watch screens of randomly moving dots, for example, different neurons sense each direction of movement: up, down, left, and right. As dots on the screen flit about, more neurons of each type begin to fire, building up a total activity level for each group of neurons. Once, say, the "up" neurons reach a specific activity threshold, they pass on the message that the dots are moving in that direction.
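The accumulation-to-threshold idea can be sketched in a few lines of Python. This is a toy illustration of the general principle, not the team's actual model: the evidence function interface, the threshold value, and all names here are assumptions made for the example.

```python
# Illustrative sketch of evidence accumulation: each incoming reading adds
# a small amount of evidence for every candidate class, and a decision is
# made as soon as one class's running total crosses a threshold -- rather
# than after a fixed amount of data, as in the older fixed-window models.

def accumulate_until_threshold(readings, evidence, classes, threshold):
    """Return (decision, samples_used) once one class's total evidence
    reaches `threshold`; (None, total samples) if the stream runs out.

    evidence(r, c) -- how strongly reading r supports class c
    """
    totals = {c: 0.0 for c in classes}
    for n, r in enumerate(readings, start=1):
        for c in classes:
            totals[c] += evidence(r, c)
        best = max(totals, key=totals.get)
        if totals[best] >= threshold:
            return best, n
    return None, len(readings)
```

The key property is that clear-cut input crosses the threshold quickly, while ambiguous input keeps the accumulator running and automatically buys more time, which is consistent with the robot needing only 0.2 seconds of whisker data on the surfaces it could distinguish easily.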
To test whether this accumulation-of-evidence model might help robo-rodents better interpret sensory information from their whiskers, Lepora's team attached whiskers to a Roomba, a small, round robotic vacuum that can move independently around rooms. As the Roomba traveled, the attached whiskers brushed against the floor. The researchers collected data on the whiskers' movements and vibrations and analyzed them using different statistical models. The team wanted to find out how well a robot could determine the floor type: rough carpet, smooth carpet, concrete, or vinyl.
When it relied on the model of processing co-opted from monkeys' visual systems, the robot was nearly perfect at detecting floor type after collecting only 0.2 seconds of whisker information, the team reports online today in the Journal of the Royal Society Interface. That was a dramatic improvement over other methods.
"For a robot to work well, you need it to perceive the world accurately," Lepora explains. "Rats are very good at using whiskers to perceive the world in dark, enclosed environments where maybe vision wouldn't be good. So for a robot in this kind of environment—going through rubble, for example—having similarly functioning whiskers would be a big advantage." A bewhiskered robot might be able to navigate underground or under water, as well as through the rubble left behind by an accident or a natural disaster.
Beyond the practical uses of such a robot, the new findings offer hints into the biology of real rodents, says neuroscientist Andrew Philippides of the University of Sussex in the United Kingdom. By comparing the conclusions on floor type reached by real rats trained to distinguish surfaces with those reached by the robot, scientists could tell whether the animals and the machine use the same reasoning process. "You can present the rats and the robot with slightly altered floors that are tricky to figure out," Philippides says. "If they make similar mistakes, then they're most likely using similar models to process the sensory data."