Deciphering the Brain's Autofocus Mechanism
7 October 2011 3:58 pm
It's something we all take for granted: our ability to look at an object, near or far, and bring it into focus. The eyes of humans and many animals do this almost instantaneously and with stunning accuracy. Now researchers say they are one step closer to understanding how the brain accomplishes this feat.
Wilson Geisler and Johannes Burge, psychologists at the Center for Perceptual Systems at the University of Texas, Austin, have developed a simple algorithm for quickly and accurately estimating the focus error from a single blurry image, something they say is key to understanding how biological visual systems avoid the repetitive guess-and-check method employed by digital cameras. The discovery may advance our understanding of how nearsightedness develops in humans or help engineers improve digital cameras, the researchers say.
To see an object clearly, the visual system needs an accurate estimate of blur. Humans and animals instinctively extract key features from a blurry image, use that information to determine their distance from an object, then instantly focus the eye to the precise desired focal length, Geisler explains. "In some animals, that's the primary way they sense distance," he says. For example, the chameleon relies on this method to pinpoint the location of a flying insect and snap its tongue to that exact spot. Altering the amount of blur by placing a lens in front of its eye causes the chameleon to misjudge the distance in a predictable way.
But scientists didn't know how biological visual systems estimate blur so well. Many researchers had thought the brain used a system of guessing and checking to get to the answer, much like the way a camera's auto-focus system works. Basically, the camera changes the focal distance, measures the contrast in the image it sees, and repeats the process until it has maximized the contrast, Burge says.
"This search procedure is slow, often begins its search in the wrong direction, and relies on the assumption that maximum contrast equals best focus—which is not strictly true," Burge says.
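That guess-and-check loop can be sketched in a few lines. Everything here is a stand-in, not the researchers' setup: the scene is random noise, defocus is modeled as repeated box filtering, and contrast is measured as the variance of a discrete Laplacian.

```python
import numpy as np

def box_blur(img, n):
    # n passes of a 5-point box filter, a crude stand-in for defocus blur
    out = img.astype(float)
    for _ in range(n):
        out = (np.roll(out, 1, 0) + np.roll(out, -1, 0) +
               np.roll(out, 1, 1) + np.roll(out, -1, 1) + out) / 5.0
    return out

def contrast(img):
    # Sharpness proxy: variance of a discrete Laplacian response
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
           np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)
    return lap.var()

def autofocus(scene, true_focus, positions):
    # Guess-and-check: try each focal position, form the (defocused)
    # image, and keep the position whose image has the highest contrast.
    best_pos, best_c = None, -1.0
    for pos in positions:
        c = contrast(box_blur(scene, abs(pos - true_focus)))
        if c > best_c:
            best_pos, best_c = pos, c
    return best_pos

rng = np.random.default_rng(0)
scene = rng.standard_normal((64, 64))   # stand-in for a natural scene
found = autofocus(scene, true_focus=5, positions=range(11))
```

The search does land on the true focal position, but only after imaging the scene at every candidate position, which is exactly the slow, repetitive procedure Burge describes.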
In an attempt to resolve the question of how humans and animals might use blur to accurately estimate distance, Geisler and Burge used well-known mathematical equations to create a computer simulation of the human visual system. They presented the computer with digital images of natural scenes similar to what a person might see, such as faces, flowers, or scenery, and observed that although the content of these images varied widely, many features of the images—patterns of sharpness and blurriness and relative amounts of detail—remained the same.
The duo then attempted to mimic how the human visual system might process these images by adding a set of filters to their model designed to detect these features. When they blurred the images by systematically changing the focus error in the computer simulation and tested the response of the filters, the researchers found that they could predict the exact amount of focus error from the pattern of responses across the feature detectors. The researchers say this provides a potential explanation for how the brains of humans and animals can quickly and accurately determine focus error without guessing and checking. Their research appears online this week in the Proceedings of the National Academy of Sciences.
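As an illustration only (the authors derived their filters from natural-image statistics; the band-pass "detectors," blur model, and random-texture images below are simplified stand-ins), the idea of reading focus error directly off a pattern of filter responses can be sketched like this:

```python
import numpy as np

rng = np.random.default_rng(1)

def box_blur(img, n):
    # n passes of a 5-point box filter as a stand-in for defocus blur
    out = img.astype(float)
    for _ in range(n):
        out = (np.roll(out, 1, 0) + np.roll(out, -1, 0) +
               np.roll(out, 1, 1) + np.roll(out, -1, 1) + out) / 5.0
    return out

def band_energies(img, n_bands=4):
    # Crude band-pass "feature detectors": energy in successive
    # difference-of-blur bands, from high to low spatial frequency.
    levels = [box_blur(img, n) for n in range(n_bands + 1)]
    e = np.array([np.mean((levels[i] - levels[i + 1]) ** 2)
                  for i in range(n_bands)])
    return e / e.sum()   # normalized: the *pattern* codes the blur

# Learn the average response pattern at each focus error from many
# training images (random textures stand in for natural scenes).
errors = range(6)
templates = {err: np.mean([band_energies(box_blur(
                 rng.standard_normal((32, 32)), err))
                 for _ in range(20)], axis=0)
             for err in errors}

def estimate_error(img):
    # One-shot estimate: match the observed response pattern
    # to the stored templates -- no iterative search required.
    obs = band_energies(img)
    return min(errors, key=lambda e: np.sum((templates[e] - obs) ** 2))
```

A single blurry image then yields an estimate directly: its normalized band-energy signature is compared against the stored signature for each candidate focus error, so no refocus-and-remeasure loop is needed.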
"They've provided proof that there is enough information in a static image to determine if an object is too close or too far away," says Larry Thibos, a professor of optometry and vision researcher at Indiana University, Bloomington. "We've known for 50 or 60 years that people are very good at knowing whether or not something is in focus. It's taken this paper to show us how the visual system might accomplish this feat."
The researchers also added common visual imperfections to their simulations and found that when it comes to judging focus, flaws are actually a good thing.
"What we discovered is that the imperfections in the eye—things like astigmatism and chromatic aberration—actually help it to focus," Geisler explains. That may help explain why people who have had their astigmatism corrected through laser eye surgery often have trouble focusing for several weeks afterward, Geisler says.
That sort of understanding may have an impact on medical decisions, Thibos says. "People might be tempted to try and perfect nature," he says, "when maybe it's better to be a little bit imperfect."