Deciphering the Brain's Autofocus Mechanism
7 October 2011 3:58 pm
It's something we all take for granted: our ability to look at an object, near or far, and bring it into focus. The eyes of humans and many animals do this almost instantaneously and with stunning accuracy. Now researchers say they are one step closer to understanding how the brain accomplishes this feat.
Wilson Geisler and Johannes Burge, psychologists at the Center for Perceptual Systems at the University of Texas at Austin, have developed a simple algorithm for quickly and accurately estimating focus error from a single blurry image, something they say is key to understanding how biological visual systems avoid the repetitive guess-and-check method employed by digital cameras. The discovery may advance our understanding of how nearsightedness develops in humans or help engineers improve digital cameras, the researchers say.
To see an object clearly, the visual system needs an accurate estimate of blur. Humans and animals instinctively extract key features from a blurry image, use that information to determine their distance from an object, then instantly focus the eye to the precise desired focal length, Geisler explains. "In some animals, that's the primary way they sense distance," he says. For example, the chameleon relies on this method to pinpoint the location of a flying insect and snap its tongue to that exact spot. Altering the amount of blur by placing a lens in front of its eye causes the chameleon to misjudge the distance in a predictable way.
But scientists didn't know how biological visual systems estimate blur so well. Many researchers had thought the brain used a system of guessing and checking to get to the answer, much like the way a camera's auto-focus system works. Basically, the camera changes the focal distance, measures the contrast in the image it sees, and repeats the process until it has maximized the contrast, Burge says.
"This search procedure is slow, often begins its search in the wrong direction, and relies on the assumption that maximum contrast equals best focus—which is not strictly true," Burge says.
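The guess-and-check loop Burge describes can be sketched as a toy simulation. This is a minimal illustration of contrast-detection autofocus, not any camera's actual firmware; the variance-based contrast metric, the box-filter defocus model, and the focal-step values are all assumptions chosen for simplicity:

```python
import numpy as np

def contrast(image):
    """Contrast metric: variance of pixel intensities (a common proxy)."""
    return float(np.var(image))

def defocus(scene, focal_step, true_focus=7):
    """Toy optics: blur grows with distance from the in-focus step."""
    blur = abs(focal_step - true_focus)
    img = scene.astype(float)
    kernel = np.ones(3) / 3.0  # small box filter as a crude defocus model
    for _ in range(blur):
        # Separable blur: filter rows, then columns.
        img = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, img)
        img = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, img)
    return img

def autofocus(scene, steps=range(15)):
    """Guess-and-check: try each focal step, keep the one with maximum contrast."""
    return max(steps, key=lambda s: contrast(defocus(scene, s)))
```

Note that the loop has to evaluate many candidate focal steps to find the contrast peak, which is exactly the slowness Burge criticizes; a one-shot estimator avoids the sweep entirely.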
In an attempt to resolve the question of how humans and animals might use blur to accurately estimate distance, Geisler and Burge used well-known mathematical equations to create a computer simulation of the human visual system. They presented the computer with digital images of natural scenes similar to what a person might see, such as faces, flowers, or scenery, and observed that although the content of these images varied widely, many features of the images—patterns of sharpness and blurriness and relative amounts of detail—remained the same.
The duo then attempted to mimic how the human visual system might be processing these images by adding a set of filters to their model designed to detect these features. When they blurred the images by systematically changing the focus error in the computer simulation and tested the response of the filters, the researchers found that they could predict the exact amount of focus error by the pattern of response they observed in the feature detectors. The researchers say this provides a potential explanation for how the brains of humans and animals can quickly and accurately determine focus error without guessing and checking. Their research appears online this week in the Proceedings of the National Academy of Sciences.
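The idea behind the one-shot approach can be caricatured in a few lines: blur training scenes by known amounts, record how each blur level shapes the responses of a small bank of spatial-frequency filters, then read the focus error of a new image directly off its response pattern. The Gaussian defocus model, the frequency-band choices, and the nearest-signature matching below are illustrative assumptions, not Geisler and Burge's published model:

```python
import numpy as np

def blur(img, sigma):
    """Gaussian defocus model, applied in the frequency domain."""
    fy = np.fft.fftfreq(img.shape[0])[:, None]
    fx = np.fft.fftfreq(img.shape[1])[None, :]
    H = np.exp(-2 * (np.pi * sigma) ** 2 * (fx ** 2 + fy ** 2))
    return np.real(np.fft.ifft2(np.fft.fft2(img) * H))

def band_energies(img, bands=((0.05, 0.1), (0.1, 0.2), (0.2, 0.4))):
    """Feature vector: relative power in a few spatial-frequency bands
    (a stand-in for the responses of blur-sensitive filters)."""
    fy = np.fft.fftfreq(img.shape[0])[:, None]
    fx = np.fft.fftfreq(img.shape[1])[None, :]
    r = np.sqrt(fx ** 2 + fy ** 2)
    power = np.abs(np.fft.fft2(img)) ** 2
    feats = np.array([power[(r >= lo) & (r < hi)].sum() for lo, hi in bands])
    return feats / feats.sum()

def estimate_blur(img, training_scenes, levels=np.linspace(0, 4, 9)):
    """Learn the average response signature for each known blur level,
    then return the level whose signature best matches the input image."""
    signatures = [
        np.mean([band_energies(blur(t, s)) for t in training_scenes], axis=0)
        for s in levels
    ]
    f = band_energies(img)
    return levels[int(np.argmin([np.linalg.norm(f - sig) for sig in signatures]))]
```

The key property this sketch shares with the paper's result: because defocus attenuates different spatial frequencies in a predictable pattern, a single image's filter-response pattern suffices to estimate the focus error, with no iterative search.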
"They've provided proof that there is enough information in a static image to determine if an object is too close or too far away," says Larry Thibos, a professor of optometry and vision researcher at Indiana University, Bloomington. "We've known for 50 or 60 years that people are very good at knowing whether or not something is in focus. It's taken this paper to show us how the visual system might accomplish this feat."
The researchers also added common visual imperfections to their simulations and found that when it comes to judging focus, flaws are actually a good thing.
"What we discovered is that the imperfections in the eye—things like astigmatism and chromatic aberration—actually help it to focus," Geisler explains. That may help explain why people who have had their astigmatism corrected through laser eye surgery often have trouble focusing for several weeks afterward, Geisler says.
That sort of understanding may have an impact on medical decisions, Thibos says. "People might be tempted to try and perfect nature," he says, "when maybe it's better to be a little bit imperfect."