Moon Mystery Solved?

When the moon hits your eye like a big pizza pie, that's an illusion ... and a point of contention for scientists. For decades, neuropsychologists have been arguing about what makes the moon look so much bigger when it's near the horizon. In the current issue of the Proceedings of the National Academy of Sciences, a father-son team claims to have torpedoed one of the two major theories proposed to explain the illusion.

Viewed through a telescope--or a towel roll--the moon appears the same size whether it's at the horizon or high in the sky. Two theories attempt to explain why things look so different to the naked eye. Although both ideas link the moon's apparent size to the terrain surrounding the observer, they are irreconcilable.

The first theory holds that the brain errs in gauging the size of the moon at its zenith because there are no objects or terrain in the central field of view to compare it with. This shrinking, a phenomenon called "micropsia," leads an observer to conclude that the small-looking moon at zenith is farther away than the large moon at the horizon. The other theory reverses cause and effect. Looking toward the horizon, it claims, the observer uses distance cues from the terrain, such as the height of trees or the texture of the Earth's surface, to conclude that the moon there is farther away than at zenith. As a result, the brain decides that the "distant" moon at the horizon must be a lot bigger than the "nearby" moon high in the sky.

Reasoning that perceived distances were the key to proving or disproving either theory, Lloyd Kaufman, an emeritus professor of psychology at New York University, and his son James, a physicist at IBM in San Jose, designed a way to measure these distances. They fashioned a computerized display to give five subjects a stereoscopic image of two artificial "moons" that were shown either both close to the horizon or both high in the sky. One of the moons was displayed in a slightly different position when seen by the left versus the right eye. Normally, our brain uses this effect, called parallax, to determine how close objects are. The other moon's image was the same in both eyes, as if it were very far away like the real moon. The subjects were then asked to alter the parallax of the variable moon, changing its apparent distance so that it sat halfway between the viewer and the fixed moon.
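The geometry behind that manipulation is straightforward. As a rough sketch--assuming a typical interpupillary distance of about 6.3 centimeters, a value chosen for illustration rather than taken from the study--the trade-off between the convergence angle a stereoscopic display imposes and the distance at which a virtual object appears to sit looks like this:

```python
import math

IPD_CM = 6.3  # assumed interpupillary distance; not a value reported in the study

def distance_from_vergence(vergence_deg: float) -> float:
    """Distance (cm) at which the two eyes' lines of sight meet for a given
    convergence angle, from simple triangle geometry."""
    half_angle = math.radians(vergence_deg) / 2.0
    return (IPD_CM / 2.0) / math.tan(half_angle)

def vergence_for_distance(distance_cm: float) -> float:
    """Inverse: the convergence angle (degrees) a display must impose between
    the left- and right-eye images to simulate the given viewing distance."""
    return math.degrees(2.0 * math.atan((IPD_CM / 2.0) / distance_cm))

# A moon drawn identically for both eyes (zero parallax) behaves as if at
# optical infinity, like the real moon; offsetting the two images pulls the
# virtual moon closer. For example, simulating a moon 10 meters away takes a
# convergence angle of only about a third of a degree:
print(round(vergence_for_distance(1000.0), 2))  # ~0.36 degrees
```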

When both moons were projected close to the horizon, the participants judged the fixed moon to be four times farther away than when both were high in the sky--a finding that contradicts the micropsia theory and supports the "terrain theory," says the elder Kaufman. "My feeling is that it's a decisive experiment."

But Don McCready, emeritus professor of psychology at the University of Wisconsin, Whitewater, is unconvinced. The terrain theory has one weakness that the study doesn't address, he says: Most people don't believe the moon is farther away when it's at the horizon. "Most say larger and closer, while some say larger and the same distance."
