You open the overstuffed kitchen cabinet and a drinking glass tumbles out. With a ninja-like reflex, you snatch it before it shatters on the floor, as if the movement of the object were being tracked before the information even reached your brain. According to one idea of how the circuitry of the eye processes visual data, that is literally what happens. Now, a deep anatomical study of a mouse retina—carried out by 120,000 members of the public—is bringing scientists a step closer to confirming the hypothesis.
Researchers have known for decades that the eye does much more than just detect light. The dense patch of neurons in the retina also processes basic features of a scene before sending the information to the brain. For example, in 1964, scientists showed that some neurons in the retina fire only in response to motion. What's more, these “space-time” detectors have so-called direction selectivity, each one sensitive to movement in a particular direction. But exactly how that processing happens in the retina has remained a mystery.
The stumbling block is a lack of fine-grained anatomical detail about how the neurons in the retina are wired up to each other. Although researchers have imaged the retina microscopically in ultrathin sections, no computer algorithm has been able to accurately trace out the borders of all the neurons to map the circuitry. At this point, only humans have good enough spatial reasoning to figure out what is part of a branching cell and what is just background noise in the images.
Enter the EyeWire project, an online game that recruits volunteers to map out those cellular contours within a mouse’s retina. The game was created and launched in December 2012 by a team led by H. Sebastian Seung, a neuroscientist at the Massachusetts Institute of Technology in Cambridge. Players navigate their way through the retina one 4.5-micrometer tissue block at a time, coloring the branches of neurons along the way. Most of the mapping happens in massive online competitions between players vying to map out the most volume. By last week, the 120,000 EyeWire players had completed 2.3 million blocks. That may sound like a lot, but it is less than 2% of the retina.
The sample is already enough to reveal new features, however. The EyeWire map shows two types of retinal cells with unprecedented resolution. The first, called starburst amacrine cells (SACs), have branches spread out in a flat, plate-shaped array perpendicular to the incoming light. The second, called bipolar cells (BPs), are smaller and bushy. The BPs come in two varieties, one of which reacts to light more slowly than the other—a time delay of about 50 milliseconds. Both SACs and BPs are known to be involved in direction sensitivity, but exactly how they sense direction has remained unclear.
Seung says the EyeWire map of how SACs and BPs are wired together holds the answer: a time-delay circuit. Because of the arrangement of BPs, the movement of an object across the surface of a SAC should make it fire only in reaction to movement in one direction. The key insight is that BPs are not connected to the SAC branches at random, as was previously thought. Instead, the faster variety of BP clusters far out on the edges of the SAC, while the slower-firing variety clusters close to the SAC center. Only if the light from an object moves from the center of the SAC outward does the delayed signal from the innermost BPs sync up with the signal from the faster outer BPs, and that combined signal is required to activate the SAC. If instead the movement is in the opposite direction, those signals are out of sync and the SAC does not fire. Though it has yet to be confirmed experimentally, this mechanism could account for how the neurons in the retina detect the direction of a moving object long before the information reaches the brain, the team reported online yesterday in Nature.
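The logic of the proposed circuit can be sketched in a few lines of code. This is a toy model, not the authors' analysis: the 50-millisecond delay comes from the article, but the travel time and the coincidence window are illustrative assumptions.

```python
# Toy model of the proposed time-delay circuit: a slow (delayed) bipolar
# cell near the SAC's center and a fast one at its edge. The SAC acts as a
# coincidence detector, firing only when both signals arrive together.

BP_DELAY = 50   # ms; reported lag of the slow bipolar-cell variety
TRAVEL = 50     # ms for a stimulus to sweep from the inner BP to the outer
                # BP (assumed, chosen to match the delay)
WINDOW = 10     # ms coincidence window at the SAC (assumed)

def sac_fires(direction):
    """Return True if the SAC fires for motion in the given direction."""
    if direction == "outward":
        # Stimulus hits the slow inner BP first; its delayed signal
        # arrives just as the fast outer BP (hit 50 ms later) responds.
        inner_arrival = 0 + BP_DELAY
        outer_arrival = TRAVEL
    else:  # inward
        # Stimulus hits the fast outer BP first; the inner BP's signal
        # is hit later AND delayed, so the two never coincide.
        outer_arrival = 0
        inner_arrival = TRAVEL + BP_DELAY
    return abs(inner_arrival - outer_arrival) < WINDOW

print(sac_fires("outward"))  # True: signals coincide, SAC fires
print(sac_fires("inward"))   # False: signals arrive 100 ms apart
```

With these numbers, outward motion produces simultaneous arrivals (50 ms and 50 ms), while inward motion spreads them 100 ms apart, reproducing the one-directional response described above.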
The study is “truly amazing,” says Alexander Borst, a neuroscientist at the Max Planck Institute of Neurobiology in Martinsried, Germany. Last year, Borst led the effort to map out a similar arrangement of neurons in the eye of a fruit fly. “This mechanism seems to be almost identical with the one proposed for direction selectivity in the insect visual system,” he says. If true, then some of the built-in functionality of the eye was likely invented more than 500 million years ago, when insects and vertebrates shared a common ancestor.