
Fruit Flies Plug into the Matrix

A new budget-friendly virtual-reality system helps researchers study the brains of small animals

[Image: Colored scanning electron micrograph of newly hatched zebra fish.]

Bugs and fish don’t play video games or attend teleconferences, but they can still explore virtual reality—complete with visual effects, tastes and smells. A new system called PiVR—named after the low-cost Raspberry Pi computer that runs its software—creates working artificial environments for small animals such as zebra fish larvae and fruit flies. Developers say the system’s affordability could help expand research into animal behavior.

PiVR’s purpose is not to get these creatures plugged into the Matrix. Rather it lets scientists measure an animal’s behavior in real time while it responds to a controlled environment. The technology both provides the environment and tracks the animal within it using cameras and other sensors. This approach is useful in experiments aiming to learn more about how an external stimulus spurs the brain to perform an action. “What the tracker allows us to do is know what the animal is currently doing and then adapt the type of stimulation,” says Matthieu Louis, a biologist at the University of California, Santa Barbara, and a co-author of the study.

As a result, “the subject now has the ability to make choices. Its actions lead to outcomes,” explains Alexandra Moore, a graduate student in neurobiology at Harvard Medical School, who was not involved in the new research. “And that kind of experimental situation is ... essential for starting to understand how brains accomplish more sophisticated kinds of cognition.”
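
In practice, that closed loop boils down to a simple cycle: capture a camera frame, locate the animal, and update the light before the next frame arrives. A minimal Python sketch of the idea might look like the following (the tracker and light objects and their methods are illustrative stand-ins, not PiVR’s actual code):

import time

FRAME_RATE = 30  # update the stimulus at roughly video rate

def run_closed_loop(tracker, light, stimulus_rule, duration_s=600):
    """Repeatedly locate the animal and adapt the stimulus (hypothetical objects)."""
    start = time.time()
    while time.time() - start < duration_s:
        x, y = tracker.locate_animal()      # where is the animal right now?
        intensity = stimulus_rule(x, y)     # decide how intense the light should be
        light.set_intensity(intensity)      # adjust the LED before the next frame
        time.sleep(1.0 / FRAME_RATE)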

With PiVR, the stimulus takes the form of light, which brightens or dims depending on where the animal goes—as if it were moving toward or away from a virtual light source or in and out of virtual shadows. Say researchers want to see how a zebra fish larva behaves in the presence of a round spotlight that is brightest at its center. They can place the subject in PiVR’s chamber, which automatically turns up the brightness as the animal moves toward the area designated as the center of the “spotlight” and dims it as the animal moves away. As the larva reacts to these changes, the chamber tracks its every move with cameras and other sensors. Doing so lets the researchers study how animals use visual stimuli to navigate. The system was described in an open-access paper published in PLOS Biology this past summer.
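
For the virtual spotlight, the rule that converts the larva’s position into a brightness can be as simple as a smooth falloff with distance from the imaginary center. Here is a minimal Python sketch, assuming a Gaussian-shaped spotlight (the profile and numbers are illustrative; PiVR’s actual settings may differ), which could serve as the stimulus_rule in the loop above:

import math

# Illustrative spotlight parameters: the virtual center and its spread, in pixels
CENTER_X, CENTER_Y = 320, 240
SPOTLIGHT_WIDTH = 100.0

def spotlight_brightness(x, y, max_intensity=1.0):
    """Brightness between 0 and max_intensity, peaking at the virtual center."""
    distance = math.hypot(x - CENTER_X, y - CENTER_Y)
    return max_intensity * math.exp(-(distance ** 2) / (2 * SPOTLIGHT_WIDTH ** 2))

# The closer the larva gets to (CENTER_X, CENTER_Y), the brighter the chamber becomes.
print(round(spotlight_brightness(320, 240), 2))  # 1.0 at the virtual center
print(round(spotlight_brightness(520, 240), 2))  # much dimmer 200 pixels away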




Light alone can only create simple environments. But by combining PiVR with a field called optogenetics, the researchers produced a much more complex virtual world. Scientists can use optogenetics to hack an animal’s brain to make it interpret light as a different type of sensory input. To do so, they manipulate the creature’s genes to put light-sensitive proteins in its neurons so that those cells will fire when exposed to a certain wavelength. If these modified neurons control a fruit fly’s sense of smell or taste, for instance, switching on the right kind of light can trick the insect into thinking it is sensing something bitter or sweet. In the example of a VR system that creates an imaginary spotlight, this technique would be like putting the animal in the presence of a smell that grows more intense as it moves toward the brightest part of the circle. “You can create virtual realities for the olfactory system or for the gustatory system in adult fruit flies,” explains David Tadres, a Ph.D. student in Louis’s lab and first author of the PiVR paper. “So you can then study ‘How do animals navigate in an olfactory or a gustatory environment?’”
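
In such an optogenetic virtual reality, the same closed loop applies, except the light intensity now stands in for the concentration of a virtual odor: the brighter the light, the more strongly the engineered sensory neurons fire. One way to picture it (again a hypothetical sketch, not PiVR’s actual code) is a precomputed odor map that gets sampled at the fly’s position and translated into an LED drive level:

import numpy as np

# Hypothetical virtual odor landscape over a 640 x 480 arena, most concentrated
# near an imaginary source at (SOURCE_X, SOURCE_Y).
WIDTH, HEIGHT = 640, 480
SOURCE_X, SOURCE_Y, PLUME_SPREAD = 500, 120, 80.0
xs, ys = np.meshgrid(np.arange(WIDTH), np.arange(HEIGHT))
odor_map = np.exp(-((xs - SOURCE_X) ** 2 + (ys - SOURCE_Y) ** 2) / (2 * PLUME_SPREAD ** 2))

def odor_to_light(x, y):
    """Translate the virtual odor concentration at (x, y) into an LED level from 0 to 1."""
    return float(odor_map[int(y), int(x)])

# A fly walking toward the virtual source sees the light ramp up, which its
# rewired olfactory neurons report to the brain as a strengthening smell.
print(round(odor_to_light(500, 120), 2))  # 1.0 at the virtual odor source
print(round(odor_to_light(100, 400), 2))  # essentially zero far from the source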

The U.C. Santa Barbara team is not the only group to develop virtual reality for small animals. Researchers—such as Iain Couzin, director of the Max Planck Institute of Animal Behavior’s Department of Collective Behavior at the University of Konstanz in Germany—have set up experiments that, for example, enable real predatory fish to chase virtual prey. Couzin, who was not involved in the PiVR study, explains that “other virtual-reality approaches, which are highly complementary to the methodology here, have been used—including in my research group—to embed organisms, including flies, into fully immersive and photorealistic virtual environments where they can move and interact with 3-D environments.”

Creating such complex virtual environments can get expensive. Louis’s team previously developed a system that would cost about $50,000 to replicate. But the parts for a PiVR device can be purchased and 3-D printed for less than $500. “The achievement with PiVR is to make it such that it would be affordable—that we would not use cameras and lenses and a setup that would be very expensive,” Louis says. In addition to the cheap parts, “we wanted [the software] to fit on a mini computer that would be relatively cheap,” he says. “And that’s what the Raspberry Pi allowed us to do.”

The low cost could make it easier for a single lab to afford building and running multiple PiVR systems. “You just want to run a lot of experiments at the same time,” Tadres says, “because that's how you get much more data.” The affordable equipment (and the fact that PiVR is described in an open-access paper) also helps make the tool accessible to more researchers. Tadres suggests undergraduate and high school students could use it as well.

Other researchers agree. “The most exciting part for me is that I think [PiVR] has the ability to bring these kinds of concepts that are really, truly at the very forefront of neuroscience research today into classrooms,” Moore says. “Just the flexibility and the affordability of the system—it’s open source, it’s written in a very easy programming language—can help students understand these advanced concepts like ‘How does spatial navigation work?’ ‘How do sensory signals that we receive guide our actions?’ ‘How do we make decisions?’”

“It is incredibly important to provide cost-effective, powerful tools for scientific inquiry,” Couzin says. He sees low-price systems such as PiVR as a complement to his own lab’s work. “In this way, we can much better [democratize] the scientific process by making cutting-edge science available to a much broader community,” Couzin says.

Sophie Bushwick is tech editor at Scientific American. She runs the daily technology news coverage for the website, writes about everything from artificial intelligence to jumping robots for both digital and print publication, records YouTube and TikTok videos and hosts the podcast Tech, Quickly. Bushwick also makes frequent appearances on radio shows such as Science Friday and television networks, including CBS, MSNBC and National Geographic. She has more than a decade of experience as a science journalist based in New York City and previously worked at outlets such as Popular Science, Discover and Gizmodo. Follow Bushwick on X (formerly Twitter) @sophiebushwick.
