(Image credit: Vasas V, et al., 2024, PLOS Biology, CC-BY 4.0)
Scientists have combined a new camera system with open-source software to generate stunning video clips of the world as different animals see it — including the specific colors they perceive.
From more intense reds to streaks of ultraviolet, the footage shows various settings in and around a garden environment, with some colors accentuated and others dulled depending on which animal’s vision is being emulated.
One clip shows a zebra swallowtail butterfly (Protographium marcellus) foraging on flowers as a honeybee (Apis mellifera) would see it. The scientists published 12 videos in total, showing how birds, bees, mice and dogs see the world.
To produce the videos, the researchers set up cameras to capture raw footage and then applied post-processing software to predict the colors different species perceive. The method, which they outlined in a paper published Jan. 23 in the journal PLOS Biology, is 92% accurate when tested against conventional spectrophotometry techniques.
“We’ve long been fascinated by how animals see the world,” Daniel Hanley, senior author of the study and an assistant professor of biology at George Mason University in Virginia, said in a statement. “Modern techniques in sensory ecology allow us to infer how static scenes might appear to an animal; however, animals often make crucial decisions on moving targets (e.g., detecting food items, evaluating a potential mate’s display, etc.). Here, we introduce hardware and software tools for ecologists and filmmakers that can capture and display animal-perceived colors in motion.”
Species see the world differently, in part because of the photoreceptors in their eyes and the neural architecture of their brains. The eyes of dogs, for example, are structured similarly to those of people with red-green color blindness. Insects like honeybees, meanwhile, can see ultraviolet light, the scientists said in their paper.
To better understand how animals see the world, researchers have devised various methods to accurately reproduce the colors animals see, but these techniques have been capable of generating only still images.
Spectrophotometry, for example, measures the light reflected from an object to estimate what an animal's photoreceptors detect. But this method has produced only still images so far, it cannot capture spatial information, and it is highly time-consuming, the scientists said. Multispectral photography, meanwhile, which involves taking a series of photos across several wavelength ranges, trades accuracy for more spatial information — but it works only on stationary objects.
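The spectrophotometry approach described above can be sketched numerically: a photoreceptor's response (its "quantum catch") is estimated by integrating the light an object reflects against that receptor's spectral sensitivity. The spectra below are invented toy curves for illustration only, not data from the study.

```python
import numpy as np

# Wavelength grid in nanometers, spanning ultraviolet through visible light.
wavelengths = np.arange(300, 701, 10)

# Toy reflectance of a flower petal, with a made-up UV-reflecting peak.
reflectance = 0.2 + 0.6 * np.exp(-((wavelengths - 360) ** 2) / (2 * 40.0**2))

# Toy illuminant (flat daylight, for simplicity).
illuminant = np.ones_like(wavelengths, dtype=float)

# Toy spectral sensitivity of a honeybee UV receptor, modeled as a Gaussian;
# the peak wavelength here is illustrative, not a measured value.
uv_sensitivity = np.exp(-((wavelengths - 344) ** 2) / (2 * 30.0**2))

# Quantum catch: the integral of reflectance * illuminant * sensitivity,
# approximated as a sum over the uniform 10 nm wavelength steps.
quantum_catch = float(np.sum(reflectance * illuminant * uv_sensitivity) * 10.0)
```

Repeating this integral for every receptor type and every point on an object is what makes the technique slow: each measurement yields a single color estimate, with no spatial image.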
To get around these limitations, the researchers built their new system from commercially available Sony a6400 cameras, which they configured to record four color channels — red, green, blue and ultraviolet — simultaneously.
Next, they mounted the cameras in a 3D-printed housing that holds various pieces of photography equipment, including a modular cage, mounts for a beam-splitter mirror, and cone baffles (which minimize stray light reaching the cameras).
The camera rig was the first stage of a pipeline that began with capturing raw footage and ended with rendering the finished clips. To render video in animal-perceived colors, the researchers applied the video2vision software — a set of transformation functions — to the raw footage. They then processed the data into "perceptual units," which work somewhat like photo filters, fine-tuning each one against existing knowledge of the relevant species' photoreceptors to predict what that animal might be seeing.
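In spirit, the per-pixel step of such a pipeline can be sketched as a linear transform from the four camera channels to an animal's photoreceptor responses. The matrix below is invented purely for illustration; the actual video2vision transformations are fitted from calibration data, not hard-coded weights.

```python
import numpy as np

# One toy frame of footage: height x width x 4 channels (R, G, B, UV),
# with values in [0, 1]. Real footage would come from the camera rig.
frame = np.random.default_rng(0).random((4, 6, 4))

# Hypothetical camera-to-honeybee transform. Honeybees have three receptor
# types (UV, blue, green), so each pixel maps from 4 channels to 3.
# These weights are made up to show the shape of the computation.
camera_to_bee = np.array([
    [0.05, 0.10, 0.85],  # R channel contributes mostly to the green receptor
    [0.05, 0.30, 0.60],  # G channel
    [0.10, 0.75, 0.10],  # B channel feeds mostly the blue receptor
    [0.80, 0.15, 0.05],  # UV channel feeds mostly the UV receptor
])

# Apply the transform to every pixel at once; result has shape (4, 6, 3).
bee_view = frame @ camera_to_bee
```

A renderer would then map the three receptor responses back into display colors a human can view — the "photo filter" effect the researchers describe.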
Scientists and filmmakers who study animals can use this setup to capture and process their own footage, the researchers said. In particular, watching footage with these animal-vision filters applied can tell us more about how particular species interact with their environment and respond to stimuli.