A New Study About Colour Tries to Decode ‘The Brain’s Pantone’
How do humans perceive colour? An NIH experiment finds a way to measure what happens after light hits the eye—using brain scans.
Bevil Conway, an artist and neuroscience researcher at the National Institutes of Health, is crazy about color. He particularly loves watercolors made by the company Holbein. “They have really nice purples that you can’t get in other paints,” he says. If Conway is after a specific shade—perhaps the dark, almost-brown color the company has labeled “Mars Violet” or the more merlot-tinted “Quinacridone Violet”—he might scroll through a Holbein chart that organizes the colors by similarity. Anyone who has considered painting a wall is familiar with these arrays: lines of color that transition from bright yellows into greens, blues, purples, and browns.
But if Conway decides to shop around at another paint company like Pantone, that chart, also known as a “color space,” will be organized differently. And if he chooses to consult the Commission Internationale de l’Éclairage, an organization that researches and standardizes measurements of light and color, he will find yet another unique map. Conway is baffled by the choices. “Why are there so many different color spaces?” he asks. “If this is really reflective of something fundamental about how we see and perceive, then shouldn’t there be one color space?”
How humans perceive color, and how all those shades are related, is a question scientists and philosophers have been attempting to answer for millennia. The ancient Greeks, who famously had no word for the color blue, argued over whether colors were composed of red, black, white, and light (that was Plato’s theory), or whether color was celestial light sent down from the heavens, with each color a mixture of white and black, of lightness and darkness (that was Aristotle’s). Isaac Newton’s experiments with prisms identified the components of the rainbow and led him to theorize that the three primary colors, from which all other colors are made, are red, yellow, and blue.
Today, our scientific understanding of color perception is rooted in biology. Each color represents a specific part of the electromagnetic spectrum, though humans can only see the slice of this spectrum known as “visible light.” Of the wavelengths visible to humans, red ones are longer, while blues and violets are shorter. Photons of light stimulate photoreceptors in the retina, which transform that information into electrical signals; the retina’s neural circuitry processes those signals and sends them along to the brain’s visual cortex. But the mechanics of how the eye and nervous system interact with those light waves, and how a person subjectively perceives color, are two very different things.
“One way to think about neuroscience is that it’s a study of signal transformations,” writes Soumya Chatterjee, a senior scientist at the Allen Institute for Brain Science who studies the neurology of color perception, in an email to WIRED. He says that once the photoreceptors in the retina have passed information to the visual cortex, the information continues to be transformed—and scientists don’t yet understand how that series of transformations gives rise to perception, or the experience an individual person has of color.
Some aspects of color can already be measured precisely. Scientists can calculate the wavelength of the light and the luminance, or brightness, of a color. But once you bring human perception into the mix, things get a little more complicated. People perceive color by factoring in a number of other variables, like the quality of the light or the other tones bordering the color. Sometimes that means the brain will perceive the same object as two completely different colors; that happened with the famous dress, which in some lights looked white and gold and in others looked blue and black.
And sometimes those brain calculations mean that two completely different inputs can elicit the same perception. Yellow light, for example, has its own specific wavelength that the brain understands as yellow. But mix a green and a red light—each of which has its own unique wavelength—and the brain will understand that combination to be yellow too, even though that light’s physical properties are different from those of the wavelengths we perceive as yellow. Why our brains interpret those two physically different inputs as the same color has been hard to puzzle out.
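The green-plus-red example can be sketched in additive RGB terms. This is an assumption for illustration only: real metamerism involves full spectral power distributions, not RGB triples, but the toy version captures the core idea that two physically different inputs can collapse to the same signal.

```python
# Toy illustration of metamerism using additive RGB mixing (illustrative
# only; actual metamerism concerns spectra, not RGB triples).

def mix(a, b):
    """Additively mix two lights, clipping each channel at 1.0."""
    return tuple(min(x + y, 1.0) for x, y in zip(a, b))

red = (1.0, 0.0, 0.0)
green = (0.0, 1.0, 0.0)
pure_yellow = (1.0, 1.0, 0.0)

# Two different inputs, one "percept": red + green matches pure yellow.
assert mix(red, green) == pure_yellow
```

In this simplified picture, the visual system only "sees" the three channel values, so any two lights that produce the same triple are indistinguishable, however different their physics.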
Now, Conway is suggesting a new method of organizing and understanding colors: by basing it on patterns of neuron activation in the brain. In a recent paper published in Current Biology, Conway was able to show that each color elicits a unique pattern of neural activity. In this study, he focused first on the brain’s response to a color, rather than on the color each of his study subjects verbally described. This approach reframes how neuroscientists typically try to answer questions about color perception. “Perception is usually taken as the known quantity, and then researchers tried to figure out the neuronal processes leading to that,” writes Chatterjee. “Here, the perceptual variable is taken as the unknown (this abstract color space), and they try to derive it based on the measured neuronal activity.”
Conway is certainly not the first to use technology to track the brain’s response to color. Previous studies have used fMRI data to capture what’s going on as a person looks at different colors—but those scans lag, so it’s hard to tell exactly what’s happening in the brain at the moment it’s interpreting those stimuli. And fMRI scans are an indirect way to track brain activity, since they measure blood flow, not actual neuron firing.
So Conway tried another method called magnetoencephalography (MEG), which uses magnetic sensors to detect the electrical activity of neurons firing. The technique is much faster than fMRI, so Conway could capture patterns of neuron firing before, during, and after his subjects looked at different colors. He had 18 volunteers take turns sitting in the MEG machine, which looks kind of like a giant retro hair dryer at a beauty salon, and showed them cards, each with a spiral that was either yellow, brown, pink, purple, green, dark green, blue, or dark blue. Then, during the MEG scan, he asked the subjects to name which color they saw.
Greg Horwitz, associate professor of physiology and biophysics at the University of Washington, says Conway was very clever about how he designed the study. Instead of using colors that we perceive as being similar, this study used colors that evoke similar reactions from the photoreceptors in the eye. For example, yellow and brown look very different to us, but they actually elicit similar responses among photoreceptors. That means that any differences in the patterns of brain activity detected by the MEG should be attributed not to the interaction between the light and the receptors in the eye, but to processing in the brain’s visual cortex. Horwitz says this shows just how complex perception is: “More complicated than photoreceptors.”
Conway next trained an artificial intelligence classifier to read the MEG results and look for similar patterns of neural activity among the 18 subjects. Then, he wanted to see if those patterns matched up with the colors the subjects reported seeing. For example, did a specific pattern of neural activity always correlate with the person saying they’d seen a dark blue spiral? “If the information can be decoded, then presumably that information is available to the rest of the brain to inform behavior,” he says.
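The decoding step can be sketched with synthetic data. Nothing below reflects the study's actual classifier, sensor counts, or recordings; it is a minimal stand-in (a nearest-centroid decoder on made-up sensor patterns) that shows what it means for a color to be "decodable" from a pattern of activity.

```python
import numpy as np

# Hedged sketch of pattern decoding (not the study's pipeline or data).
# Assume each color evokes a characteristic pattern across sensors, plus
# per-trial noise; a decoder trained on labeled trials then predicts the
# color of a held-out trial.
rng = np.random.default_rng(0)
n_sensors, n_trials = 50, 40
colors = ["yellow", "brown", "blue", "dark blue"]

# One synthetic "template" pattern per color; trials are noisy copies.
templates = {c: rng.normal(size=n_sensors) for c in colors}

def trial(c):
    return templates[c] + 0.5 * rng.normal(size=n_sensors)

train = [(trial(c), c) for c in colors for _ in range(n_trials)]
held_out = [(trial(c), c) for c in colors for _ in range(n_trials)]

# Nearest-centroid decoder: average the training trials for each color,
# then label a new trial by the closest centroid.
centroids = {c: np.mean([x for x, lab in train if lab == c], axis=0)
             for c in colors}

def decode(x):
    return min(colors, key=lambda c: np.linalg.norm(x - centroids[c]))

accuracy = np.mean([decode(x) == lab for x, lab in held_out])
```

If the per-color patterns are distinct enough relative to the noise, accuracy is high; if they overlap, the decoder falls back toward chance. That gap is exactly what makes decoding accuracy a meaningful measure of whether color information is present in the signal.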
At first, Conway was pretty skeptical that he would get any results. “The word on the street is that MEG has very crappy spatial resolution,” he says. Essentially, the machine is good at detecting when there’s brain activity, but not so great at showing you where in the brain that activity is. But as it turned out, the patterns were there and they were easy for the decoder to spot. “Lo and behold, the pattern is different enough for the different colors that I can decode with upwards of 90 percent accuracy what color you were seeing,” he says. “That’s like: holy crap!”
Chatterjee says that Conway’s MEG approach allows neuroscientists to flip traditional questions of perception upside down. Rather than starting from the reported percept (in this case, the color of the spiral) and working backward to the neuronal processes behind it, Conway measured the neuronal processes first and then drew conclusions about how those processes shape his subjects’ color perception.
The MEG also allowed Conway to watch perception unfold over time. In this experiment, it took about one second from the moment the volunteer saw the spiral until the moment when they named its color aloud. The machine was able to reveal activation patterns during that period, showing when color perception arose in the brain, and then track that activation for approximately another half second as the percept shifted to a semantic concept—the word the volunteer could use to name the color.
But there are some limitations to this approach. While Conway could identify that viewing different colors creates different patterns of brain responses, and that his 18 subjects experienced specific patterns for colors like yellow, brown, or light blue, he can’t say exactly where in the brain those patterns emerge. The paper also doesn’t discuss any of the mechanisms that create these patterns. But, Conway says, figuring out that there is a neural difference in the first place is huge. “That there is a difference is instructive, because it tells us that there is some kind of topographic map of color in the human brain,” he says.
“It’s that relationships between colors as we perceive them (perceptual color space) can be derived from the relationships of recorded activity (even if it’s MEG and can’t get you down to the level of single neurons or small ensembles of neurons),” writes Chatterjee. “That makes this a creative and interesting study.”
Plus, Conway says, this research refutes all those arguments that MEG isn’t precise enough to capture these patterns. “Now we can use [MEG] to decode all sorts of things related to the very fine spatial structure of neurons in the brain,” Conway suggests.
The MEG data also showed that the brain processed those eight color spirals differently depending on whether they were warm or cool colors. Conway made sure to include pairs that were the same hue, meaning their wavelengths would be registered as the same color by the eye’s photoreceptors, but that differed in luminance, or brightness, which changes how people perceive them. For example, yellow and brown are the same hue but differ in luminance; both are warm colors. Among the cool colors, the blue and dark blue he picked were also the same hue as each other, separated by the same difference in luminance as the yellow/brown pair of warm tones.
The MEG data showed that the patterns of brain activity corresponding to blue and dark blue were more similar to each other than the patterns for yellow and brown were to each other. Even though these hues all differed by the same amount of luminance, the brain processed the pair of warm colors as being much more different from one another, compared to the two blues.
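The hue-versus-luminance pairing can be made concrete with Python's standard colorsys module. The specific RGB triples below are illustrative choices, not the stimuli from the study: a dim yellow reads to the eye as brown or olive, yet it shares yellow's hue and differs only in lightness, and the same relation holds for blue and dark blue.

```python
import colorsys

# Hue/lightness/saturation for four illustrative colors (not the study's
# actual stimuli). HLS "lightness" stands in for the luminance difference
# the article describes.
yellow    = colorsys.rgb_to_hls(1.0, 1.0, 0.0)
brown     = colorsys.rgb_to_hls(0.5, 0.5, 0.0)  # dim yellow; reads as brown/olive
blue      = colorsys.rgb_to_hls(0.0, 0.0, 1.0)
dark_blue = colorsys.rgb_to_hls(0.0, 0.0, 0.5)

HUE, LIGHT = 0, 1  # indices into the (h, l, s) tuples

assert yellow[HUE] == brown[HUE]        # same hue...
assert yellow[LIGHT] > brown[LIGHT]     # ...different luminance
assert blue[HUE] == dark_blue[HUE]
# Both pairs are separated by the same luminance gap.
assert yellow[LIGHT] - brown[LIGHT] == blue[LIGHT] - dark_blue[LIGHT]
```

Matching the pairs this way is what lets the asymmetry stand out: if the warm pair's brain-activity patterns differ more than the cool pair's despite an identical luminance gap, the difference must come from processing, not the stimulus.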
Conway is excited to start testing more colors and to build his own color space, categorizing the relationships between them based not on wavelength but on neural activity pattern—a concept he describes as “the brain’s Pantone.” But he isn’t entirely sure where all of this research will lead. He points out that tools like lasers, which started out as a curiosity, ended up having a multitude of applications that researchers never imagined when they started playing around with them. “What we know, historically, is that with most things that turn out to be useful, their usefulness is only apparent in retrospect,” says Conway.
While Conway’s study stopped short of explaining exactly where the neural patterns that code for the perception of specific colors arise, researchers believe pinpointing them will be possible one day. Understanding these patterns could potentially help scientists develop visual prostheses that restore people’s experience of sight, or create ways for people to communicate exactly what they perceive. Or maybe this could help teach machines how to see better and in full color, like humans do.
And on a more fundamental level, figuring out how color perception matches with neural activity is an important step toward understanding how the brain constructs our understanding of the world around us. “If you could find a brain area where the representation matched perception, that would be a huge leap,” says Horwitz. “Finding the part of the brain where the representation of color matches what we experience would be a big step towards understanding what color perception really is.”