At first, Conway was pretty skeptical that he would get any results. “The word on the street is that MEG has very crappy spatial resolution,” he says. Essentially, the machine is good at detecting when there’s brain activity, but not so great at showing you where in the brain that activity is. But as it turned out, the patterns were there and they were easy for the decoder to spot. “Lo and behold, the pattern is different enough for the different colors that I can decode with upwards of 90 percent accuracy what color you were seeing,” he says. “That’s like: holy crap!”
Chatterjee says that Conway’s MEG approach allows neuroscientists to flip traditional questions of perception upside down. “Perception is usually taken as the known quantity”—in this case, the color of the spiral—“and then researchers tried to figure out the neuronal processes leading to that,” he writes. But in this experiment, Conway approached the question from the opposite side: He measured the neuronal processes and then drew conclusions about how those processes shape his subjects’ color perception.
The MEG also allowed Conway to watch perception unfold over time. In this experiment, it took about one second from the moment the volunteer saw the spiral until the moment when they named its color aloud. The machine was able to reveal activation patterns during that period, showing when color perception arose in the brain, and then track that activation for approximately another half second as the percept shifted to a semantic concept—the word the volunteer could use to name the color.
But there are some limitations to this approach. While Conway could identify that viewing different colors creates different patterns of brain responses, and that his 18 subjects experienced specific patterns for colors like yellow, brown, or light blue, he can’t say exactly where in the brain those patterns emerge. The paper also doesn’t discuss any of the mechanisms that create these patterns. But, Conway says, figuring out that there is a neural difference in the first place is huge. “That there is a difference is instructive, because it tells us that there is some kind of topographic map of color in the human brain,” he says.
“It’s that relationships between colors as we perceive them (perceptual color space) can be derived from the relationships of recorded activity (even if it’s MEG and can’t get you down to the level of single neurons or small ensembles of neurons),” writes Chatterjee. “That makes this a creative and interesting study.”
Plus, Conway says, this research refutes all those arguments that MEG isn’t precise enough to capture these patterns. “Now we can use [MEG] to decode all sorts of things related to the very fine spatial structure of neurons in the brain,” Conway suggests.
The MEG data also showed that the brain processed those eight color spirals differently depending on whether they showed warm or cool colors. Conway made sure to include pairs that shared the same hue, meaning their wavelengths would be perceived as the same color by the eye’s photoreceptors, but differed in luminance, or brightness, which changes how people perceive them. For example, yellow and brown are the same hue but differ in luminance; both are warm colors. For cool colors, the blue and dark blue he picked were likewise the same hue as each other, separated by the same difference in luminance as the yellow/brown pair of warm tones.
The MEG data showed that the patterns of brain activity corresponding to blue and dark blue were more similar to each other than the patterns for yellow and brown were to each other. Even though these hues all differed by the same amount of luminance, the brain processed the pair of warm colors as being much more different from one another, compared to the two blues.
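One way to make that comparison concrete is to compute a dissimilarity score between the mean activity patterns for each pair. The sketch below uses correlation distance (1 minus Pearson correlation) on synthetic patterns; the built-in asymmetry, with the warm pair more separated than the cool pair, mirrors the reported finding but is an assumption of this toy example, not something the code derives.

```python
# Toy sketch of the pattern-similarity comparison: how far apart are the
# activity patterns for two colors that share a hue but differ in luminance?
# Synthetic patterns stand in for real MEG data; the warm pair is
# constructed to be more separated, mirroring the reported result.
import numpy as np

rng = np.random.default_rng(1)
n_sensors = 100  # MEG sensor channels (assumed)

base_warm = rng.normal(size=n_sensors)
base_cool = rng.normal(size=n_sensors)
shift = rng.normal(size=n_sensors)

patterns = {
    "yellow":    base_warm + 1.0 * shift,  # warm pair: larger separation
    "brown":     base_warm - 1.0 * shift,
    "blue":      base_cool + 0.3 * shift,  # cool pair: smaller separation
    "dark blue": base_cool - 0.3 * shift,
}

def dissimilarity(a, b):
    """Correlation distance between two activity patterns (1 - Pearson r)."""
    return 1 - np.corrcoef(a, b)[0, 1]

d_warm = dissimilarity(patterns["yellow"], patterns["brown"])
d_cool = dissimilarity(patterns["blue"], patterns["dark blue"])
print(f"warm pair distance: {d_warm:.2f}, cool pair distance: {d_cool:.2f}")
```

Correlation distance is a common choice in this kind of analysis because it ignores overall signal strength and compares only the shape of the patterns, which is what matters when asking whether the brain treats two colors as similar.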