Researchers using functional magnetic resonance imaging (fMRI) to define neural signatures of pain in the brain face a daunting task, because pain elicits complex and distributed activity in many of the same brain regions as other intense sensations. Now, Giandomenico Iannetti, University College London, UK, and colleagues have identified fine-grained differences in brain activity, scattered across multiple primary sensory regions, that distinguish the response to pain from the responses to other strong sensory stimuli such as loud noise and bright light.
“There is something specific” in brain activity during pain compared to other sensations, said Iannetti. “But that something is very, very tiny.”
Iannetti’s group found the pain-specific patterns in some unlikely places: Even the visual and auditory cortices—regions thought to be dedicated to sight and hearing, respectively—showed pain signatures. That means that processing of pain and other sensory input may be much more widely distributed than previously appreciated. “It’s not just the commonly thought brain areas that are representing the senses,” said first author Meng Liang.
The results support the idea that, in the quest to understand the brain in pain, researchers cannot confine themselves to looking at bulk changes in brain activity in broad areas typically associated with pain, but will need to look for more discrete, spatially defined signals all over the brain.
The study appeared June 11 in Nature Communications.
Painful stimuli evoke activity in regions of the brain involved in sensory processing as well as other functions such as attention, emotion, and consciousness—areas collectively known as the “pain matrix.” In the last several years, researchers have begun to use sophisticated, multivariate pattern analysis techniques to identify complex patterns of activity in these regions that correlate with pain (see PRF related news story).
But Iannetti and others have argued that much of the activity ascribed to pain is not specific, but instead reflects a more general brain response to any attention-grabbing, highly salient sensory stimulus (for a review, see Legrain et al., 2011). In a study published last year, his group identified an electroencephalographic (EEG) response in the primary somatosensory cortex (namely, gamma band oscillations) that is more closely correlated with the intensity of pain, independent of salience (Zhang et al., 2012). In their new study, they pushed on to see if they could identify fMRI signals that defined the quality of a highly salient stimulus—for example, pain versus a loud noise.
To do that, the team analyzed fMRI data from an earlier experiment in which they recorded blood oxygen level-dependent (BOLD) signal changes in the primary somatosensory, visual, and auditory cortices after four isolated, intense stimuli: painful heat, non-painful touch, bright light, and loud noise. The fMRI signal was parceled into volumetric pixels (voxels) 3x3x3 mm in size. Using the traditional method of voxel-by-voxel analysis, the researchers previously found that brain responses to the different sensations were almost indistinguishable (Mouraux et al., 2011). This time they used multivariate pattern analysis to search for subtle changes in the spatial pattern of activity across voxels.
In the experiment, 14 healthy young adults underwent four testing runs; in each, they received 32 stimuli (eight of each modality), followed by a brain scan a few seconds after each stimulus. The researchers used the data from the first three runs to train an algorithm to identify spatial patterns of neural activity that correlated with stimulus modality in each subject. Then they tested whether the algorithm could recognize the kind of stimulus applied in the fourth run.
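The train-on-three-runs, test-on-the-fourth scheme is a standard cross-validated decoding design, and can be sketched roughly as follows. This is an illustrative sketch, not the authors' pipeline: the voxel count, the signal strength, and the choice of a linear support vector machine are all assumptions, and simulated data stand in for real BOLD responses.

```python
# Sketch of a leave-one-run-out multivariate pattern analysis, in the
# spirit of the design described above. All numbers and the classifier
# choice (linear SVM) are assumptions for illustration only.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

n_runs, n_trials, n_voxels = 4, 32, 50       # 8 trials per modality per run
# 0 = pain, 1 = touch, 2 = light, 3 = sound, repeated across runs
modalities = np.tile(np.repeat(np.arange(4), 8), n_runs)

# Simulated voxel patterns: a weak modality-specific spatial signal
# buried in trial-to-trial noise.
prototypes = rng.normal(size=(4, n_voxels))
X = prototypes[modalities] * 0.5 + rng.normal(size=(n_runs * n_trials, n_voxels))
runs = np.repeat(np.arange(n_runs), n_trials)

# Train on the first three runs, test on the held-out fourth run.
train, test = runs < 3, runs == 3
clf = SVC(kernel="linear").fit(X[train], modalities[train])
accuracy = clf.score(X[test], modalities[test])
print(f"decoding accuracy on the held-out run: {accuracy:.2f} (chance = 0.25)")
```

In the study, this classification was done separately within each subject, which is why Iannetti later cautions (below) that the decoding may not transfer across brains.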
The algorithm consistently achieved above-chance accuracy. For example, patterns of activity changes in the primary somatosensory cortex (S1) distinguished pain from touch, with 63 percent accuracy on average. That success made sense, since S1 is known to be involved in both pain and touch sensation. But, to the researchers’ surprise, even the visual and auditory cortices (V1 and A1) showed activity patterns that were different for pain compared to touch, and activity in either region distinguished between the two stimuli with 59 percent accuracy.
The team used two strategies to ensure that the patterns reflected stimulus modality, not just strength—that is, that the visual cortex was not registering pain simply because the painful stimulus evoked an especially large overall brain response. During testing, the investigators asked subjects to rate the salience of each stimulus and then tuned the intensity for each subject so that each stimulus was equally salient. And, in the data analysis, they normalized responses in each cortical region to the mean response amplitude in that region to subtract out overall differences in response magnitude.
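The normalization step is key to the modality-versus-strength argument: if every voxel in a region simply responds more strongly to pain, removing the region's mean response should abolish the classifier's advantage, leaving only the relative spatial pattern. One plausible reading of "normalized to the mean response amplitude" is subtracting each trial's regional mean, sketched below; the exact arithmetic the authors used is an assumption here.

```python
# Sketch of amplitude normalization within a cortical region: remove each
# trial's mean response so that overall response magnitude cannot drive
# classification, only the spatial pattern across voxels can. The use of
# subtraction (rather than, say, division) is an assumption.
import numpy as np

def normalize_region(responses):
    """responses: (n_trials, n_voxels) array of BOLD amplitudes in one region."""
    return responses - responses.mean(axis=1, keepdims=True)

# Two trials with the same spatial pattern but different overall amplitude
# become identical after normalization.
a = np.array([[1.0, 2.0, 3.0]])
b = a + 10.0  # same pattern, uniformly larger response
print(np.allclose(normalize_region(a), normalize_region(b)))  # True
```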
The predictive brain activity that emerged from the analysis was scattered around each sensory region. For example, in V1, activity in a diffuse sprinkling of voxels predicted whether a stimulus was pain versus touch, whereas a distinct collection of voxels distinguished pain from sound—indicating that different sensations elicit activity in different neurons, even within non-corresponding sensory cortices.
Now, the Iannetti group is trying to determine just how much information about a sensation, such as its intensity and frequency, is encoded in the different sensory cortices. Already, in the current paper, they report that activity in non-corresponding sensory cortices weakly predicts one additional piece of information: stimulus location. Activity patterns in the auditory cortex, for example, discriminated with above-chance accuracy (53 percent) whether a touch stimulus occurred on the second or fifth finger.
The discovery that sensations elicit specific activity in seemingly irrelevant cortical regions was a surprise to the group. “We couldn’t even imagine that could be possible,” Iannetti said. The results, he said, are “evidence that the dogma that the primary sensory cortices are only able to respond to stimuli of their own modality is not correct.” Instead, it appears that “primary sensory cortices are deeply multimodal.”
Questions remain about why activity in the visual and auditory cortices correlates with pain, and whether those regions actually help to produce the sensation of pain. “With these multivariate pattern analyses, there are lots of signals we can pick up on, and some of them are causally related to the percepts, and others are not,” said Tor Wager, University of Colorado, Boulder, US, who was not involved in the study. One possible explanation for pain-associated patterns in the visual cortex, he said, is that “there’s a withdrawal of attention from the visual world when you’re experiencing pain.”
Recently, Wager and colleagues used a similar multivariate pattern analysis technique to elucidate a signature pattern of activity, in the pain matrix and other regions, that marked the response to painful heat in healthy subjects. That activity pattern discriminated between pain and other kinds of salient events, such as viewing emotionally evocative pictures (see PRF related news story).
Wager said the Iannetti group’s paper is another example of how sophisticated analysis techniques can successfully tease out stimulus-specific patterns. And results such as these make Wager hopeful that the field will be able to identify signature patterns of brain activity for pain and other sensations that will be useful as biomarkers or diagnostic tools.
Iannetti is less optimistic. “A key point is that we do this within subjects,” he said. In Wager’s recent study, a specific brain activity pattern registered pain intensity and discriminated between pain and some other salient events across subjects. But Iannetti and Liang said that distinguishing the sensory modality (e.g., pain vs. sound) across subjects may be another matter. “The features allowing us to classify what the stimulus was are so small, and so discrete, and so scattered, that I would be very surprised if that works when comparing different brains,” Iannetti said. He said he and Liang are now testing that idea.
Top image: The primary visual cortex (V1). Credit: Meng Liang and Giandomenico Iannetti, University College London, UK.