Multivariate Pattern Analysis Reveals a Multisensory Brain

Multivariate pattern analysis (MVPA) has become a popular tool for
characterizing the information content of functional
imaging data. Here we address the application of MVPA to the problem of
multisensory integration in the brain.
Our experience of the world is inherently multisensory: for example, we
can recognize a bell by the sight of it swinging
back and forth, by its distinctive ding-dong, or by the felt shape of
its cold metal surface. Our data demonstrate that
content-specific information can be found in multiple sensory cortices
even when stimuli are perceived through only one
modality.
For example, activity in low-level auditory cortex reflects the identity
of seen objects, and data from somatosensory
cortex can be used to predict seen touches. These findings are
consistent with a neural architecture in which high-level
multisensory zones feed back to retro-activate low-level sensory
cortices in a stimulus-specific manner.
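The kind of within-region decoding analysis described above can be illustrated schematically. The sketch below uses entirely synthetic data and a correlation-based nearest-centroid classifier with split-half cross-validation; the data, classifier, and cross-validation scheme are illustrative assumptions, not the actual analysis pipeline used here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 2 object categories x 20 trials each, 50 "voxels".
# Each category is given a distinct mean activity pattern (pure simulation).
n_trials, n_voxels = 20, 50
pattern_a = rng.normal(0, 1, n_voxels)
pattern_b = rng.normal(0, 1, n_voxels)
X = np.vstack([pattern_a + rng.normal(0, 2, (n_trials, n_voxels)),
               pattern_b + rng.normal(0, 2, (n_trials, n_voxels))])
y = np.repeat([0, 1], n_trials)

def nearest_centroid_decode(X_train, y_train, X_test):
    """Assign each test pattern to the class whose mean training
    pattern it correlates with most strongly."""
    centroids = np.array([X_train[y_train == c].mean(axis=0)
                          for c in np.unique(y_train)])
    # Pearson correlation between every test pattern and every centroid.
    Xz = (X_test - X_test.mean(1, keepdims=True)) / X_test.std(1, keepdims=True)
    Cz = (centroids - centroids.mean(1, keepdims=True)) / centroids.std(1, keepdims=True)
    corr = Xz @ Cz.T / X_test.shape[1]
    return corr.argmax(axis=1)

# Split-half cross-validation: train on even-numbered trials, test on odd.
train = np.arange(0, 2 * n_trials, 2)
test = np.arange(1, 2 * n_trials, 2)
pred = nearest_centroid_decode(X[train], y[train], X[test])
accuracy = (pred == y[test]).mean()
print(f"within-modality decoding accuracy: {accuracy:.2f}")
```

Above-chance accuracy on held-out trials is what licenses the claim that a region carries content-specific information about the stimuli.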
Further, to identify supra-modal brain regions, we tested several
multisensory regions for the property of modality-invariance, asking,
for example, whether the patterns evoked by sound could predict those
evoked by sight. Out
of several brain regions that responded to both sound and sight, only
one, near the junction of the temporal and parietal
lobes, displayed both content-specificity and modality-invariance. These
results suggest that temporo-parietal cortex plays
a key role in the abstract, supra-modal representation of objects.
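The cross-modal transfer test can likewise be sketched: train a decoder on sound-evoked patterns and evaluate it on sight-evoked patterns for the same objects. Everything below is simulated; the shared "supra-modal" pattern component, the nearest-centroid decoder, and all parameters are illustrative assumptions rather than the study's actual methods.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated modality-invariance: each of 2 objects has a shared supra-modal
# pattern component (present whether heard or seen) plus a modality-specific
# component. All signals are synthetic, purely for illustration.
n_trials, n_voxels = 20, 100
shared = rng.normal(0, 1, (2, n_voxels))     # object identity, both modalities
aud_only = rng.normal(0, 1, (2, n_voxels))   # auditory-specific component
vis_only = rng.normal(0, 1, (2, n_voxels))   # visual-specific component

def simulate_trials(base_patterns):
    return np.vstack([base_patterns[c] + rng.normal(0, 1.5, (n_trials, n_voxels))
                      for c in range(2)])

X_sound = simulate_trials(shared + aud_only)  # patterns evoked by hearing
X_sight = simulate_trials(shared + vis_only)  # patterns evoked by seeing
y = np.repeat([0, 1], n_trials)

# Train a nearest-centroid decoder on sound-evoked patterns...
centroids = np.array([X_sound[y == c].mean(axis=0) for c in (0, 1)])
# ...then test it on sight-evoked patterns: above-chance transfer implies a
# representation of object identity that generalizes across modalities.
dists = ((X_sight[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
pred = dists.argmin(axis=1)
accuracy = (pred == y).mean()
print(f"cross-modal (sound -> sight) decoding accuracy: {accuracy:.2f}")
```

In this toy setup, transfer succeeds only because the two modalities share a common pattern component; a region with purely modality-specific codes would decode at chance, which is the logic behind the modality-invariance test.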