Heterogeneous representations in the superior parietal lobule are common across reaches to visual and proprioceptive targets.
ABSTRACT: The planning and control of sensory-guided movements requires the integration of multiple sensory streams. Although the information conveyed by different sensory modalities is often overlapping, the shared information is represented differently across modalities during the early stages of cortical processing. We ask how these diverse sensory signals are represented in multimodal sensorimotor areas of cortex in macaque monkeys. Although a common modality-independent representation might facilitate downstream readout, previous studies have found that modality-specific representations in multimodal cortex reflect upstream spatial representations. For example, visual signals have a more eye-centered representation. We recorded neural activity from two parietal areas involved in reach planning, area 5 and the medial intraparietal area (MIP), as animals reached to visual, combined visual and proprioceptive, and proprioceptive targets while fixating their gaze on another location. In contrast to other multimodal cortical areas, the same spatial representations are used to represent visual and proprioceptive signals in both area 5 and MIP. However, these representations are heterogeneous. Although we observed a posterior-to-anterior gradient in population responses in parietal cortex, from more eye-centered to more hand- or body-centered representations, we did not observe the simple and discrete reference frame representations suggested by studies that focused on identifying the "best-match" reference frame for a given cortical area. In summary, we find modality-independent representations of spatial information in parietal cortex, although these representations are complex and heterogeneous.
SUBMITTER: McGuire LM
PROVIDER: S-EPMC3100795 | biostudies-other | 2011 May
REPOSITORIES: biostudies-other