
Dataset Information


Correspondence of categorical and feature-based representations of music in the human brain.


ABSTRACT:

Introduction

Humans tend to categorize auditory stimuli into discrete classes, such as animal species, languages, musical instruments, and music genres. Of these, music genre is a frequently used dimension of human music preference and is determined through the categorization of complex auditory stimuli. Neuroimaging studies have reported that the superior temporal gyrus (STG) is involved in responses to general music-related features. However, there is considerable uncertainty over how discrete music categories are represented in the brain and which acoustic features are better suited to explaining such representations.

Methods

We used a total of 540 music clips to examine comprehensive cortical representations and the functional organization of music genre categories. For this purpose, we applied a voxel-wise modeling approach to music-evoked brain activity measured using functional magnetic resonance imaging. In addition, we introduced a novel technique for feature-brain similarity analysis and assessed how discrete music categories are represented based on the cortical response pattern to acoustic features.
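The voxel-wise modeling approach mentioned above typically fits a regularized linear regression from stimulus features to each voxel's BOLD response and evaluates prediction accuracy on held-out data. Below is a minimal sketch of such an encoding analysis, assuming a feature matrix X (time points × acoustic features, e.g., spectro-temporal modulation energies of the music clips) and a response matrix Y (time points × voxels); the variable names, ridge-penalty grid, and train/test split are illustrative placeholders, not the authors' exact pipeline.

```python
# Minimal voxel-wise encoding-model sketch (ridge regression), assuming:
#   X: (n_timepoints, n_features) acoustic features of the music clips
#   Y: (n_timepoints, n_voxels)   preprocessed BOLD responses
# All names, sizes, and the alpha grid below are illustrative only.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.standard_normal((600, 100))       # placeholder feature matrix
Y = rng.standard_normal((600, 2000))      # placeholder voxel responses

# Hold out a contiguous test block (no shuffling, to respect temporal structure).
X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.2, shuffle=False)

best_r, best_alpha = None, None
for alpha in (1.0, 10.0, 100.0, 1000.0):  # simple grid; real pipelines cross-validate per voxel
    model = Ridge(alpha=alpha).fit(X_tr, Y_tr)
    Y_hat = model.predict(X_te)
    # Prediction accuracy = Pearson correlation between predicted and measured
    # responses, computed independently for each voxel.
    r = np.array([np.corrcoef(Y_hat[:, v], Y_te[:, v])[0, 1]
                  for v in range(Y.shape[1])])
    if best_r is None or np.nanmean(r) > np.nanmean(best_r):
        best_r, best_alpha = r, alpha

print(f"best alpha={best_alpha}, mean test correlation={np.nanmean(best_r):.3f}")
```

Comparing candidate feature spaces (e.g., a spectro-temporal modulation-transfer function model against other acoustic feature models, as in the Results below) would then amount to repeating this fit for each feature matrix and comparing the per-voxel prediction accuracies.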

Results

Our findings indicated distinct cortical organizations for different music genres in the bilateral STG, and they revealed representational relationships between different music genres. On comparing different acoustic feature models, we found that these representations of music genres could be explained largely by a biologically plausible spectro-temporal modulation-transfer function model.

Conclusion

Our findings have elucidated the quantitative representation of music genres in the human cortex, indicating the possibility of modeling this categorization of complex auditory stimuli based on brain activity.

SUBMITTER: Nakai T 

PROVIDER: S-EPMC7821620 | biostudies-literature | 2021 Jan

REPOSITORIES: biostudies-literature


Publications

Correspondence of categorical and feature-based representations of music in the human brain.

Tomoya Nakai, Naoko Koide-Majima, Shinji Nishimoto

Brain and Behavior, 2021 Jan, issue 1 (published online 2020-11-08)


Similar Datasets

| S-EPMC3529057 | biostudies-literature
| S-EPMC4158364 | biostudies-literature
| S-EPMC9992300 | biostudies-literature
| S-EPMC6800382 | biostudies-literature
| S-EPMC2803114 | biostudies-literature
| S-EPMC8195411 | biostudies-literature
| S-EPMC8046073 | biostudies-literature
| S-EPMC3060658 | biostudies-other
| S-EPMC5509941 | biostudies-other
| S-EPMC5811520 | biostudies-other