
Dataset Information


Relative posture between head and finger determines perceived tactile direction of motion.


ABSTRACT: The hand explores the environment to obtain tactile information that can be fruitfully integrated with other functions, such as vision, audition, and movement. In theory, somatosensory signals gathered by the hand are accurately mapped into a world-centered (allocentric) reference frame, such that multi-modal signals, whether visual-tactile or motor-tactile, are perfectly aligned. However, an accumulating body of evidence indicates that the perceived tactile orientation or direction is inaccurate, yielding a surprisingly large perceptual bias. To investigate this bias, the present study delivered tactile motion stimuli to healthy adult participants in a variety of finger and head postures and asked them to report the perceived direction of motion on a video screen placed on the frontoparallel plane in front of the eyes. The results showed that the perceptual bias could be divided into systematic and nonsystematic components. Systematic bias, defined as the mean difference between the perceived and veridical directions, correlated linearly with the relative posture between the finger and the head. By contrast, nonsystematic bias, defined as the smaller, direction-dependent differences in bias across stimulus directions, was highly individualized and phase-locked to the stimulus orientation presented on the skin. Overall, the findings on systematic bias indicate that the transformation bias among reference frames is dominated by the finger-to-head posture. Moreover, the highly individualized nature of nonsystematic bias reflects how information is obtained by orientation-selective units in the primary somatosensory (S1) cortex.
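The two bias measures can be illustrated with a minimal numerical sketch (hypothetical perceived and veridical angles in degrees; not the authors' analysis code): systematic bias is the mean signed difference between perceived and veridical directions, and nonsystematic bias is the direction-dependent residual that remains after that mean is removed.

    import numpy as np

    # Hypothetical trial data: veridical stimulus directions and the
    # directions a participant reported, both in degrees.
    veridical = np.array([0, 45, 90, 135, 180, 225, 270, 315], dtype=float)
    perceived = np.array([12, 60, 98, 150, 190, 241, 278, 330], dtype=float)

    # Signed angular error per direction, wrapped to (-180, 180].
    error = (perceived - veridical + 180.0) % 360.0 - 180.0

    # Systematic bias: mean difference between perceived and veridical directions.
    systematic_bias = error.mean()

    # Nonsystematic bias: direction-dependent residual after removing the
    # systematic component.
    nonsystematic_bias = error - systematic_bias

    print(f"systematic bias: {systematic_bias:.1f} deg")
    print("nonsystematic bias per direction:", np.round(nonsystematic_bias, 1))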

SUBMITTER: Chen YP 

PROVIDER: S-EPMC7099024 | biostudies-literature | 2020 Mar

REPOSITORIES: biostudies-literature


Publications

Relative posture between head and finger determines perceived tactile direction of motion.

Chen Yueh-Peng YP, Yeh Chun-I CI, Lee Tsung-Chi TC, Huang Jian-Jia JJ, Pei Yu-Cheng YC

Scientific Reports, 2020 Mar 26, 1


Similar Datasets

| S-EPMC6803715 | biostudies-literature
| S-EPMC8086864 | biostudies-literature
| S-EPMC5242513 | biostudies-literature
| S-EPMC5247673 | biostudies-literature
| S-EPMC6772179 | biostudies-literature
| S-EPMC7472910 | biostudies-literature
| S-EPMC9772181 | biostudies-literature
| S-EPMC6294351 | biostudies-other
| S-EPMC4564331 | biostudies-literature
| S-EPMC4979522 | biostudies-literature