Dataset Information

Development of Smartphone Application for Markerless Three-Dimensional Motion Capture Based on Deep Learning Model


ABSTRACT: To quantitatively assess pathological gait, we developed a novel smartphone application for full-body human motion tracking in real time from markerless video-based images using a smartphone monocular camera and deep learning. As training data for deep learning, an original three-dimensional (3D) dataset comprising more than 1 million images captured from the 3D motion of 90 humanoid characters and the two-dimensional (2D) COCO 2017 dataset were prepared. The 3D heatmap offset data, consisting of 28 × 28 × 28 blocks with three red–green–blue colors at the 24 key points of the entire body motion, were learned using a convolutional neural network, a modified ResNet34. At each key point, the hottest spot deviating from the center of the cell was learned using the tanh function. Our new iOS application can detect the relative tri-axial coordinates of the 24 whole-body key points centered on the navel in real time without any markers for motion capture. From these relative coordinates, the 3D angles of the neck, lumbar, bilateral hip, knee, and ankle joints are estimated. Any human motion can thus be quantitatively and easily assessed using the new smartphone application, named Three-Dimensional Pose Tracker for Gait Test (TDPT-GT), without any body markers or multi-camera setups.
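The abstract describes two post-network steps: decoding each key point as the hottest cell of a 28 × 28 × 28 heatmap plus a tanh-bounded offset from the cell center, and estimating joint angles from the relative 3D coordinates of the 24 key points. The following is a minimal NumPy sketch of that pipeline, assuming illustrative array layouts, offset scaling, and function names that are not specified in the abstract and are not the authors' implementation.

```python
import numpy as np

GRID = 28           # heatmap resolution per axis, as stated in the abstract
NUM_KEYPOINTS = 24  # whole-body key points

def decode_keypoints(heatmaps, offsets):
    """Hypothetical decoding of continuous 3D key-point coordinates.

    heatmaps: (NUM_KEYPOINTS, GRID, GRID, GRID) joint-presence scores
    offsets:  (NUM_KEYPOINTS, 3, GRID, GRID, GRID) tanh outputs in [-1, 1]
    returns:  (NUM_KEYPOINTS, 3) coordinates in normalized grid units
    """
    coords = np.zeros((NUM_KEYPOINTS, 3))
    for k in range(NUM_KEYPOINTS):
        # hottest cell for key point k
        i, j, l = np.unravel_index(np.argmax(heatmaps[k]), heatmaps[k].shape)
        # shift by the learned offset so the point may deviate from the cell center
        # (assumed scaling: tanh output maps to at most half a cell)
        offset = offsets[k][:, i, j, l]
        coords[k] = np.array([i, j, l]) + 0.5 + 0.5 * offset
    return coords / GRID

def joint_angle(parent, joint, child):
    """Angle in degrees at `joint` between the parent->joint and joint->child segments."""
    u = joint - parent
    v = child - joint
    cos_a = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-8)
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
```

For example, once the key-point indexing is known, a knee angle could be obtained as joint_angle(coords[HIP], coords[KNEE], coords[ANKLE]) on the corresponding side; the hip, knee, and ankle indices here are placeholders for whatever ordering the TDPT-GT model actually uses.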

SUBMITTER: Aoyagi Y 

PROVIDER: S-EPMC9322512 | biostudies-literature

REPOSITORIES: biostudies-literature

Similar Datasets

| S-EPMC10204374 | biostudies-literature
| S-EPMC4154302 | biostudies-literature
| S-EPMC10635560 | biostudies-literature
| S-EPMC6278687 | biostudies-literature
| S-EPMC9015782 | biostudies-literature
| S-EPMC11336506 | biostudies-literature
| S-EPMC10673844 | biostudies-literature
| S-EPMC7739760 | biostudies-literature
| S-EPMC5094601 | biostudies-literature
| S-EPMC5482943 | biostudies-literature