Shared Control of Bimanual Robotic Limbs With a Brain-Machine Interface for Self-Feeding.
ABSTRACT: Advances in intelligent robotic systems and brain-machine interfaces (BMIs) have helped restore functionality and independence to individuals living with sensorimotor deficits; however, tasks requiring bimanual coordination and fine manipulation remain unsolved given the technical complexity of controlling multiple degrees of freedom (DOF) across multiple limbs in a coordinated way from user input. To address this challenge, we implemented a collaborative shared control strategy to manipulate and coordinate two Modular Prosthetic Limbs (MPL) for performing a bimanual self-feeding task. A human participant with microelectrode arrays in sensorimotor brain regions provided commands to both MPLs to perform the self-feeding task, which included bimanual cutting. Motor commands were decoded from bilateral neural signals to control up to two DOFs on each MPL at a time. The shared control strategy enabled the participant to map his four-DOF control inputs, two per hand, to as many as 12 DOFs for specifying robot end effector position and orientation. Using neurally driven shared control, the participant successfully and simultaneously controlled movements of both robotic limbs to cut and eat food in a complex bimanual self-feeding task. This demonstration of bimanual robotic system control via a BMI in collaboration with intelligent robot behavior has major implications for restoring complex movement behaviors for those living with sensorimotor deficits.
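The abstract describes mapping four user-controlled DOFs (two per hand) onto up to 12 robot DOFs (a 6-DOF end effector pose per limb). The sketch below illustrates one way such a mode-based shared-control mapping could be organized; the function names, axis assignments, gain, and velocity-style update are illustrative assumptions, not the authors' implementation.

import numpy as np

# Hypothetical shared-control mapper: the decoder yields two continuous values
# per hand (four user DOFs total); each limb exposes a 6-DOF pose
# (x, y, z, roll, pitch, yaw), 12 DOFs in all. A mode table selects which pose
# dimensions the user's 2-DOF input drives on each limb, while the remaining
# dimensions are left to the autonomous behavior (here simply held at their
# last commanded values).

POSE_DIMS = ["x", "y", "z", "roll", "pitch", "yaw"]

# Example mode: left-hand input drives left-limb y/z, right-hand input drives
# right-limb x/z (axis choices are assumptions for illustration).
MODE = {
    "left": ("y", "z"),
    "right": ("x", "z"),
}

def shared_control_step(pose, user_input, mode=MODE, gain=0.01):
    """Blend 2-DOF-per-hand user input into the 12-DOF bimanual pose command.

    pose:       dict limb -> np.ndarray of 6 pose values
    user_input: dict limb -> (u1, u2) decoded neural command in [-1, 1]
    """
    new_pose = {limb: p.copy() for limb, p in pose.items()}
    for limb, (u1, u2) in user_input.items():
        d1, d2 = mode[limb]
        new_pose[limb][POSE_DIMS.index(d1)] += gain * u1
        new_pose[limb][POSE_DIMS.index(d2)] += gain * u2
        # Unselected dimensions stay under autonomous control (unchanged here).
    return new_pose

# Usage: one control tick with a decoded bilateral command.
pose = {"left": np.zeros(6), "right": np.zeros(6)}
decoded = {"left": (0.4, -0.2), "right": (0.1, 0.8)}
pose = shared_control_step(pose, decoded)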
SUBMITTER: Handelman DA
PROVIDER: S-EPMC9274256 | biostudies-literature
REPOSITORIES: biostudies-literature