Omnidirectional Walking Pattern Generator Combining Virtual Constraints and Preview Control for Humanoid Robots.
ABSTRACT: This paper presents a novel omnidirectional walking pattern generator for bipedal locomotion that combines two structurally different approaches, virtual constraints and preview control, to generate a flexible gait that can be modified on-line. The proposed strategy synchronizes the displacement of the robot along the two planes of walking: a zero-moment-point-based preview control generates the lateral component of the gait, while the sagittal motion is produced by a more dynamic approach based on virtual constraints. The resulting algorithm has low computational complexity and high flexibility, requisites for successful deployment on humanoid robots operating in real-world scenarios. This division of labor is motivated by observations in biomechanics showing that, during nominal gait, the dynamic motion of human walking is generated mainly in the sagittal plane. We describe the implementation of the algorithm and detail the strategy chosen to enable omnidirectionality and on-line gait tuning. We validate our strategy through simulation experiments using the COMAN+ platform, an adult-size humanoid robot developed at Istituto Italiano di Tecnologia. Finally, the hybrid walking pattern generator is implemented on the real hardware, demonstrating promising results: the WPG trajectories yield open-loop stable walking in the absence of external disturbances.
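To make the lateral half of the hybrid scheme concrete, the following is a minimal sketch of ZMP preview control on the standard cart-table model, written here as a receding-horizon least-squares tracker over the preview window. This is an illustrative variant, not the authors' implementation: all parameter values (CoM height, horizon length, weights, step pattern) are assumptions for the example.

```python
import numpy as np

# Cart-table model: state [y, dy, ddy], input = CoM jerk,
# ZMP output p = y - (zc/g) * ddy. All values are illustrative.
dt = 0.01      # control period [s]
zc = 0.8       # constant CoM height [m]
g = 9.81       # gravity [m/s^2]
N = 150        # preview horizon (1.5 s of future ZMP reference)

A = np.array([[1.0, dt, 0.5 * dt**2],
              [0.0, 1.0, dt],
              [0.0, 0.0, 1.0]])
B = np.array([[dt**3 / 6.0], [dt**2 / 2.0], [dt]])
C = np.array([[1.0, 0.0, -zc / g]])

# Prediction over the horizon: p_pred = Phi @ x0 + Gamma @ U
Apow = [np.eye(3)]
for _ in range(N):
    Apow.append(Apow[-1] @ A)
Phi = np.vstack([C @ Apow[j + 1] for j in range(N)])
markov = [(C @ Apow[j] @ B)[0, 0] for j in range(N)]
Gamma = np.zeros((N, N))
for j in range(N):
    for i in range(j + 1):
        Gamma[j, i] = markov[j - i]

Qe, Qu = 1.0, 1e-6                       # tracking vs. effort weights
H = Qe * Gamma.T @ Gamma + Qu * np.eye(N)

def preview_step(x, zmp_ref_window):
    """Solve the unconstrained tracking problem over the preview
    window and return only the first jerk (receding horizon)."""
    rhs = Qe * Gamma.T @ (zmp_ref_window - (Phi @ x).ravel())
    U = np.linalg.solve(H, rhs)
    return U[0]

# Lateral ZMP reference: alternating footsteps of +/-0.05 m, 0.5 s each
T = 600
ref = np.array([0.05 * (-1) ** (k // 50) for k in range(T + N)])

x = np.zeros((3, 1))
com, zmp = [], []
for k in range(T):
    u = preview_step(x, ref[k + 1:k + 1 + N])
    x = A @ x + B * u
    com.append(x[0, 0])
    zmp.append((C @ x)[0, 0])
```

Because the controller sees the reference over the whole preview window, the CoM starts shifting before each footstep transition, which is the key property the abstract relies on for the lateral gait component.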
SUBMITTER: Ruscelli F
PROVIDER: S-EPMC8284058 | biostudies-literature
REPOSITORIES: biostudies-literature