Look at that Bot Go!: a Framework for Differentiating Humanoid Robot Locomotion


Abstract

This paper presents a software framework for multi-modal locomotion on the DARwIn-OP humanoid platform. The system combines an omnidirectional walking controller based on Central Pattern Generator (CPG) principles with a motion manager that executes pre-programmed keyframe sequences for alternative locomotion modes, including crawling, handstands, and hopping. Our approach demonstrates that humanoid robots can exploit their anthropomorphic form factor to perform maneuvers beyond standard bipedal walking. The system was tested in the Webots simulator, achieving successful forward crawling while revealing limitations in backward crawling and hopping. This work contributes to expanding the operational versatility of humanoid robots in constrained environments. Videos of the robot moving in simulation are available at: https://sites.google.com/view/look-at-that-bot-go/main-project
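To illustrate the CPG principle underlying the walking controller, the sketch below implements coupled phase oscillators that emit sinusoidal joint targets with a fixed phase relationship, as in alternating leg swing. This is a minimal illustrative example; the class names, frequencies, and amplitudes are hypothetical and do not reflect the paper's actual implementation.

```python
import math

class CPGOscillator:
    """Single phase oscillator producing a sinusoidal joint trajectory."""

    def __init__(self, frequency_hz, amplitude, phase_offset):
        self.frequency_hz = frequency_hz
        self.amplitude = amplitude
        self.phase = phase_offset

    def step(self, dt):
        # Advance the phase by one control tick and emit the joint target.
        self.phase = (self.phase + 2.0 * math.pi * self.frequency_hz * dt) % (2.0 * math.pi)
        return self.amplitude * math.sin(self.phase)

def gait_targets(oscillators, dt):
    """Advance all oscillators one tick and return their joint targets."""
    return [osc.step(dt) for osc in oscillators]

# Two hip oscillators half a cycle out of phase, giving alternating leg swing.
left_hip = CPGOscillator(frequency_hz=1.0, amplitude=0.3, phase_offset=0.0)
right_hip = CPGOscillator(frequency_hz=1.0, amplitude=0.3, phase_offset=math.pi)
targets = gait_targets([left_hip, right_hip], dt=0.008)
```

Because the two oscillators are offset by pi radians, their outputs are mirror images at every tick; steering an omnidirectional gait would additionally modulate amplitudes and offsets per joint from the commanded walk velocity.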
