Human-Robot Variable Impedance Skill Transfer Learning Based on Dynamic Movement Primitives and Vision System

Abstract

To enhance robotic adaptability in dynamic environments, this study proposes a multimodal framework for skill transfer. The framework integrates vision-based kinesthetic teaching with surface electromyography (sEMG) signals to estimate human impedance. We establish a Cartesian-space model of upper-limb stiffness, linearly mapping sEMG signals to endpoint stiffness. For flexible task execution, dynamic movement primitives (DMPs) generalize learned skills across varying scenarios. An adaptive admittance controller, incorporating sEMG-modulated stiffness, is developed and validated on a UR5 robot. Experiments involving elastic band stretching demonstrate that the system successfully transfers human impedance characteristics to the robot, enhancing stability, environmental adaptability, and safety during physical interaction.
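
The abstract describes three computational building blocks: a linear map from sEMG activation to endpoint stiffness, DMPs for generalizing demonstrated trajectories to new goals, and a variable-stiffness admittance law for compliant execution. The sketch below illustrates one plausible minimal form of each in Python with NumPy. All gains, the stiffness range, the DMP parameters, and the admittance constants are illustrative assumptions for exposition, not values reported in the paper.

```python
import numpy as np

# Illustrative sketch only: all gains, signals, and parameter values below are
# assumptions chosen for exposition, not the paper's reported settings.

def semg_to_stiffness(activation, k_min=200.0, k_max=1200.0):
    """Linear map from normalized sEMG activation (0..1) to endpoint stiffness [N/m]."""
    return k_min + (k_max - k_min) * np.clip(activation, 0.0, 1.0)

class DMP1D:
    """Minimal discrete dynamic movement primitive for one Cartesian coordinate."""
    def __init__(self, n_basis=20, alpha_z=25.0, beta_z=6.25, alpha_x=3.0, tau=1.0):
        self.alpha_z, self.beta_z, self.alpha_x, self.tau = alpha_z, beta_z, alpha_x, tau
        self.c = np.exp(-alpha_x * np.linspace(0, 1, n_basis))      # basis centers in phase space
        self.h = 1.0 / np.diff(self.c, append=self.c[-1] * 0.5) ** 2  # basis widths
        self.w = np.zeros(n_basis)                                   # forcing-term weights

    def fit(self, y_demo, dt):
        """Fit forcing-term weights to one demonstrated trajectory (locally weighted regression)."""
        y0, g = y_demo[0], y_demo[-1]
        yd = np.gradient(y_demo, dt)
        ydd = np.gradient(yd, dt)
        x = np.exp(-self.alpha_x * np.arange(len(y_demo)) * dt / self.tau)  # canonical phase
        f_target = self.tau**2 * ydd - self.alpha_z * (self.beta_z * (g - y_demo) - self.tau * yd)
        psi = np.exp(-self.h * (x[:, None] - self.c) ** 2)
        s = x * (g - y0)
        for i in range(len(self.w)):
            self.w[i] = np.sum(s * psi[:, i] * f_target) / (np.sum(s**2 * psi[:, i]) + 1e-10)
        self.y0 = y0

    def rollout(self, g_new, n_steps, dt):
        """Generalize the learned skill to a new goal g_new."""
        y, yd, x = self.y0, 0.0, 1.0
        traj = []
        for _ in range(n_steps):
            psi = np.exp(-self.h * (x - self.c) ** 2)
            f = (psi @ self.w) / (psi.sum() + 1e-10) * x * (g_new - self.y0)
            ydd = (self.alpha_z * (self.beta_z * (g_new - y) - self.tau * yd) + f) / self.tau**2
            yd += ydd * dt
            y += yd * dt
            x += (-self.alpha_x * x / self.tau) * dt
            traj.append(y)
        return np.array(traj)

def admittance_step(x, xd, x_ref, f_ext, k, m=2.0, d=40.0, dt=0.002):
    """One step of a variable-stiffness admittance law M*xdd + D*xd + K*(x - x_ref) = F_ext,
    where k comes from semg_to_stiffness() at each control cycle."""
    xdd = (f_ext - d * xd - k * (x - x_ref)) / m
    xd_new = xd + xdd * dt
    return x + xd_new * dt, xd_new
```

In this reading of the pipeline, the DMP supplies the reference trajectory x_ref, the sEMG channel modulates k online, and the admittance step yields the commanded motion sent to the robot; the actual formulation and parameterization are those of the full article.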
