Kavli Affiliate: Zhuo Li
| First 5 Authors: Junjia Liu, Zhuo Li, Minghao Yu, Zhipeng Dong, Sylvain Calinon
| Summary:
Humanoid robots are envisioned as embodied intelligent agents capable of
performing a wide range of human-level loco-manipulation tasks, particularly in
scenarios requiring strenuous and repetitive labor. However, learning these
skills is challenging due to the many degrees of freedom of humanoid robots,
and collecting sufficient training data for humanoids is a laborious process.
Given the rapid introduction of new humanoid platforms, a cross-embodiment
framework that allows generalizable skill transfer is becoming increasingly
critical. To address this, we propose a transferable framework that reduces the
data bottleneck by using a unified digital human model as a common prototype
and bypassing the need for re-training on every new robot platform. The model
learns behavior primitives from human demonstrations through adversarial
imitation learning, and complex robot structures are decomposed into functional
components, each trained independently and then dynamically coordinated. Task
generalization is achieved through a human-object interaction graph, and skills
are transferred to different robots via embodiment-specific kinematic motion
retargeting and dynamic fine-tuning. Our framework is validated on five
humanoid robots with diverse configurations, demonstrating stable
loco-manipulation and highlighting its effectiveness in reducing data
requirements and increasing the efficiency of skill transfer across platforms.
| Search Query: ArXiv Query: search_query=au:"Zhuo Li"&id_list=&start=0&max_results=3
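
The abstract does not specify how the human-object interaction graph is represented. The following minimal Python sketch shows one plausible encoding, where nodes are body parts or objects and edges are contact relations; the schema and all names here are illustrative assumptions, not details from the paper.

from dataclasses import dataclass, field

@dataclass(frozen=True)
class Node:
    name: str   # e.g. "left_hand" or "box" (illustrative labels)
    kind: str   # "body_part" or "object"

@dataclass(frozen=True)
class Edge:
    src: Node
    dst: Node
    relation: str   # e.g. "grasp", "support", "reach"

@dataclass
class HOIGraph:
    nodes: set = field(default_factory=set)
    edges: list = field(default_factory=list)

    def add_interaction(self, src, dst, relation):
        # Register both endpoints and the interaction between them.
        self.nodes.update((src, dst))
        self.edges.append(Edge(src, dst, relation))

    def relations_of(self, node):
        # All interactions touching a node, e.g. to select which behavior
        # primitive should act on a given object.
        return [e for e in self.edges if node in (e.src, e.dst)]

# Encode a bimanual box-carrying task as a small interaction graph.
left = Node("left_hand", "body_part")
right = Node("right_hand", "body_part")
box = Node("box", "object")
graph = HOIGraph()
graph.add_interaction(left, box, "grasp")
graph.add_interaction(right, box, "grasp")
print([e.relation for e in graph.relations_of(box)])   # ['grasp', 'grasp']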
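
Similarly, the embodiment-specific kinematic motion retargeting step is only named in the abstract. The sketch below illustrates the general idea with a simple per-joint range rescaling from a source (digital human) embodiment to a target robot; the linear mapping and limit clamping are assumptions for illustration, not the paper's method.

import numpy as np

def retarget_trajectory(traj_src, lim_src, lim_tgt):
    """Map joint angles between embodiments by rescaling joint ranges.

    traj_src: (T, J) source joint-angle trajectory
    lim_src, lim_tgt: (J, 2) per-joint [lower, upper] limits
    """
    lo_s, hi_s = lim_src[:, 0], lim_src[:, 1]
    lo_t, hi_t = lim_tgt[:, 0], lim_tgt[:, 1]
    # Normalize each source joint into [0, 1] within its own range ...
    alpha = (traj_src - lo_s) / (hi_s - lo_s)
    # ... then map into the target robot's range and clamp to its limits.
    return np.clip(lo_t + alpha * (hi_t - lo_t), lo_t, hi_t)

# Example: retarget a 2-joint trajectory onto a robot with narrower limits.
traj = np.array([[0.0, 0.5],
                 [0.8, 1.2]])
human_limits = np.array([[-1.0, 1.0], [0.0, 1.5]])
robot_limits = np.array([[-0.5, 0.5], [0.0, 1.0]])
print(retarget_trajectory(traj, human_limits, robot_limits))

In practice, retargeting would also need to account for differing link lengths and end-effector poses (e.g. via inverse kinematics), followed by the dynamic fine-tuning the abstract mentions.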