locomanip-duet.github.io - RoboDuet: A Framework Affording Mobile-Manipulation and Cross-Embodiment

Description: RoboDuet: A Framework Affording Mobile-Manipulation and Cross-Embodiment

mobile manipulation (8) legged robot (2) whole body control (2) 6-dof tracking (1)

Example domain paragraphs

Abstract. Combining the mobility of legged robots with the manipulation skills of arms has the potential to significantly expand the operational range and enhance the capabilities of robotic systems in performing various mobile manipulation tasks. Existing approaches are confined to imprecise 6-DoF manipulation and possess a limited arm workspace. In this paper, we propose a novel framework, RoboDuet, which employs two collaborative policies to realize locomotion and manipulation.
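To make the "two collaborative policies" concrete, below is a minimal sketch of how a locomotion policy and an arm policy might exchange information at control time. The observation sizes, network widths, and the latent passed from the arm to the legs are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of the two-policy ("duet") structure described above.
# Dimensions and the exchanged latent are assumptions, not the paper's design.
import torch
import torch.nn as nn

class LocoPolicy(nn.Module):
    """Drives the legs; also conditions on a latent produced by the arm policy."""
    def __init__(self, obs_dim=48, arm_latent_dim=8, act_dim=12):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim + arm_latent_dim, 256), nn.ELU(),
            nn.Linear(256, 128), nn.ELU(),
            nn.Linear(128, act_dim),
        )

    def forward(self, proprio, arm_latent):
        return self.net(torch.cat([proprio, arm_latent], dim=-1))

class ArmPolicy(nn.Module):
    """Tracks a 6-DoF end-effector target; outputs arm joint targets plus a
    latent that informs the locomotion policy."""
    def __init__(self, obs_dim=48, target_dim=6, act_dim=6, latent_dim=8):
        super().__init__()
        self.trunk = nn.Sequential(
            nn.Linear(obs_dim + target_dim, 256), nn.ELU(),
            nn.Linear(256, 128), nn.ELU(),
        )
        self.act_head = nn.Linear(128, act_dim)
        self.latent_head = nn.Linear(128, latent_dim)

    def forward(self, proprio, ee_target):
        h = self.trunk(torch.cat([proprio, ee_target], dim=-1))
        return self.act_head(h), self.latent_head(h)

# One cooperative control step (batch of 1, zero observations as placeholders).
loco, arm = LocoPolicy(), ArmPolicy()
proprio = torch.zeros(1, 48)      # robot state observation (placeholder)
ee_target = torch.zeros(1, 6)     # commanded 6-DoF end-effector pose (placeholder)
arm_action, arm_latent = arm(proprio, ee_target)
leg_action = loco(proprio, arm_latent)
```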

Two-stage training. To achieve both robust locomotion and flexible manipulation, we adopt a two-stage training strategy. Stage 1 focuses on obtaining robust locomotion capability, with a design inspired by powerful blind locomotion algorithms. Stage 2 coordinates locomotion and manipulation to achieve whole-body, large-range mobile manipulation; at this stage the arm policy is activated and all robotic arm joints are controlled simultaneously.
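The following is a minimal sketch of how such a two-stage schedule could be organized, assuming the policy interfaces from the sketch above. The iteration counts, the environment API (`env.reset`, `env.step`), and the `update` routine are placeholders, not the paper's training setup.

```python
# Hypothetical two-stage schedule: stage 1 trains locomotion alone; stage 2
# activates the arm policy and all arm joints so both are optimized together.
import torch

STAGE1_ITERS = 5_000   # assumed iteration budget; locomotion only
STAGE2_ITERS = 10_000  # assumed iteration budget; whole-body coordination

def train(env, loco_policy, arm_policy, update, arm_latent_dim=8):
    for it in range(STAGE1_ITERS + STAGE2_ITERS):
        stage2 = it >= STAGE1_ITERS
        obs, ee_target = env.reset(arm_enabled=stage2)  # arm frozen in stage 1
        done = False
        while not done:
            if stage2:
                # Stage 2: arm policy runs and all arm joints are driven.
                arm_action, arm_latent = arm_policy(obs, ee_target)
            else:
                # Stage 1: no arm commands; feed a neutral latent to the legs.
                arm_action = None
                arm_latent = torch.zeros(obs.shape[0], arm_latent_dim)
            leg_action = loco_policy(obs, arm_latent)
            obs, reward, done, ee_target = env.step(leg_action, arm_action)
        update(loco_policy)        # locomotion is refined in both stages
        if stage2:
            update(arm_policy)     # arm policy is optimized only in stage 2
```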

To validate the significance of the two-stage training and the cooperative policy, which are key components of RoboDuet, we establish a Baseline algorithm that trains a unified policy in a single stage. The Two-Stage algorithm modifies this baseline by switching from one-stage to two-stage training, while the Cooperated algorithm builds on the baseline by replacing the unified policy with a cooperative policy. RoboDuet itself incorporates both two-stage training and the cooperative policy. Training details for these variants are provided in the paper.
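One compact way to express the four compared variants, assuming each is fully characterized by two switches (whether training is split into two stages and whether the unified policy is replaced by the cooperative one); the flag names are illustrative, not from the paper's code.

```python
# Ablation variants described above, encoded as two boolean switches each.
ABLATIONS = {
    "Baseline":   dict(two_stage=False, cooperative_policy=False),
    "Two-Stage":  dict(two_stage=True,  cooperative_policy=False),
    "Cooperated": dict(two_stage=False, cooperative_policy=True),
    "RoboDuet":   dict(two_stage=True,  cooperative_policy=True),
}

for name, cfg in ABLATIONS.items():
    print(f"{name:10s} two_stage={cfg['two_stage']} "
          f"cooperative_policy={cfg['cooperative_policy']}")
```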
