SuperPADL: Scaling Language-Directed Physics-Based Control with Progressive Supervised Distillation

ACM SIGGRAPH 2024

Jordan Juravsky (1, 2)    Yunrong Guo (2)    Sanja Fidler (2, 3)    Xue Bin Peng (2, 4)

(1) Stanford    (2) NVIDIA    (3) University of Toronto    (4) Simon Fraser University



Abstract

Physically-simulated models for human motion can generate high-quality, responsive character animations, often in real time. Natural language serves as a flexible interface for controlling these models, allowing expert and non-expert users to quickly create and edit their animations. Many recent physics-based animation methods, including those that use text interfaces, train control policies using reinforcement learning (RL). However, scaling these methods beyond several hundred motions has remained challenging. Meanwhile, kinematic animation models are able to successfully learn from thousands of diverse motions by leveraging supervised learning methods. Inspired by these successes, in this work we introduce SuperPADL, a scalable framework for physics-based text-to-motion that leverages both RL and supervised learning to train controllers on thousands of diverse motion clips. SuperPADL is trained in stages using progressive distillation, starting by training a large number of specialized experts with RL. These experts are then iteratively distilled into larger, more robust policies using a combination of reinforcement learning and supervised learning. Our final SuperPADL controller is trained on a dataset containing over 5000 skills and runs in real time on a consumer GPU. Moreover, our policy can naturally transition between skills, allowing users to interactively craft multi-stage animations. We experimentally demonstrate that SuperPADL significantly outperforms RL-based baselines at this large data scale.
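To illustrate the supervised component of progressive distillation described above, the sketch below shows a single distillation update in PyTorch: a student policy is regressed onto the actions of a frozen expert (teacher) for the same simulated states and text prompts. This is a hypothetical, simplified sketch, not the authors' implementation; the `Policy` class, network sizes, and the use of a plain MSE imitation loss are assumptions for illustration.

```python
# Hypothetical sketch (not the authors' code): one supervised distillation
# step used alongside RL in progressive distillation. A student policy is
# trained to reproduce a frozen teacher's actions on the same (state, text)
# inputs. All interfaces and hyperparameters here are illustrative assumptions.
import torch
import torch.nn as nn

class Policy(nn.Module):
    """Minimal text-conditioned control policy: (state, text embedding) -> action."""
    def __init__(self, state_dim, text_dim, action_dim, hidden=512):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim + text_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, action_dim),
        )

    def forward(self, state, text_emb):
        return self.net(torch.cat([state, text_emb], dim=-1))

def distillation_step(student, teacher, states, text_embs, optimizer):
    """Regress the student's actions onto the frozen teacher's actions
    for a batch of simulated states and text prompts."""
    with torch.no_grad():
        target_actions = teacher(states, text_embs)
    pred_actions = student(states, text_embs)
    loss = nn.functional.mse_loss(pred_actions, target_actions)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In the staged setup the paper describes, an update like this would be combined with RL objectives and repeated across successive, larger student policies; the details of that combination are specific to the paper and not reproduced here.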

Paper: [PDF]       Webpage: [Link]       Preprint: [arXiv]

Video



Bibtex

@inproceedings{juravsky2024superpadl,
    title = {SuperPADL: Scaling Language-Directed Physics-Based Control with Progressive Supervised Distillation},
    author = {Jordan Juravsky and Yunrong Guo and Sanja Fidler and Xue Bin Peng},
    booktitle = {SIGGRAPH 2024 Conference Papers (SIGGRAPH '24 Conference Papers)},
    year = {2024}
}