Real-Time Style Modelling of Human Locomotion via Feature-Wise Transformations and Local Motion Phases


Abstract

Controlling the manner in which a character moves in a real-time animation system is a challenging task with useful applications. Existing style transfer systems require access to a reference content motion clip; in real-time systems, however, the future motion content is unknown and liable to change with user input. In this work we present a style modelling system that uses an animation synthesis network to model motion content based on local motion phases. An additional style modulation network uses feature-wise transformations to modulate style in real time. To evaluate our method, we create and release a new style modelling dataset, 100STYLE, containing over 4 million frames of stylised locomotion data in 100 different styles that present a number of challenges for existing systems. To model these styles, we extend the local phase calculation with a contact-free formulation. Compared to other methods for real-time style modelling, our system is more robust and efficient in its style representation while also improving motion quality.
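The feature-wise transformations mentioned in the abstract are a form of feature-wise linear modulation (FiLM): each channel of the synthesis network's activations is scaled and shifted by parameters predicted from the target style. The sketch below is a minimal, hypothetical illustration of that mechanism only; the function name, shapes, and values are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def film_modulate(features, gamma, beta):
    """Feature-wise linear modulation: scale and shift each channel.

    features: (batch, channels) activations from a content/synthesis network.
    gamma, beta: (batch, channels) per-channel scale and shift, which a
    separate style network would predict per style (hypothetical shapes).
    """
    return gamma * features + beta

# Toy example: one frame of 4-channel features modulated by one style.
features = np.array([[1.0, -2.0, 0.5, 3.0]])
gamma = np.array([[2.0, 1.0, 0.0, 1.0]])  # per-channel scale
beta = np.array([[0.0, 0.5, 1.0, -1.0]])  # per-channel shift
print(film_modulate(features, gamma, beta))  # [[ 2.  -1.5  1.   2. ]]
```

Because the style only enters through these per-channel scales and shifts, swapping styles at run time amounts to swapping a small set of modulation parameters, without re-running or retraining the content network.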

Citation (APA)

Mason, I., Starke, S., & Komura, T. (2022). Real-Time Style Modelling of Human Locomotion via Feature-Wise Transformations and Local Motion Phases. Proceedings of the ACM on Computer Graphics and Interactive Techniques, 5(1). https://doi.org/10.1145/3522618
