Among the many ways humans interact, such as eye contact and gestures, physical contact is an essential cue for understanding human behavior. Motivated by this observation, we introduce a new task: generating a 3D human that physically interacts with a given 3D partner human under a desired interaction label. Unlike previous works that model interactions with static objects or scenes, the partner human can take diverse poses, and the contact regions vary with the type of interaction. To handle this challenge, we propose ContactGen, a guided diffusion framework that generates interactive 3D humans for a given partner human. Specifically, we present a contact prediction module that adaptively estimates potential contact regions between the two humans according to the interaction label. Using these estimated contact regions as complementary guidance, ContactGen steers the diffusion sampling process toward 3D humans that plausibly interact with the partner. We evaluate ContactGen on the CHI3D dataset, where it generates more physically plausible and diverse poses than competing methods. Source code is available at https://dongjunku.github.io/contactgen.
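To make the guidance mechanism concrete, the following is a minimal sketch of contact-guided diffusion sampling in the spirit of the abstract, not the authors' implementation. The callables denoiser and contact_guidance, the pose dimensionality, and the linear beta schedule are all assumptions introduced here for illustration.

    import torch

    # Sketch only: classifier-guidance-style sampling where a contact penalty
    # nudges the reverse diffusion toward poses that touch the partner in the
    # predicted contact regions. `denoiser` and `contact_guidance` are
    # hypothetical stand-ins for the modules described in the abstract.
    def guided_sample(denoiser, contact_guidance, partner, label,
                      steps=100, dim=69, scale=1.0):
        betas = torch.linspace(1e-4, 0.02, steps)
        alphas = 1.0 - betas
        alpha_bars = torch.cumprod(alphas, dim=0)

        x = torch.randn(1, dim)                      # start from Gaussian noise
        for t in reversed(range(steps)):
            x = x.detach().requires_grad_(True)
            eps = denoiser(x, t, partner, label)     # noise prediction, conditioned on partner + label
            # Contact guidance: a differentiable penalty that is low when the
            # sampled human meets the estimated potential contact regions.
            loss = contact_guidance(x, partner, label)
            grad = torch.autograd.grad(loss, x)[0]
            eps = eps + scale * grad                 # shift the noise estimate by the guidance gradient

            # Standard DDPM reverse update using the corrected noise estimate.
            mean = (x - betas[t] / torch.sqrt(1 - alpha_bars[t]) * eps) / torch.sqrt(alphas[t])
            noise = torch.randn_like(x) if t > 0 else torch.zeros_like(x)
            x = mean + torch.sqrt(betas[t]) * noise
        return x.detach()

In this sketch the guidance scale controls how strongly the estimated contact regions influence sampling; the actual conditioning and update rule in ContactGen may differ.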
Gu, D., Shim, J., Jang, J., Kang, C., & Joo, K. (2024). ContactGen: Contact-Guided Interactive 3D Human Generation for Partners. In Proceedings of the AAAI Conference on Artificial Intelligence (Vol. 38, pp. 1923–1931). Association for the Advancement of Artificial Intelligence. https://doi.org/10.1609/aaai.v38i3.27962