Neurally-Guided Texturing for Garment Line Drawings

Abstract

Adding texture to a line drawing is an important process in the production of comics and illustrations. Garment drawings in particular often exhibit large deformations with self-occlusions, so deforming texture patterns is essential for representing realistic garment designs. However, this is currently done manually and requires significant effort by experts. A possible approach is to infer 3D surface geometry and then apply texture to the 3D surfaces, but deep creases are difficult to represent this way. In this paper, we introduce a "neurally-guided" optimization system for automatically deforming and directly mapping 2D texture patterns onto 2D line drawings, bypassing 3D geometry. First, we build a deep neural network that estimates local transformation matrices of texture patterns, called neural guidance, from line drawings. Second, we build a 2D triangle mesh for the garment and deform the mesh to obtain texture coordinates by integrating the local transformations. Our algorithm is effective and easy to integrate into existing drawing systems. We provide several examples to demonstrate the advantages of our proposed system over previous methods and to illustrate its versatility.
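The second step described above, integrating per-triangle local transformations into per-vertex texture coordinates, can be sketched as a linear least-squares problem: for each mesh edge, the texture-coordinate difference should match the edge vector mapped by that triangle's estimated 2x2 transform. This is a minimal illustrative sketch, not the paper's actual implementation; the function name, data layout, and the choice of anchoring one vertex at the origin are all assumptions.

```python
import numpy as np

def integrate_texture_coords(verts, tris, tri_transforms, anchor=0):
    """Hypothetical sketch: recover per-vertex UVs from per-triangle
    2x2 transforms (the 'neural guidance') by least squares.

    verts          : (n, 2) drawing-space vertex positions
    tris           : (m, 3) triangle vertex indices
    tri_transforms : (m, 2, 2) estimated local texture transforms
    """
    n = len(verts)
    rows_A, rows_b = [], []
    for f, (i, j, k) in enumerate(tris):
        T = tri_transforms[f]
        # One constraint per edge: uv[b] - uv[a] ~ T @ (p[b] - p[a])
        for a, b in ((i, j), (j, k), (k, i)):
            r = np.zeros(n)
            r[b], r[a] = 1.0, -1.0
            rows_A.append(r)
            rows_b.append(T @ (verts[b] - verts[a]))
    # Pin one vertex at the origin to remove the translation ambiguity.
    r = np.zeros(n)
    r[anchor] = 1.0
    rows_A.append(r)
    rows_b.append(np.zeros(2))
    A = np.array(rows_A)          # (rows, n)
    B = np.array(rows_b)          # (rows, 2) -- u and v solved jointly
    uv, *_ = np.linalg.lstsq(A, B, rcond=None)
    return uv                     # (n, 2) texture coordinates
```

With identity transforms this reproduces the input geometry (up to the anchored translation), while spatially varying transforms yield a smoothly deformed texture parameterization, which is the behavior the abstract attributes to its integration step.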

Citation (APA)

Hashimoto, M., Fukusato, T., & Igarashi, T. (2020). Neurally-Guided Texturing for Garment Line Drawings. In SIGGRAPH Asia 2020 Technical Communications, SA 2020. Association for Computing Machinery, Inc. https://doi.org/10.1145/3410700.3425428
