Normal-guided Garment UV Prediction for Human Re-texturing

Kavli Affiliate: Yi Zhou

| Summary:

Clothes undergo complex geometric deformations, which lead to appearance
changes. To edit human videos in a physically plausible way, a texture map must
take into account not only the garment transformation induced by body movements
and clothing fit, but also the garment's fine-grained 3D surface geometry. This,
however, poses the new challenge of reconstructing dynamic clothing in 3D from
an image or a video. In this paper, we show that it is possible to edit dressed
human images and videos without 3D reconstruction. We estimate a geometry-aware
texture map between the garment region in an image and the texture space,
a.k.a. a UV map. Our UV map is designed to preserve isometry with respect to
the underlying 3D surface by making use of the 3D surface normals predicted
from the image. Our approach captures the underlying geometry of the garment in
a self-supervised way, requires no ground-truth UV-map annotations, and can be
readily extended to predict temporally coherent UV maps. We demonstrate that
our method outperforms state-of-the-art human UV map estimation approaches on
both real and synthetic data.
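As a rough illustration of how predicted normals can encourage isometry (a
sketch under an orthographic-camera assumption; the neighborhood N(p), the lift
\ell_n, and the loss L_{iso} below are illustrative notation, not taken from the
paper): for a pixel p with predicted UV coordinate u(p) and unit normal
n(p) = (n_x, n_y, n_z), an image-plane step q - p lifts to a tangent-plane step
of length

\[
\ell_n(p, q) \;=\; \sqrt{\;\lVert q - p \rVert^2 \;+\; \left(\frac{(q - p)\cdot(n_x,\, n_y)}{n_z}\right)^{2}\;},
\]

and an isometry-preserving UV map can then be encouraged by penalizing the
mismatch between UV-space and lifted surface distances over neighboring pixels:

\[
L_{\mathrm{iso}} \;=\; \sum_{p}\,\sum_{q \in N(p)} \big(\lVert u(q) - u(p) \rVert \;-\; \ell_n(p, q)\big)^{2}.
\]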
