TMO: Textured Mesh Acquisition of Objects with a Mobile Device by using Differentiable Rendering

Jaehoon Choi1,2, Dongki Jung1, Taejae Lee1, Sangwook Kim1, Youngdong Jung1, Dinesh Manocha2, Donghwan Lee1
1NAVER LABS 2University of Maryland

CVPR 2023


TMO reconstructs a high-quality geometric mesh with a visually realistic texture.

Abstract

We present a new pipeline for acquiring a textured mesh in the wild with a single smartphone that offers access to images, depth maps, and valid poses. Our method first introduces RGBD-aided structure from motion, which yields filtered depth maps and refines camera poses guided by the corresponding depth. Then, we adopt neural implicit surface reconstruction, which allows for high-quality meshes, and develop a new training process that applies regularization provided by classical multi-view stereo methods. Moreover, we apply differentiable rendering to fine-tune incomplete texture maps and generate textures that are perceptually closer to the original scene. Our pipeline can be applied to any common object in the real world without the need for either in-the-lab environments or accurate mask images. We demonstrate results on captured objects with complex shapes and validate our method numerically against existing 3D reconstruction and texture mapping methods.
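Pipeline Sketches

The sketches below illustrate the three stages of the abstract under stated assumptions. They are minimal illustrations of the general techniques, not the authors' implementation.

A common way to filter raw smartphone depth, in the spirit of the RGBD-aided structure-from-motion stage above, is a multi-view consistency check: back-project each depth pixel, reproject it into a neighboring view, and discard it if the two depth observations disagree. The function below is a NumPy sketch; the names (filter_depth_by_consistency, T_ji) and the 2% relative threshold are illustrative assumptions, not the paper's exact procedure.

import numpy as np

def filter_depth_by_consistency(depth_i, depth_j, K, T_ji, thresh=0.02):
    """Keep only depths in view i that reproject consistently into view j.

    depth_i, depth_j: (H, W) depth maps; K: (3, 3) intrinsics;
    T_ji: (4, 4) rigid transform from view-i to view-j camera coordinates.
    (Hypothetical sketch; names and threshold are assumptions.)
    """
    H, W = depth_i.shape
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3)
    # Back-project view-i pixels to 3D camera coordinates.
    pts_i = (np.linalg.inv(K) @ pix.T) * depth_i.reshape(1, -1)
    # Transform into view j and project onto its image plane.
    pts_j = T_ji[:3, :3] @ pts_i + T_ji[:3, 3:4]
    proj = K @ pts_j
    z_j = proj[2]
    uj = np.round(proj[0] / np.clip(z_j, 1e-6, None)).astype(int)
    vj = np.round(proj[1] / np.clip(z_j, 1e-6, None)).astype(int)
    valid = (uj >= 0) & (uj < W) & (vj >= 0) & (vj < H) & (z_j > 0)
    # Compare the reprojected depth with view j's observed depth.
    observed = np.zeros_like(z_j)
    observed[valid] = depth_j[vj[valid], uj[valid]]
    rel_err = np.abs(z_j - observed) / np.clip(observed, 1e-6, None)
    keep = valid & (rel_err < thresh)
    filtered = depth_i.copy().reshape(-1)
    filtered[~keep] = 0.0  # mark inconsistent depths invalid
    return filtered.reshape(H, W)

The surface reconstruction stage trains a neural implicit surface with extra regularization from classical multi-view stereo. A plausible form of such a loss, sketched in PyTorch with hypothetical tensor names and placeholder weights, combines a photometric term, an MVS depth term applied only where MVS is confident, and the standard eikonal term.

import torch

def reconstruction_loss(rendered_rgb, gt_rgb,
                        rendered_depth, mvs_depth, mvs_valid,
                        sdf_grad, w_depth=0.5, w_eik=0.1):
    """Sketch of an SDF training loss with MVS regularization.
    mvs_valid is a float mask of confident MVS pixels. (Assumed form.)"""
    # Photometric term from volume-rendered colors.
    color = (rendered_rgb - gt_rgb).abs().mean()
    # MVS regularization: supervise rendered depth only where the
    # classical multi-view stereo estimate is marked confident.
    n = mvs_valid.sum().clamp(min=1)
    depth = ((rendered_depth - mvs_depth).abs() * mvs_valid).sum() / n
    # Eikonal term keeps the SDF gradient close to unit norm.
    eikonal = ((sdf_grad.norm(dim=-1) - 1.0) ** 2).mean()
    return color + w_depth * depth + w_eik * eikonal

Finally, texture fine-tuning with differentiable rendering can be sketched as optimizing a learnable UV texture map against the captured photos. render_textured_mesh is a hypothetical differentiable renderer (a library such as PyTorch3D or nvdiffrast would supply one in practice); the loop below is an assumed minimal setup, not the paper's training schedule.

import torch

def fine_tune_texture(mesh, views, render_textured_mesh,
                      texture_res=1024, iters=500, lr=1e-2):
    """Sketch of texture fine-tuning via differentiable rendering.

    mesh:  fixed geometry with UV coordinates from the previous stage
    views: list of (image, camera) pairs captured by the phone
    render_textured_mesh(mesh, texture, camera) -> (H, W, 3) image
        (hypothetical differentiable renderer)
    """
    # Learnable UV texture map, initialized to mid-gray.
    texture = torch.full((texture_res, texture_res, 3), 0.5,
                         requires_grad=True)
    optimizer = torch.optim.Adam([texture], lr=lr)
    for _ in range(iters):
        optimizer.zero_grad()
        loss = 0.0
        for image, camera in views:
            rendered = render_textured_mesh(mesh, texture, camera)
            # L1 photometric loss; gradients flow through the
            # renderer into the texture map.
            loss = loss + (rendered - image).abs().mean()
        loss.backward()
        optimizer.step()
        with torch.no_grad():
            texture.clamp_(0.0, 1.0)  # keep texels in a valid range
    return texture.detach()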

Video: AR-capture

Video: ARKit-video


BibTeX

@inproceedings{choi2023tmo,
    title = {TMO: Textured Mesh Acquisition of Objects with a Mobile Device by using Differentiable Rendering},
    author = {Jaehoon Choi and Dongki Jung and Taejae Lee and Sangwook Kim and Youngdong Jung and Dinesh Manocha and Donghwan Lee},
    booktitle = {CVPR},
    year = {2023}
}