High-Fidelity Texture Transfer Using
Multi-Scale Depth-Aware Diffusion
Rongzhen Lin,
Zichong Chen,
Xiaoyong Hao,
Yang Zhou*,
Hui Huang
Visual Computing Research Center, CSSE, Shenzhen University
Eurographics Symposium on Rendering 2025
*Corresponding Author
[Paper (coming soon)]
[Code (coming soon)]
Our method can successfully transfer the texture from a reference shape to other shapes with significantly different topologies (left). It can also texture the same shape with different source textures taken from other 3D meshes (right).

Abstract

Textures are a key component of 3D assets. Transferring textures from one shape to another, without user interaction or additional semantic guidance, is a classical yet challenging problem. Solving it can enhance the diversity of existing shape collections and broaden their range of applications. This paper proposes a 3D texture transfer framework that leverages the generative power of pre-trained diffusion models. While diffusion models have achieved remarkable success in 2D image generation, applying them to the 3D domain poses a major challenge: preserving coherence across different viewpoints. To address this issue, we design a multi-scale generation framework that optimizes UV maps in a coarse-to-fine manner. To ensure multi-view consistency, we use depth information as geometric guidance; in addition, we propose a novel consistency loss that further constrains color coherence and reduces artifacts. Experimental results demonstrate that our multi-scale framework not only produces high-quality texture transfer results but also handles complex shapes well while preserving correct semantic correspondences. Compared to existing techniques, our method improves multi-view consistency, texture clarity, and time efficiency.
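Since the paper and code are not yet released, the sketch below is only a speculative illustration of what a coarse-to-fine UV optimization with a cross-view color-consistency loss of this kind might look like in PyTorch. All helper names, tensor shapes, and hyperparameters (consistency_loss, optimize_uv, view_colors, scales, etc.) are hypothetical placeholders, not the authors' implementation; in the actual method, the per-view colors would come from a depth-conditioned diffusion model, which this sketch simply treats as given inputs.

# Speculative sketch (NOT the paper's code): coarse-to-fine UV texture
# optimization driven by a cross-view color-consistency objective.
import torch
import torch.nn.functional as F

def consistency_loss(view_colors, view_to_uv, uv_tex, masks):
    """Penalize disagreement between each view's predicted colors and the
    shared UV texture, restricted to visible pixels.

    view_colors: (V, 3, H, W)  per-view colors from a depth-guided diffusion model
    view_to_uv:  (V, H, W, 2)  UV coordinate of each view pixel, in [-1, 1]
    uv_tex:      (1, 3, T, T)  current UV texture estimate (being optimized)
    masks:       (V, 1, H, W)  per-view visibility masks
    """
    loss = 0.0
    for v in range(view_colors.shape[0]):
        # Sample the shared texture at each pixel's UV location.
        sampled = F.grid_sample(uv_tex, view_to_uv[v:v+1], align_corners=False)
        loss = loss + (masks[v:v+1] * (sampled - view_colors[v:v+1]) ** 2).mean()
    return loss / view_colors.shape[0]

def optimize_uv(view_colors, view_to_uv, masks,
                scales=(64, 128, 256, 512), steps=200, lr=1e-2):
    """Coarse-to-fine schedule: solve a low-resolution texture first, then
    upsample the result to initialize optimization at the next scale."""
    uv_tex = torch.zeros(1, 3, scales[0], scales[0], requires_grad=True)
    for size in scales:
        if uv_tex.shape[-1] != size:
            uv_tex = F.interpolate(uv_tex.detach(), size=(size, size),
                                   mode='bilinear', align_corners=False)
            uv_tex.requires_grad_(True)
        opt = torch.optim.Adam([uv_tex], lr=lr)
        for _ in range(steps):
            opt.zero_grad()
            loss = consistency_loss(view_colors, view_to_uv, uv_tex, masks)
            loss.backward()
            opt.step()
    return uv_tex.detach()

The coarse-to-fine schedule mirrors the multi-scale idea described in the abstract: each resolution inherits the previous scale's texture as initialization, so the finer scales only need to add detail rather than resolve global color agreement from scratch.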


Method


A gallery of results produced by our method



Qualitative comparison with state-of-the-art diffusion-based methods

Qualitative comparison with state-of-the-art diffusion-based texture generation methods. Our method not only produces high-fidelity textures but also preserves correct semantic correspondences with the reference. In contrast, the other methods either lose fine details or fail to remain coherent with the reference texture.


Paper and Supplementary Material

Rongzhen Lin, Zichong Chen, Xiaoyong Hao, Yang Zhou, Hui Huang.
High-Fidelity Texture Transfer Using Multi-Scale Depth-Aware Diffusion.
Eurographics Symposium on Rendering 2025.



[BibTeX]


Acknowledgements

This work was supported in part by the National Key R&D Program of China (2024YFB3908500, 2024YFB3908502, 2024YFB3908505), the DEGP Innovation Team (2022KCXTD025), the SZU Teaching Reform Key Program (JG2024018), and Scientific Development Funds from Shenzhen University.
This template was originally made by Phillip Isola and Richard Zhang for a colorful ECCV project; the code can be found here.