Mural inpainting via two-stage generative adversarial network

Abstract

The digital restoration of Dunhuang murals is of great value for the research and dissemination of mural culture and art. However, many murals have been damaged to varying degrees by factors such as climate change, sand erosion, and human destruction, and most existing digital mural restoration methods suffer from insufficient feature extraction and loss of reconstructed detail. In this paper, we propose a two-stage, coarse-to-fine digital mural restoration framework: the first stage performs coarse-grained semantic reconstruction, and the second stage performs fine-grained feature reconstruction. To improve reconstruction performance, we also design a new building block that integrates the Swin Transformer module (SwinT) with a multi-scale dilated convolution attention module. In addition, we propose a loss function that further strengthens the model's ability to repair damaged murals. Extensive comparative experiments show that the proposed model effectively restores the missing content of murals and outperforms the compared methods in both quantitative and qualitative evaluation.
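To make the building block concrete, the following is a minimal sketch of a multi-scale dilated convolution attention module, assuming PyTorch. The class name `MSDCA`, the dilation rates, and the spatial-gate design are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of a multi-scale dilated convolution attention block.
# Names (MSDCA), dilation rates, and the gating scheme are assumptions for
# illustration only; the paper's module may differ.
import torch
import torch.nn as nn

class MSDCA(nn.Module):
    def __init__(self, channels, dilations=(1, 2, 4, 8)):
        super().__init__()
        # Parallel 3x3 dilated branches capture context at several
        # receptive-field sizes while preserving spatial resolution.
        self.branches = nn.ModuleList(
            nn.Conv2d(channels, channels, kernel_size=3, padding=d, dilation=d)
            for d in dilations
        )
        # 1x1 convolution fuses the concatenated branches back to `channels`.
        self.fuse = nn.Conv2d(channels * len(dilations), channels, kernel_size=1)
        # A simple spatial attention gate weights the fused features.
        self.gate = nn.Sequential(nn.Conv2d(channels, 1, kernel_size=1), nn.Sigmoid())

    def forward(self, x):
        multi = torch.cat([branch(x) for branch in self.branches], dim=1)
        fused = self.fuse(multi)
        # Residual connection keeps the block easy to train.
        return x + fused * self.gate(fused)

# Shape check: the block preserves both channel and spatial dimensions.
y = MSDCA(16)(torch.randn(1, 16, 32, 32))
print(tuple(y.shape))
```

Because each dilated branch uses `padding` equal to its dilation rate with a 3x3 kernel, every branch preserves the input's spatial size, so such a block can be dropped into either stage of a coarse-to-fine generator without reshaping.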
