Revitalizing Art with Technology: A Deep Learning Approach to Virtual Restoration
DOI:
https://doi.org/10.14421/jiska.2025.10.1.87-99

Keywords:
Art Restoration, CycleGAN, Deep Learning

Abstract
This study evaluates CycleGAN's performance in virtual painting restoration, focusing on color recovery and the reproduction of fine detail. We compiled datasets categorized by art style and condition so that accurate restorations could be learned without altering the original reference materials. To build effective training data for CycleGAN, we synthetically degraded a variety of paintings, for example by applying a yellowing filter. The model was trained with cycle consistency loss and advanced data augmentation techniques. Results were assessed with PSNR, SSIM, and the Color Inspector tool, focusing on Claude Monet's Nasturtiums in a Blue Vase and Hermann Corrodi's Prayers at Dawn. Quantitative and qualitative evaluations show stronger color recovery and better preservation of intricate detail than competing methods. Key contributions include applying CycleGAN to art restoration, evaluating the model, and developing a restoration framework. Practical implications extend to art conservation, digital library enhancement, art education, and broader access to restored works. Future research may explore dataset expansion, more complex architectures, interdisciplinary collaboration, automated evaluation tools, and improved technology for real-time restoration applications. In conclusion, CycleGAN holds promise for digital art conservation, with ongoing efforts aimed at integrating it across fields for effective cultural preservation.
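As a concrete illustration of the degradation step described in the abstract, the sketch below applies a simple yellowing filter to a painting image. The paper does not specify the exact filter used, so the tint color, blend strength, and file names here are assumptions for illustration only.

```python
# Hypothetical yellowing degradation to simulate aged varnish.
# The tint color and strength below are illustrative assumptions,
# not the authors' actual filter parameters.
import numpy as np
from PIL import Image

def apply_yellow_filter(img: Image.Image, strength: float = 0.3) -> Image.Image:
    """Blend the image toward a warm yellow cast to mimic aging."""
    arr = np.asarray(img.convert("RGB"), dtype=np.float32) / 255.0
    yellow = np.array([1.0, 0.9, 0.4], dtype=np.float32)  # assumed tint color
    aged = (1.0 - strength) * arr + strength * arr * yellow
    return Image.fromarray((np.clip(aged, 0.0, 1.0) * 255).astype(np.uint8))

# Placeholder file names; any RGB painting scan would do.
degraded = apply_yellow_filter(Image.open("painting.jpg"), strength=0.35)
degraded.save("painting_aged.jpg")
```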
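The cycle consistency loss mentioned in the abstract is the standard CycleGAN objective term: each generator's output, when mapped back by the opposite generator, should reconstruct the original input. A minimal PyTorch sketch follows, assuming generators G (degraded to restored) and F (restored to degraded); the weight lam = 10.0 is the common CycleGAN default and an assumption about this study's configuration.

```python
# Minimal sketch of the cycle-consistency term in the CycleGAN objective.
import torch
import torch.nn as nn

l1 = nn.L1Loss()

def cycle_consistency_loss(G: nn.Module, F: nn.Module,
                           degraded: torch.Tensor, restored: torch.Tensor,
                           lam: float = 10.0) -> torch.Tensor:
    """lam * (||F(G(x)) - x||_1 + ||G(F(y)) - y||_1)."""
    forward_cycle = l1(F(G(degraded)), degraded)    # x -> G(x) -> F(G(x)) ~ x
    backward_cycle = l1(G(F(restored)), restored)   # y -> F(y) -> G(F(y)) ~ y
    return lam * (forward_cycle + backward_cycle)
```

In full CycleGAN training this term is added to the two adversarial losses; it is what lets the model learn from unpaired degraded/restored examples.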
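PSNR and SSIM, the two quantitative metrics named in the abstract, can be computed with scikit-image as sketched below. The file names are placeholders, and this is not the authors' evaluation script, only a sketch of the standard computation.

```python
# Sketch of the quantitative evaluation: PSNR and SSIM between a
# reference image and a restored output, via scikit-image.
import numpy as np
from PIL import Image
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

reference = np.asarray(Image.open("reference.png").convert("RGB"))
restored = np.asarray(Image.open("restored.png").convert("RGB"))

psnr = peak_signal_noise_ratio(reference, restored, data_range=255)
ssim = structural_similarity(reference, restored, channel_axis=-1, data_range=255)
print(f"PSNR: {psnr:.2f} dB, SSIM: {ssim:.4f}")
```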
License
Copyright (c) 2025 Nurrohmah Endah Putranti, Shyang-Jye Chang, Muhammad Raffiudin

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
Authors who publish with this journal agree to the following terms, in accordance with http://creativecommons.org/licenses/by-nc/4.0
a. Authors retain copyright and grant the journal right of first publication, with the work simultaneously licensed under a Creative Commons Attribution-NonCommercial 4.0 International License that allows others to share the work with an acknowledgement of its authorship and initial publication in this journal.
b. Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.
c. Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work.