Sensors and Materials, Vol. 33, No. 4 (2021), pp. 1343-1352
S&M 2538, Research Paper of Special Issue
https://doi.org/10.18494/SAM.2021.3172
Published: April 14, 2021

Rapid Local Image Style Transfer Method Based on Residual Convolutional Neural Network

Liming Huang, Ping Wang, Cheng-Fu Yang, and Hsien-Wei Tseng
(Received October 21, 2020; Accepted February 2, 2021)

Keywords: image style transfer, residual neural network, semantic segmentation, DeepLab2, convolutional neural network
Image style transfer technology can learn the style of a target image in a fully or semi-automated way, a task that is often very difficult to accomplish manually, thereby saving considerable time and improving production efficiency. With the rapid spread of commercial applications such as beauty selfie apps and short-video platforms such as TikTok, local image style transfer and the speed at which stylized images are generated are becoming increasingly important, since these are features that users of such recreational products particularly value. We propose an algorithm that combines semantic segmentation with residual networks and uses VGG16 for feature extraction to improve the efficiency and generation speed of local image style transfer, and our experiments show that the proposed method outperforms other common methods. The investigated technology can be applied in many specific areas, such as beauty cameras in smartphones, computer-generated imagery in advertisements and movies, computed tomography and magnetic resonance imaging for cancer diagnosis under harsh conditions, and virtual simulation in industrial design.
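To make the described pipeline concrete, the following is a minimal PyTorch sketch of the general recipe the abstract outlines: segment the region of interest, stylize it with a small residual feed-forward network, and blend the stylized pixels back into the photo only inside the segmentation mask, with a VGG16 Gram-matrix loss indicating how such a stylizer could be trained. This is not the authors' implementation; torchvision's DeepLabV3 is used here as a stand-in for the DeepLab segmentation stage, the residual stylizer is an untrained placeholder, the filename "photo.jpg" is hypothetical, and ImageNet normalization of the VGG inputs is omitted for brevity.

```python
# Sketch only: local style transfer = semantic mask + residual stylizer + masked blend.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

# --- 1. Semantic segmentation mask (DeepLabV3 as a stand-in for the paper's DeepLab stage) ---
seg_model = models.segmentation.deeplabv3_resnet50(weights="DEFAULT").eval().to(device)

def person_mask(img_t):
    """Return a soft [1, 1, H, W] mask of the 'person' class (index 15 in the VOC label set)."""
    norm = transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])
    with torch.no_grad():
        logits = seg_model(norm(img_t))["out"]           # [1, 21, H, W]
    return logits.softmax(dim=1)[:, 15:16]               # probability of "person"

# --- 2. A tiny residual transform network (untrained placeholder for the stylizer) ---
class ResBlock(nn.Module):
    def __init__(self, ch=32):
        super().__init__()
        self.conv1 = nn.Conv2d(ch, ch, 3, padding=1)
        self.conv2 = nn.Conv2d(ch, ch, 3, padding=1)
    def forward(self, x):
        return x + self.conv2(F.relu(self.conv1(x)))     # identity shortcut

class Stylizer(nn.Module):
    def __init__(self, ch=32, n_blocks=3):
        super().__init__()
        self.inp = nn.Conv2d(3, ch, 3, padding=1)
        self.blocks = nn.Sequential(*[ResBlock(ch) for _ in range(n_blocks)])
        self.out = nn.Conv2d(ch, 3, 3, padding=1)
    def forward(self, x):
        return torch.sigmoid(self.out(self.blocks(F.relu(self.inp(x)))))

# --- 3. VGG16 Gram-matrix style loss (how the stylizer would be trained) ---
vgg = models.vgg16(weights="DEFAULT").features.eval().to(device)
STYLE_LAYERS = {3, 8, 15, 22}                            # relu1_2, relu2_2, relu3_3, relu4_3

def vgg_features(x):
    feats = []
    for i, layer in enumerate(vgg):
        x = layer(x)
        if i in STYLE_LAYERS:
            feats.append(x)
    return feats

def gram(f):
    b, c, h, w = f.shape
    f = f.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def gram_style_loss(output, style_img):
    return sum(F.mse_loss(gram(a), gram(b))
               for a, b in zip(vgg_features(output), vgg_features(style_img)))

# --- 4. Local style transfer: stylize the whole frame, keep it only inside the mask ---
def local_style_transfer(img_t, stylizer):
    mask = person_mask(img_t)
    stylized = stylizer(img_t)
    return mask * stylized + (1.0 - mask) * img_t        # blend only the segmented region

to_tensor = transforms.ToTensor()
img = to_tensor(Image.open("photo.jpg").convert("RGB")).unsqueeze(0).to(device)  # hypothetical file
result = local_style_transfer(img, Stylizer().to(device).eval())
```

Because the stylizer is a feed-forward residual network rather than an iterative optimization, a single forward pass produces the stylized region, which is where the speedup over optimization-based style transfer would come from in a setup of this kind.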
Corresponding authors: Ping Wang, Cheng-Fu Yang

This work is licensed under a Creative Commons Attribution 4.0 International License.

Cite this article: Liming Huang, Ping Wang, Cheng-Fu Yang, and Hsien-Wei Tseng, Rapid Local Image Style Transfer Method Based on Residual Convolutional Neural Network, Sens. Mater., Vol. 33, No. 4, 2021, pp. 1343-1352.