RAST: Restorable Arbitrary Style Transfer

Cited by: 2
Authors:
Ma, Yingnan [1 ]
Zhao, Chenqiu [1 ]
Huang, Bingran [1 ]
Li, Xudong [1 ]
Basu, Anup [1 ]
Affiliations:
[1] Univ Alberta, Edmonton, AB, Canada
Keywords: Neural style transfer; multi-restorations; style difference
DOI: 10.1145/3638770
CLC classification: TP [Automation Technology; Computer Technology]
Subject classification: 0812
Abstract
The objective of arbitrary style transfer is to apply a given artistic or photo-realistic style to a target image. Although current methods have shown some success in transferring style, arbitrary style transfer still has several issues, including content leakage: embedding an artistic style can result in unintended changes to the image content. This article proposes an iterative framework called Restorable Arbitrary Style Transfer (RAST) to effectively ensure content preservation and mitigate potential alterations to the content information. RAST transmits both content and style information through multi-restorations and balances the content-style tradeoff in stylized images using image restoration accuracy. To ensure RAST's effectiveness, we introduce two novel loss functions: multi-restoration loss and style difference loss. We also propose a new quantitative evaluation method to assess content preservation and style embedding performance. Experimental results show that RAST outperforms state-of-the-art methods in generating stylized images that preserve content and embed style accurately.
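The abstract does not spell out RAST's loss functions. As background for the content-style tradeoff it discusses, many arbitrary style transfer methods in this family stylize by re-normalizing encoder feature statistics, as in Adaptive Instance Normalization (AdaIN). The NumPy sketch below illustrates that standard building block only; it is not RAST itself, and the feature shapes and `eps` value are illustrative assumptions:

```python
import numpy as np

def adain(content_feat, style_feat, eps=1e-5):
    """Adaptive Instance Normalization (Huang & Belongie, 2017):
    align the per-channel mean/std of the content features to those
    of the style features. Feature maps have shape (C, H, W)."""
    c_mean = content_feat.mean(axis=(1, 2), keepdims=True)
    c_std = content_feat.std(axis=(1, 2), keepdims=True) + eps
    s_mean = style_feat.mean(axis=(1, 2), keepdims=True)
    s_std = style_feat.std(axis=(1, 2), keepdims=True) + eps
    # Whiten the content features, then re-color them with style stats.
    return s_std * (content_feat - c_mean) / c_std + s_mean

# Toy check on random "feature maps": the output keeps the content's
# spatial structure but adopts the style's channel statistics.
rng = np.random.default_rng(0)
c = rng.normal(0.0, 1.0, size=(4, 8, 8))  # content features (C, H, W)
s = rng.normal(3.0, 2.0, size=(4, 8, 8))  # style features (C, H, W)
out = adain(c, s)
```

Content leakage, as discussed above, arises because this re-normalization (and the decoder that follows it) can distort structure along with texture; RAST's multi-restoration objective is designed to penalize exactly such unrecoverable content changes.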
Pages: 21
Related Papers (50 total)
  • [31] ARBITRARY STYLE TRANSFER USING GRAPH INSTANCE NORMALIZATION
    Jung, Dongki
    Yang, Seunghan
    Choi, Jaehoon
    Kim, Changick
    2020 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2020, : 1596 - 1600
  • [32] QR code arbitrary style transfer algorithm based on style matching layer
    Li, Hai-Sheng
    Chen, Jingyin
    Huang, Huafeng
    MULTIMEDIA TOOLS AND APPLICATIONS, 2024, 83 (13) : 38505 - 38522
  • [34] Rethink arbitrary style transfer with transformer and contrastive learning
    Zhang, Zhanjie
    Sun, Jiakai
    Li, Guangyuan
    Zhao, Lei
    Zhang, Quanwei
    Lan, Zehua
    Yin, Haolin
    Xing, Wei
    Lin, Huaizhong
    Zuo, Zhiwen
    COMPUTER VISION AND IMAGE UNDERSTANDING, 2024, 241
  • [35] Progressive Attentional Manifold Alignment for Arbitrary Style Transfer
    Luo, Xuan
    Han, Zhen
    Yang, Linkang
    COMPUTER VISION - ACCV 2022, PT VII, 2023, 13847 : 134 - 150
  • [36] Deep Content Guidance Network for Arbitrary Style Transfer
    Shi, Di-Bo
    Xie, Huan
    Ji, Yi
    Li, Ying
    Liu, Chun-Ping
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021
  • [37] Panoramic Arbitrary Style Transfer with Deformable Distortion Constraints
    Ye, Wujian
    Wang, Yue
    Liu, Yijun
    Lin, Wenjie
    Xiang, Xin
    JOURNAL OF VISUAL COMMUNICATION AND IMAGE REPRESENTATION, 2025, 106
  • [38] StyleFormer: Real-time Arbitrary Style Transfer via Parametric Style Composition
    Wu, Xiaolei
    Hu, Zhihao
    Sheng, Lu
    Xu, Dong
    2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021, : 14598 - 14607
  • [39] CSAST: Content self-supervised and style contrastive learning for arbitrary style transfer
    Zhang, Yuqi
    Tian, Yingjie
    Hou, Junjie
    NEURAL NETWORKS, 2023, 164 : 146 - 155
  • [40] Arbitrary style transfer method with attentional feature distribution matching
    Ge, Bin
    Hu, Zhenshan
    Xia, Chenxing
    Guan, Junming
    MULTIMEDIA SYSTEMS, 2024, 30 (02)