Controlling strokes in fast neural style transfer using content transforms

Cited by: 0
Authors: Max Reimann, Benito Buchheim, Amir Semmo, Jürgen Döllner, Matthias Trapp
Affiliations:
[1] University of Potsdam, Hasso Plattner Institute
[2] DigitalMasterpieces GmbH
Source: The Visual Computer, 2022, Vol. 38
DOI: not available
Abstract
Fast style transfer methods have recently gained popularity in art-related applications, as they make generalized real-time stylization of images practicable. However, they are mostly limited to one-shot stylizations and offer little interactive adjustment of style elements. In particular, expressive control over stroke sizes and stroke orientations remains an open challenge. To address this, we propose a novel stroke-adjustable fast style transfer network that enables simultaneous control over stroke size and intensity, and allows a wider range of expressive editing than current approaches by utilizing the scale-variance of convolutional neural networks. Furthermore, we introduce a network-agnostic approach for style-element editing by applying reversible input transformations that adjust strokes in the stylized output. In this way, stroke orientations can be adjusted, and warping-based effects can be applied to stylistic elements, such as swirls or waves. To demonstrate the real-world applicability of our approach, we present StyleTune, a mobile app for interactive editing of neural style transfers at multiple levels of control. Our app allows stroke adjustments on a global and local level. It furthermore implements an on-device patch-based upsampling step that enables users to achieve results with high output fidelity and resolutions of more than 20 megapixels. Our approach allows users to art-direct their creations and achieve results that are not possible with current style transfer applications.
Pages: 4019–4033 (14 pages)
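
The abstract describes a network-agnostic editing scheme: a reversible transform (for example, a rotation or warp) is applied to the content image before stylization, and its inverse is applied to the stylized output, so that stroke orientation follows the transform. The following is a minimal sketch of that general idea, not the paper's implementation; the `stylize()` placeholder, the rotation angle, and the helper names are assumptions for illustration.

```python
from PIL import Image


def stylize(img: Image.Image) -> Image.Image:
    """Stand-in for any feed-forward style transfer network (hypothetical);
    identity here so the sketch is runnable."""
    return img


def stylize_with_rotated_strokes(content: Image.Image, angle: float = 45.0) -> Image.Image:
    """Rotate the content before stylization and undo the rotation afterwards,
    so the painted strokes appear rotated by `angle` degrees in the result."""
    w, h = content.size

    # Forward transform: rotate the content image (expand avoids cropping corners).
    transformed = content.rotate(angle, resample=Image.BILINEAR, expand=True)

    # Stylize the transformed input; stroke orientation follows the input rotation.
    stylized = stylize(transformed)

    # Inverse transform: rotate the stylized result back.
    restored = stylized.rotate(-angle, resample=Image.BILINEAR, expand=True)

    # Center-crop to the original content size (the two expanding rotations padded the canvas).
    W, H = restored.size
    left, top = (W - w) // 2, (H - h) // 2
    return restored.crop((left, top, left + w, top + h))


if __name__ == "__main__":
    out = stylize_with_rotated_strokes(Image.new("RGB", (512, 384)), angle=30.0)
    print(out.size)  # (512, 384)
```

The same pattern generalizes to other invertible content transforms (e.g., local warps for swirl or wave effects), as long as the inverse mapping is applied to the stylized result.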