Arbitrary Portuguese text style transfer

Cited by: 0
Authors:
da Costa, Pablo Botton [1 ]
Paraboni, Ivandre [1 ]
Institutions:
[1] Univ Sao Paulo, EACH, Sao Paulo, Brazil
Source:
LINGUAMATICA | 2023, Vol. 15, No. 2
Funding:
São Paulo Research Foundation (FAPESP), Brazil
Keywords:
natural language generation; arbitrary style transfer; paraphrases; sequence-to-sequence; large language models;
DOI:
10.21814/lm.15.2.410
Chinese Library Classification:
H0 [Linguistics]
Discipline codes:
030303; 0501; 050102
Abstract:
In Automatic Natural Language Generation, arbitrary style transfer models aim to rewrite a text using any desired new set of stylistic features. In the case of the Portuguese language, however, the resources required for developing models of this type are still considerably scarce compared to those dedicated to English. Thus, as a first step towards the development of advanced methods of this kind, the present work investigates arbitrary style transfer with the aid of paraphrases in Portuguese, combining neural models built on sequence-to-sequence architectures with the fine-tuning of several large language models. In addition to the text rewriting models themselves, the study also presents novel resources for the task in the form of a corpus of paraphrases and an embedding model validated on both sentence similarity and simplification tasks, with results comparable to the state of the art.
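As background for the reader, one common way to realize the kind of arbitrary style transfer described in the abstract is to condition a sequence-to-sequence model on a style control token prepended to the source side of each paraphrase pair, so that a single model learns to rewrite toward any requested style. The sketch below illustrates only this data-formatting step; the style labels, sentences, and function names are hypothetical and do not reproduce the authors' actual pipeline.

```python
# Minimal sketch: formatting paraphrase pairs for style-conditioned
# seq2seq training. All labels and sentences are illustrative only.

def build_training_example(style: str, source: str, target: str) -> dict:
    """Prepend a style control token to the source text, so one
    seq2seq model can learn to rewrite toward any requested style."""
    return {
        "input": f"<style={style}> {source}",
        "output": target,
    }

# Hypothetical Portuguese paraphrase pairs, each tagged with a target style.
pairs = [
    ("formal", "oi, tudo bem?", "Olá, como está?"),
    ("simple", "O fenômeno é multifatorial.", "Isso tem várias causas."),
]

examples = [build_training_example(s, src, tgt) for s, src, tgt in pairs]
for ex in examples:
    print(ex["input"], "->", ex["output"])
```

At training time, each `input`/`output` pair would be fed to a sequence-to-sequence model as source and target; at inference, changing the `<style=...>` token requests a different rewriting style for the same sentence.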
Pages: 19-36 (18 pages)
Related papers
(50 records total)
  • [41] Text Style Transfer via Optimal Transport
    Nouri, Nasim
    NAACL 2022: THE 2022 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES, 2022, : 2532 - 2541
  • [42] Text Style Transfer Back-Translation
    Wei, Daimeng
    Wu, Zhanglin
    Shang, Hengchao
    Li, Zongyao
    Wang, Minghan
    Guo, Jiaxin
    Chen, Xiaoyu
    Yu, Zhengzhe
    Yang, Hao
    PROCEEDINGS OF THE 61ST ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, VOL 1, 2023, : 7944 - 7959
  • [43] Deep Learning for Text Style Transfer: A Survey
    Jin, Di
    Jin, Zhijing
    Hu, Zhiting
    Vechtomova, Olga
    Mihalcea, Rada
    COMPUTATIONAL LINGUISTICS, 2022, 48 (01) : 155 - 205
  • [44] Synthesizing data for text recognition with style transfer
    Li, Jiahui
    Wang, Siwei
    Wang, Yongtao
    Tang, Zhi
    MULTIMEDIA TOOLS AND APPLICATIONS, 2019, 78 (20) : 29183 - 29196
  • [45] Typography with Decor: Intelligent Text Style Transfer
    Wang, Wenjing
    Liu, Jiaying
    Yang, Shuai
    Guo, Zongming
    2019 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2019), 2019, : 5882 - 5890
  • [46] Transductive Learning for Unsupervised Text Style Transfer
    Xiao, Fei
    Pang, Liang
    Lan, Yanyan
    Wang, Yan
    Shen, Huawei
    Cheng, Xueqi
    2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 2510 - 2521
  • [47] Polite Chatbot: A Text Style Transfer Application
    Mukherjee, Sourabrata
    Hudecek, Vojtech
    Dusek, Ondrej
    17TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EACL 2023, 2023, : 87 - 93
  • [48] On Learning Text Style Transfer with Direct Rewards
    Liu, Yixin
    Neubig, Graham
    Wieting, John
    2021 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL-HLT 2021), 2021, : 4262 - 4273
  • [49] MSSRNet: Manipulating Sequential Style Representation for Unsupervised Text Style Transfer
    Yang, Yazheng
    Zhao, Zhou
    Liu, Qi
    PROCEEDINGS OF THE 29TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2023, 2023, : 3022 - 3032
  • [50] Text Style Transfer: Leveraging a Style Classifier on Entangled Latent Representations
    Li, Xiaoyan
    Sun, Sun
    Wang, Yunli
    REPL4NLP 2021: PROCEEDINGS OF THE 6TH WORKSHOP ON REPRESENTATION LEARNING FOR NLP, 2021, : 72 - 82