Multilingual Controllable Transformer-Based Lexical Simplification

Cited: 0
|
Authors
Sheang, Kim Cheng [1 ]
Saggion, Horacio [1 ]
Affiliations
[1] Univ Pompeu Fabra, LaSTUS Grp, TALN Lab, DTIC, Barcelona, Spain
Source
Keywords
Multilingual Lexical Simplification; Controllable Lexical Simplification; Text Simplification; Multilinguality;
DOI
10.26342/2023-71-9
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Text is by far the most ubiquitous source of knowledge and information and should be made easily accessible to as many people as possible; however, texts often contain complex words that hinder reading comprehension and accessibility. Suggesting simpler alternatives for complex words without compromising meaning therefore helps convey information to a broader audience. This paper proposes mTLS, a multilingual controllable Transformer-based Lexical Simplification (LS) system fine-tuned from the T5 model. The novelty of this work lies in the use of language-specific prefixes, control tokens, and candidates extracted from pretrained masked language models to learn simpler alternatives for complex words. Evaluation results on three well-known LS datasets - LexMTurk, BenchLS, and NNSEval - show that our model outperforms previous state-of-the-art models such as LSBert and ConLS. Further evaluation on part of the recent TSAR-2022 multilingual LS shared-task dataset shows that our model performs competitively against the participating systems for English LS and even outperforms the GPT-3 model on several metrics. Our model also obtains performance gains for Spanish and Portuguese.
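The input construction the abstract describes (a language-specific prefix, control tokens, and masked-LM substitution candidates prepended to the sentence with the complex word marked) can be sketched as follows. The token names (`[T]`, `<cand>`), the control-token format, and the prefix string are illustrative assumptions, not the paper's exact specification.

```python
def build_input(lang_prefix: str, sentence: str, complex_word: str,
                candidates: list[str], controls: dict[str, float]) -> str:
    """Assemble a controllable-LS source string for a T5-style encoder.

    lang_prefix -- language-specific task prefix, e.g. "simplify en:"
    controls    -- control-token name -> value (e.g. a word-length ratio);
                   names and ranges here are hypothetical
    candidates  -- substitution candidates from a pretrained masked LM
    """
    # Control tokens encode target properties of the substitution
    # (e.g. length or frequency of the simpler word).
    ctrl = " ".join(f"<{name}_{value:.2f}>" for name, value in controls.items())
    # Mark the complex word in context so the model knows what to replace.
    marked = sentence.replace(complex_word, f"[T] {complex_word} [/T]", 1)
    # Append the masked-LM candidates as hints for generation.
    cands = " ".join(f"<cand> {c}" for c in candidates)
    return f"{lang_prefix} {ctrl} {marked} {cands}"
```

In a fine-tuning setup along these lines, the target side would simply be the gold simpler word, and the control-token values would be computed from each gold pair at training time, then set to desired values at inference.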
Pages: 109-123 (15 pages)