Cross-Lingual Knowledge Transferring by Structural Correspondence and Space Transfer

Cited: 3
Authors
Wang, Deqing [1 ]
Wu, Junjie [2 ,3 ,4 ]
Yang, Jingyuan [5 ]
Jing, Baoyu [6 ]
Zhang, Wenjie [1 ]
He, Xiaonan [7 ]
Zhang, Hui [1 ]
Affiliations
[1] Beihang Univ, Sch Comp Sci, Beijing 100191, Peoples R China
[2] Beihang Univ, Sch Econ & Management, Beijing 100191, Peoples R China
[3] Beihang Univ, Beijing Adv Innovat Ctr Big Data & Brain Comp, Beijing 100191, Peoples R China
[4] Beihang Univ, Beijing Key Lab Emergency Support Simulat Technol, Beijing 100191, Peoples R China
[5] George Mason Univ, Sch Business, Fairfax, VA 22030 USA
[6] Univ Illinois, Dept Comp Sci, Champaign, IL 61801 USA
[7] Baidu Inc, Dept Search, Beijing 100094, Peoples R China
Keywords
Task analysis; Machine translation; Analytical models; Transfer learning; Dictionaries; Electronic mail; Time complexity; Cross-lingual sentiment classification; space transfer; structural correspondence learning (SCL); SENTIMENT CLASSIFICATION;
DOI
10.1109/TCYB.2021.3051005
Chinese Library Classification
TP [Automation Technology; Computer Technology];
Discipline Code
0812;
Abstract
Cross-lingual sentiment analysis (CLSA) aims to leverage label-rich resources in the source language to improve models for a resource-scarce domain in the target language, where monolingual machine-learning approaches usually suffer from the unavailability of sentiment knowledge. Recently, the transfer learning paradigm, which can transfer sentiment knowledge from resource-rich languages (e.g., English) to resource-poor languages (e.g., Chinese), has gained particular interest. Along this line, in this article, we propose semisupervised learning with SCL and space transfer (ssSCL-ST), a semisupervised transfer learning approach that combines structural correspondence learning (SCL) with space transfer for cross-lingual sentiment analysis. The key idea behind ssSCL-ST, at a high level, is to exploit the intrinsic sentiment knowledge in the target-language domain and to reduce, via semisupervised learning, the loss of valuable knowledge incurred during knowledge transfer. ssSCL-ST also features pivot set extension and space transfer, which enhance the efficiency of knowledge transfer and improve classification accuracy in the target-language domain. Extensive experimental results demonstrate the superiority of ssSCL-ST over state-of-the-art approaches without using any parallel corpora.
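The abstract builds on structural correspondence learning (SCL), which learns a shared feature space by predicting "pivot" features (features frequent in both domains) from the remaining features and taking an SVD of the stacked predictor weights. As a rough illustration of that pivot-predictor idea only (not the paper's ssSCL-ST algorithm; the data, pivot choice, and ridge predictor here are all illustrative stand-ins), a minimal numpy sketch:

```python
# Minimal sketch of the SCL pivot-predictor idea, using numpy only.
# Illustrative stand-in, not the ssSCL-ST implementation: the original
# SCL paper trains modified-Huber classifiers; a ridge-regularized
# least-squares predictor is used here for brevity.
import numpy as np

def scl_projection(X, pivot_idx, k=2):
    """Learn a shared feature projection from pivot predictors.

    X         : (n_docs, n_feats) binary bag-of-words matrix
    pivot_idx : indices of pivot features (assumed frequent in both domains)
    k         : dimensionality of the learned shared space
    """
    non_pivot = [j for j in range(X.shape[1]) if j not in set(pivot_idx)]
    Xn = X[:, non_pivot]                  # predictors use non-pivot features
    W = []
    for p in pivot_idx:
        y = X[:, p]                       # target: presence of pivot p
        # ridge-regularized least squares: (Xn'Xn + lam*I) w = Xn'y
        w = np.linalg.solve(Xn.T @ Xn + 0.1 * np.eye(Xn.shape[1]), Xn.T @ y)
        W.append(w)
    W = np.stack(W, axis=1)               # (n_non_pivot, n_pivots)
    U, _, _ = np.linalg.svd(W, full_matrices=False)
    theta = U[:, :k]                      # top-k left singular vectors
    return Xn @ theta                     # documents in the shared space

rng = np.random.default_rng(0)
X = (rng.random((20, 10)) > 0.5).astype(float)
Z = scl_projection(X, pivot_idx=[0, 1, 2], k=2)
print(Z.shape)  # (20, 2)
```

The learned projection maps documents from either language into the same low-dimensional space, which is what lets a classifier trained on source-language labels score target-language documents.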
Pages: 6555 - 6566 (12 pages)
Related Papers
50 records in total
  • [41] The Role of Test, Classroom, and Home Language Correspondence in Cross-Lingual Testing
    Vista, Alvin
    THE ASIA-PACIFIC EDUCATION RESEARCHER, 2022, 31 (06): 711 - 723
  • [42] Cross-Lingual Semantic Role Labeling With Model Transfer
    Fei, Hao
    Zhang, Meishan
    Li, Fei
    Ji, Donghong
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2020, 28 : 2427 - 2437
  • [43] Cross-Lingual Transfer for Hindi Discourse Relation Identification
    Dahiya, Anirudh
    Shrivastava, Manish
    Sharma, Dipti Misra
    TEXT, SPEECH, AND DIALOGUE (TSD 2020), 2020, 12284 : 240 - 247
  • [44] Cross-lingual Structure Transfer for Relation and Event Extraction
    Subburathinam, Ananya
    Lu, Di
    Ji, Heng
    May, Jonathan
    Chang, Shih-Fu
    Sil, Avirup
    Voss, Clare
    2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 313 - 325
  • [45] Cross-lingual Prosody Transfer for Expressive Machine Dubbing
    Swiatkowski, Jakub
    Wang, Duo
    Babianski, Mikolaj
    Tobing, Patrick Lumban
    Vipperla, Ravichander
    Pollet, Vincent
    INTERSPEECH 2023, 2023, : 4838 - 4842
  • [47] Gender Bias in Multilingual Embeddings and Cross-Lingual Transfer
    Zhao, Jieyu
    Mukherjee, Subhabrata
    Hosseini, Saghar
    Chang, Kai-Wei
    Awadallah, Ahmed Hassan
    58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020), 2020, : 2896 - 2907
  • [48] Isomorphic Transfer of Syntactic Structures in Cross-Lingual NLP
    Ponti, Edoardo Maria
    Reichart, Roi
    Korhonen, Anna
    Vulic, Ivan
    PROCEEDINGS OF THE 56TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL), VOL 1, 2018, : 1531 - 1542
  • [49] On the Role of Parallel Data in Cross-lingual Transfer Learning
    Reid, Machel
    Artetxe, Mikel
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, 2023, : 5999 - 6006
  • [50] CROSS-LINGUAL TRANSFER LEARNING FOR SPOKEN LANGUAGE UNDERSTANDING
    Quynh Ngoc Thi Do
    Gaspers, Judith
    2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2019, : 5956 - 5960