Cross-Lingual Knowledge Transferring by Structural Correspondence and Space Transfer

Cited by: 3
Authors
Wang, Deqing [1 ]
Wu, Junjie [2 ,3 ,4 ]
Yang, Jingyuan [5 ]
Jing, Baoyu [6 ]
Zhang, Wenjie [1 ]
He, Xiaonan [7 ]
Zhang, Hui [1 ]
Affiliations
[1] Beihang Univ, Sch Comp Sci, Beijing 100191, Peoples R China
[2] Beihang Univ, Sch Econ & Management, Beijing 100191, Peoples R China
[3] Beihang Univ, Beijing Adv Innovat Ctr Big Data & Brain Comp, Beijing 100191, Peoples R China
[4] Beihang Univ, Beijing Key Lab Emergency Support Simulat Technol, Beijing 100191, Peoples R China
[5] George Mason Univ, Sch Business, Fairfax, VA 22030 USA
[6] Univ Illinois, Dept Comp Sci, Champaign, IL 61801 USA
[7] Baidu Inc, Dept Search, Beijing 100094, Peoples R China
Keywords
Task analysis; Machine translation; Analytical models; Transfer learning; Dictionaries; Electronic mail; Time complexity; Cross-lingual sentiment classification; space transfer; structural correspondence learning (SCL); SENTIMENT CLASSIFICATION;
DOI
10.1109/TCYB.2021.3051005
Chinese Library Classification (CLC) number
TP [Automation and computer technology];
Subject classification code
0812;
Abstract
Cross-lingual sentiment analysis (CLSA) aims to leverage label-rich resources in the source language to improve models for a resource-scarce domain in the target language, where monolingual machine-learning approaches usually suffer from the unavailability of sentiment knowledge. Recently, the transfer learning paradigm, which can transfer sentiment knowledge from resource-rich languages (e.g., English) to resource-poor languages (e.g., Chinese), has gained particular interest. Along this line, in this article, we propose semisupervised learning with SCL and space transfer (ssSCL-ST), a semisupervised transfer learning approach that makes use of structural correspondence learning as well as space transfer for cross-lingual sentiment analysis. The key idea behind ssSCL-ST, at a high level, is to explore the intrinsic sentiment knowledge in the target-lingual domain and to reduce the loss of valuable knowledge incurred by the knowledge transfer via semisupervised learning. ssSCL-ST also features pivot set extension and space transfer, which help to enhance the efficiency of knowledge transfer and improve classification accuracy in the target-language domain. Extensive experimental results demonstrate the superiority of ssSCL-ST over state-of-the-art approaches without using any parallel corpora.
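To make the structural correspondence learning (SCL) component concrete, the sketch below illustrates the classic pivot-based SCL step that ssSCL-ST builds on: linear predictors are trained to predict each pivot feature from the non-pivot features on unlabeled source- and target-language data, and an SVD of the stacked predictor weights yields a shared low-dimensional space. This is a minimal illustration under stated assumptions (the function name, the 50-dimensional projection, and the SGDClassifier settings are ours), not the paper's full ssSCL-ST method, which additionally performs pivot set extension, space transfer, and semisupervised learning.

```python
import numpy as np
from scipy.sparse import issparse
from sklearn.linear_model import SGDClassifier

def scl_projection(X, pivot_idx, n_components=50):
    """Learn a shared low-dimensional projection from pivot-feature correspondences.

    X           : (n_docs, n_features) bag-of-words matrix pooled from unlabeled
                  source-language and target-language documents.
    pivot_idx   : indices of pivot features (frequent, sentiment-bearing features
                  assumed to be shared across the two language domains).
    Returns     : (n_features, k) projection matrix theta.
    """
    Xd = X.toarray() if issparse(X) else np.asarray(X, dtype=float)
    n_features = Xd.shape[1]
    non_pivot = np.setdiff1d(np.arange(n_features), pivot_idx)

    # One linear predictor per pivot: predict the pivot's presence from non-pivots.
    W = np.zeros((len(pivot_idx), len(non_pivot)))
    for i, p in enumerate(pivot_idx):
        y = (Xd[:, p] > 0).astype(int)
        if y.min() == y.max():   # pivot absent (or ubiquitous): nothing to learn
            continue
        clf = SGDClassifier(loss="modified_huber", alpha=1e-4, max_iter=20, tol=None)
        clf.fit(Xd[:, non_pivot], y)
        W[i] = clf.coef_[0]

    # The top right-singular vectors of W span the correspondence subspace.
    _, _, Vt = np.linalg.svd(W, full_matrices=False)
    theta_np = Vt[:n_components].T                      # (n_non_pivot, k)

    # Embed back into the full feature space so any document can be projected.
    theta = np.zeros((n_features, theta_np.shape[1]))
    theta[non_pivot] = theta_np
    return theta

# Typical use: augment the original features with X @ theta, train a sentiment
# classifier on labeled source-language data, and apply it to target-language text.
```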
Pages: 6555-6566
Number of pages: 12
Related Papers
50 records in total
  • [31] Zero-Shot Cross-Lingual Knowledge Transfer in VQA via Multimodal Distillation
    Weng, Yu
    Dong, Jun
    He, Wenbin
    Chaomurilige
    Liu, Xuan
    Liu, Zheng
    Gao, Honghao
    IEEE TRANSACTIONS ON COMPUTATIONAL SOCIAL SYSTEMS, 2024, : 1 - 11
  • [32] Cross-Lingual Knowledge Editing in Large Language Models
    Wang, Jiaan
    Liang, Yunlong
    Sun, Zengkui
    Cao, Yuxuan
    Xu, Jiarong
    Meng, Fandong
    PROCEEDINGS OF THE 62ND ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 1: LONG PAPERS, 2024, : 11676 - 11686
  • [33] Cross-Lingual Knowledge Distillation for Chinese Video Captioning
    Hou J.-Y.
    Qi Y.-Y.
    Wu X.-X.
    Jia Y.-D.
    Jisuanji Xuebao/Chinese Journal of Computers, 2021, 44 (09): : 1907 - 1921
  • [34] Semantic Space Transformations for Cross-Lingual Document Classification
    Martinek, Jiri
    Lenc, Ladislav
    Kral, Pavel
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2018, PT I, 2018, 11139 : 608 - 616
  • [35] Cross-Lingual Transfer Learning for Statistical Type Inference
    Li, Zhiming
    Xie, Xiaofei
    Li, Haoliang
    Xu, Zhengzi
    Li, Yi
    Liu, Yang
    PROCEEDINGS OF THE 31ST ACM SIGSOFT INTERNATIONAL SYMPOSIUM ON SOFTWARE TESTING AND ANALYSIS, ISSTA 2022, 2022, : 239 - 250
  • [36] mCLIP: Multilingual CLIP via Cross-lingual Transfer
    Chen, Guanhua
    Hou, Lu
    Chen, Yun
    Dai, Wenliang
    Shang, Lifeng
    Jiang, Xin
    Liu, Qun
    Pan, Jia
    Wang, Wenping
    PROCEEDINGS OF THE 61ST ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2023): LONG PAPERS, VOL 1, 2023, : 13028 - 13043
  • [37] Cross-Lingual Transfer Learning Framework for Program Analysis
    Li, Zhiming
    2021 36TH IEEE/ACM INTERNATIONAL CONFERENCE ON AUTOMATED SOFTWARE ENGINEERING ASE 2021, 2021, : 1074 - 1078
  • [38] Unsupervised Cross-lingual Transfer of Word Embedding Spaces
    Xu, Ruochen
    Yang, Yiming
    Otani, Naoki
    Wu, Yuexin
    2018 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2018), 2018, : 2465 - 2474
  • [39] Cross-Lingual Sentiment Relation Capturing for Cross-Lingual Sentiment Analysis
    Chen, Qiang
    Li, Wenjie
    Lei, Yu
    Liu, Xule
    Luo, Chuwei
    He, Yanxiang
    ADVANCES IN INFORMATION RETRIEVAL, ECIR 2017, 2017, 10193 : 54 - 67
  • [40] Investigating the Potential of Task Arithmetic for Cross-Lingual Transfer
    Parovic, Marinela
    Vulic, Ivan
    Korhonen, Anna
    PROCEEDINGS OF THE 18TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 2: SHORT PAPERS, 2024, : 124 - 137