Cross-Lingual Knowledge Transferring by Structural Correspondence and Space Transfer

Cited by: 3
Authors
Wang, Deqing [1 ]
Wu, Junjie [2 ,3 ,4 ]
Yang, Jingyuan [5 ]
Jing, Baoyu [6 ]
Zhang, Wenjie [1 ]
He, Xiaonan [7 ]
Zhang, Hui [1 ]
Affiliations
[1] Beihang Univ, Sch Comp Sci, Beijing 100191, Peoples R China
[2] Beihang Univ, Sch Econ & Management, Beijing 100191, Peoples R China
[3] Beihang Univ, Beijing Adv Innovat Ctr Big Data & Brain Comp, Beijing 100191, Peoples R China
[4] Beihang Univ, Beijing Key Lab Emergency Support Simulat Technol, Beijing 100191, Peoples R China
[5] George Mason Univ, Sch Business, Fairfax, VA 22030 USA
[6] Univ Illinois, Dept Comp Sci, Champaign, IL 61801 USA
[7] Baidu Inc, Dept Search, Beijing 100094, Peoples R China
Keywords
Task analysis; Machine translation; Analytical models; Transfer learning; Dictionaries; Electronic mail; Time complexity; Cross-lingual sentiment classification; space transfer; structural correspondence learning (SCL); SENTIMENT CLASSIFICATION;
DOI
10.1109/TCYB.2021.3051005
CLC Number
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Cross-lingual sentiment analysis (CLSA) aims to leverage label-rich resources in the source language to improve models for a resource-scarce domain in the target language, where monolingual machine-learning approaches usually suffer from the unavailability of sentiment knowledge. Recently, the transfer learning paradigm, which can transfer sentiment knowledge from resource-rich languages (e.g., English) to resource-poor languages (e.g., Chinese), has gained particular interest. Along this line, in this article, we propose semisupervised learning with SCL and space transfer (ssSCL-ST), a semisupervised transfer learning approach that makes use of structural correspondence learning (SCL) as well as space transfer for cross-lingual sentiment analysis. The key idea behind ssSCL-ST, at a high level, is to exploit the intrinsic sentiment knowledge in the target-language domain and to reduce the loss of valuable knowledge incurred during knowledge transfer via semisupervised learning. ssSCL-ST also features pivot set extension and space transfer, which help to enhance the efficiency of knowledge transfer and improve classification accuracy in the target-language domain. Extensive experimental results demonstrate the superiority of ssSCL-ST over state-of-the-art approaches without using any parallel corpora.
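The abstract builds on structural correspondence learning (SCL). As a rough illustration of the underlying idea, the sketch below implements the classic SCL recipe (Blitzer et al., 2006), not the authors' ssSCL-ST: for each pivot feature, fit a linear predictor from the non-pivot features, then take an SVD of the stacked predictor weights to obtain a low-dimensional shared representation that can be appended to document features in either domain or language. All names and the toy data here are illustrative; the original work uses modified-Huber classifiers rather than plain least squares.

```python
import numpy as np

def scl_projection(X, pivot_idx, k=2):
    """Classic SCL sketch.

    X         : (n_docs, n_feats) binary term matrix.
    pivot_idx : columns of X chosen as pivot features (frequent in
                both domains/languages).
    Returns the non-pivot column indices and a (n_nonpivot, k)
    projection onto the shared low-dimensional space.
    """
    pivots = set(pivot_idx)
    nonpivot_idx = [j for j in range(X.shape[1]) if j not in pivots]
    Z = X[:, nonpivot_idx].astype(float)
    # One linear predictor per pivot feature (least squares here as a
    # stand-in for the modified-Huber classifiers of the original SCL).
    W = np.column_stack([
        np.linalg.lstsq(Z, X[:, p].astype(float), rcond=None)[0]
        for p in pivot_idx
    ])                                  # (n_nonpivot, n_pivots)
    # The top-k left singular vectors span the shared structure.
    U, _, _ = np.linalg.svd(W, full_matrices=False)
    theta = U[:, :k]
    return nonpivot_idx, theta

# Toy usage: 6 documents, 5 features, with features 0 and 1 as pivots.
rng = np.random.default_rng(0)
X = (rng.random((6, 5)) > 0.5).astype(int)
nonpivot_idx, theta = scl_projection(X, pivot_idx=[0, 1], k=2)
X_shared = X[:, nonpivot_idx] @ theta   # cross-domain augmentation features
print(X_shared.shape)                   # (6, 2)
```

In practice the projected features `X_shared` are concatenated with the original features before training the sentiment classifier, so that correlations learned through the pivots carry over to the target side.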
Pages: 6555-6566
Page count: 12
Related Papers
50 records in total
  • [21] Structural Contrastive Pretraining for Cross-Lingual Comprehension
    Chen, Nuo
    Shou, Linjun
    Song, Tengtao
    Gong, Ming
    Pei, Jian
    Chang, Jianhui
    Jiang, Daxin
    Li, Jia
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, 2023, : 2042 - 2057
  • [22] Multilingual Knowledge Graph Embeddings for Cross-lingual Knowledge Alignment
    Chen, Muhao
    Tian, Yingtao
    Yang, Mohan
    Zaniolo, Carlo
    PROCEEDINGS OF THE TWENTY-SIXTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 1511 - 1517
  • [23] Metaphor Detection with Cross-Lingual Model Transfer
    Tsvetkov, Yulia
    Boytsov, Leonid
    Gershman, Anatole
    Nyberg, Eric
    Dyer, Chris
    PROCEEDINGS OF THE 52ND ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 1, 2014, : 248 - 258
  • [24] Cross-Lingual Transfer of Cognitive Processing Complexity
    Pouw, Charlotte
    Hollenstein, Nora
    Beinborn, Lisa
    17TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EACL 2023, 2023, : 655 - 669
  • [25] Distributional Correspondence Indexing for Cross-Lingual and Cross-Domain Sentiment Classification
    Fernandez, Alejandro Moreo
    Esuli, Andrea
    Sebastiani, Fabrizio
    JOURNAL OF ARTIFICIAL INTELLIGENCE RESEARCH, 2016, 55 : 131 - 163
  • [26] Improving Low Resource Named Entity Recognition using Cross-lingual Knowledge Transfer
    Feng, Xiaocheng
    Feng, Xiachong
    Qin, Bing
    Feng, Zhangyin
    Liu, Ting
    PROCEEDINGS OF THE TWENTY-SEVENTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2018, : 4071 - 4077
  • [27] Adaptive Entity Alignment for Cross-Lingual Knowledge Graph
    Zhang, Yuanming
    Gao, Tianyu
    Lu, Jiawei
    Cheng, Zhenbo
    Xiao, Gang
    KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, KSEM 2021, PT II, 2021, 12816 : 474 - 487
  • [28] Coordinated Reasoning for Cross-Lingual Knowledge Graph Alignment
    Xu, Kun
    Song, Linfeng
    Feng, Yansong
    Song, Yan
    Yu, Dong
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 9354 - 9361
  • [29] Knowledge Distillation Based Training of Universal ASR Source Models for Cross-lingual Transfer
    Fukuda, Takashi
    Thomas, Samuel
    INTERSPEECH 2021, 2021, : 3450 - 3454
  • [30] CAKES: Cross-lingual Wikipedia Knowledge Enrichment and Summarization
    Fionda, Valeria
    Pirro, Giuseppe
    20TH EUROPEAN CONFERENCE ON ARTIFICIAL INTELLIGENCE (ECAI 2012), 2012, 242 : 901 - 902