Weakly Supervised Cross-lingual Semantic Relation Classification via Knowledge Distillation

Cited by: 0
Authors
Vyas, Yogarshi [1]
Carpuat, Marine [1]
Affiliation
[1] Univ Maryland, Dept Comp Sci, College Pk, MD 20742 USA
Funding
U.S. National Science Foundation
Keywords
DOI
Not available
CLC classification
TP18 [Theory of Artificial Intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Words in different languages rarely cover the exact same semantic space. This work characterizes differences in meaning between words across languages using semantic relations that have been used to relate the meaning of English words. However, because of translation ambiguity, semantic relations are not always preserved by translation. We introduce a cross-lingual relation classifier trained only with English examples and a bilingual dictionary. Our classifier relies on a novel attention-based distillation approach to account for translation ambiguity when transferring knowledge from English to cross-lingual settings. On new English-Chinese and English-Hindi test sets, the resulting models largely outperform baselines that more naively rely on bilingual embeddings or dictionaries for cross-lingual transfer, and approach the performance of fully supervised systems on English tasks.
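The attention-based distillation idea described in the abstract can be illustrated with a minimal sketch. The code below is an assumption-laden illustration in PyTorch, not the authors' actual architecture: the names (`PairClassifier`, `distill_step`), the pair encoder, and the attention parameterization are all hypothetical. It only shows the core mechanism the abstract names: an English teacher labels each dictionary translation of the foreign word, attention over the candidate translations down-weights poor translations, and the cross-lingual student is trained to match the resulting soft relation distribution.

```python
# Minimal sketch (PyTorch, hypothetical names) of attention-based distillation
# over dictionary translation candidates. Illustrative only; the paper's
# actual model may differ.
import torch
import torch.nn as nn
import torch.nn.functional as F


class PairClassifier(nn.Module):
    """Maps a pair of word embeddings to relation logits."""
    def __init__(self, emb_dim: int, num_relations: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * emb_dim, emb_dim), nn.ReLU(),
            nn.Linear(emb_dim, num_relations),
        )

    def forward(self, x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([x, y], dim=-1))


def distill_step(teacher: PairClassifier,
                 student: PairClassifier,
                 attn: nn.Linear,            # nn.Linear(2 * emb_dim, 1)
                 x_en: torch.Tensor,         # (emb_dim,) English word embedding
                 y_foreign: torch.Tensor,    # (emb_dim,) bilingual embedding of the foreign word
                 cand_en: torch.Tensor,      # (k, emb_dim) its k dictionary translations
                 temperature: float = 2.0) -> torch.Tensor:
    """One distillation loss term for a single cross-lingual word pair."""
    k = cand_en.size(0)
    x_rep = x_en.expand(k, -1)                                   # (k, emb_dim)
    # Teacher produces one soft relation distribution per candidate translation.
    with torch.no_grad():
        teacher_probs = F.softmax(teacher(x_rep, cand_en) / temperature, dim=-1)  # (k, R)
    # Attention over candidates accounts for translation ambiguity.
    scores = attn(torch.cat([x_rep, cand_en], dim=-1)).squeeze(-1)                # (k,)
    weights = F.softmax(scores, dim=0)
    soft_target = (weights.unsqueeze(-1) * teacher_probs).sum(dim=0)              # (R,)
    # Student predicts directly from the cross-lingual pair and matches the target.
    student_logp = F.log_softmax(student(x_en, y_foreign) / temperature, dim=-1)
    return F.kl_div(student_logp.unsqueeze(0), soft_target.unsqueeze(0),
                    reduction="batchmean") * temperature ** 2
```

Under these assumptions, the teacher would be trained on English relation examples only, while the attention module and the student are optimized jointly on the distillation objective, so no labeled non-English relation data is required.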
Pages: 5285-5296
Page count: 12
Related papers
50 records in total
  • [31] Cross-lingual Annotation Projection of Semantic Roles
    Pado, Sebastian
    Lapata, Mirella
    JOURNAL OF ARTIFICIAL INTELLIGENCE RESEARCH, 2009, 36 : 307 - 340
  • [32] Cross-Lingual Classification of Crisis Data
    Khare, Prashant
    Burel, Gregoire
    Maynard, Diana
    Alani, Harith
    SEMANTIC WEB - ISWC 2018, PT I, 2018, 11136 : 617 - 633
  • [33] Cross-Lingual Web Spam Classification
    Garzo, Andras
    Daroczy, Balint
    Kiss, Tamas
    Siklosi, David
    Benczur, Andras A.
    PROCEEDINGS OF THE 22ND INTERNATIONAL CONFERENCE ON WORLD WIDE WEB (WWW'13 COMPANION), 2013, : 1149 - 1156
  • [34] Weakly-Supervised Concept-based Adversarial Learning for Cross-lingual Word Embeddings
    Wang, Haozhou
    Henderson, James
    Merlo, Paola
    2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 4419 - 4430
  • [35] DiffSLU: Knowledge Distillation Based Diffusion Model for Cross-Lingual Spoken Language Understanding
    Mao, Tianjun
    Zhang, Chenghong
    INTERSPEECH 2023, 2023, : 715 - 719
  • [36] Cross-Lingual Knowledge Distillation for Answer Sentence Selection in Low-Resource Languages
    Gupta, Shivanshu
    Matsubara, Yoshitomo
    Chadha, Ankit
    Moschitti, Alessandro
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2023), 2023, : 14078 - 14092
  • [38] cViL: Cross-Lingual Training of Vision-Language Models using Knowledge Distillation
    Gupta, Kshitij
    Gautam, Devansh
    Mamidi, Radhika
    2022 26TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2022, : 1734 - 1741
  • [39] Knowledge Distillation Based Training of Universal ASR Source Models for Cross-lingual Transfer
    Fukuda, Takashi
    Thomas, Samuel
    INTERSPEECH 2021, 2021, : 3450 - 3454
  • [40] Chinese-Vietnamese cross-lingual event retrieval method based on knowledge distillation
    Gao S.
    He Z.
    Yu Z.
    Zhu E.
    Wu S.
    Journal of Intelligent and Fuzzy Systems, 2024, 46 (04): : 8461 - 8475