Explainable natural language processing with matrix product states

Cited: 3
Authors
Tangpanitanon, Jirawat [1 ,2 ]
Mangkang, Chanatip [3 ]
Bhadola, Pradeep [4 ]
Minato, Yuichiro [5 ]
Angelakis, Dimitris G. [6 ,7 ]
Chotibut, Thiparat [3 ]
Affiliations
[1] Quantum Technol Fdn Thailand, Bangkok, Thailand
[2] Minist Higher Educ Sci Res & Innovat, Thailand Ctr Excellence Phys, Bangkok, Thailand
[3] Chulalongkorn Univ, Fac Sci, Dept Phys, Chula Intelligent & Complex Syst, Bangkok, Thailand
[4] Mahidol Univ, Ctr Theoret Phys & Nat Philosophy, Nakhonsawan Studiorum Adv Studies, Nakhonsawan Campus, Khao Thong, Thailand
[5] Blueqat Inc, Tokyo, Japan
[6] Tech Univ Crete, Sch Elect & Comp Engn, Khania, Greece
[7] Natl Univ Singapore, Ctr Quantum Technol, Singapore, Singapore
Source
NEW JOURNAL OF PHYSICS | 2022, Vol. 24, No. 05
Keywords
matrix product state; entanglement entropy; entanglement spectrum; quantum machine learning; natural language processing; recurrent neural networks; TENSOR NETWORKS; QUANTUM;
DOI
10.1088/1367-2630/ac6232
Chinese Library Classification (CLC)
O4 [Physics];
Discipline code
0702;
Abstract
Despite empirical successes of recurrent neural networks (RNNs) in natural language processing (NLP), theoretical understanding of RNNs is still limited due to intrinsically complex non-linear computations. We systematically analyze RNNs' behaviors in a ubiquitous NLP task, the sentiment analysis of movie reviews, via the mapping between a class of RNNs called recurrent arithmetic circuits (RACs) and a matrix product state. Using the von Neumann entanglement entropy (EE) as a proxy for information propagation, we show that single-layer RACs possess a maximum information propagation capacity, reflected by the saturation of the EE. Enlarging the bond dimension beyond the EE saturation threshold does not increase model prediction accuracies, so a minimal model that best estimates the data statistics can be inferred. Although the saturated EE is smaller than the maximum EE allowed by the area law, our minimal model still achieves ~99% training accuracies in realistic sentiment analysis data sets. Thus, low EE is not a warrant against the adoption of single-layer RACs for NLP. Contrary to a common belief that long-range information propagation is the main source of RNNs' successes, we show that single-layer RACs harness high expressiveness from the subtle interplay between the information propagation and the word vector embeddings. Our work sheds light on the phenomenology of learning in RACs, and more generally on the explainability of RNNs for NLP, using tools from many-body quantum physics.
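The diagnostic at the heart of the abstract — the von Neumann entanglement entropy across a bipartition — is computed from the Schmidt (singular) values of the state at the cut. The following is a minimal illustrative sketch, not the authors' code: `von_neumann_entropy` and its arguments are hypothetical names, and the example uses a generic state vector rather than an actual RAC/MPS model.

```python
import numpy as np

def von_neumann_entropy(psi, dim_left):
    """Von Neumann entanglement entropy of a pure state across a bipartition.

    `psi` is a normalized state vector; `dim_left` is the Hilbert-space
    dimension of the left block. S = -sum_i p_i log(p_i), where the
    p_i = s_i^2 are squared singular values of the reshaped state matrix.
    """
    # Reshape the vector into a (left block) x (right block) matrix.
    m = np.asarray(psi).reshape(dim_left, -1)
    s = np.linalg.svd(m, compute_uv=False)   # Schmidt coefficients
    p = s ** 2
    p = p[p > 1e-12]                         # drop numerical zeros
    return float(-np.sum(p * np.log(p)))

# A maximally entangled pair of qubits saturates the EE at log(2);
# a product state has zero EE.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
product = np.kron(np.array([1.0, 0.0]), np.array([0.0, 1.0]))
print(von_neumann_entropy(bell, 2))     # ≈ 0.693 = log(2)
print(von_neumann_entropy(product, 2))  # ≈ 0.0
```

In the paper's setting the bipartition runs across an MPS bond, so the EE is bounded by log of the bond dimension; the saturation the authors observe means that growing the bond dimension past a threshold adds no Schmidt weight, and hence no accuracy.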
Pages: 16