The Journey of Language Models in Understanding Natural Language

Cited by: 0
Authors
Liu, Yuanrui [1 ,2 ]
Zhou, Jingping [3 ]
Sang, Guobiao [2 ]
Huang, Ruilong [1 ]
Zhao, Xinzhe [1 ]
Fang, Jintao [2 ]
Wang, Tiexin [1 ]
Li, Bohan [1 ]
Affiliations
[1] Nanjing Univ Aeronaut & Astronaut, Coll Artificial Intelligence & Comp Sci & Technol, Nanjing 211106, Peoples R China
[2] Beijing Shenzhou Aerosp Software Technol Co Ltd, Beijing 100094, Peoples R China
[3] Beijing Inst Telemetry, Beijing 100083, Peoples R China
Keywords
Artificial intelligence; Natural language understanding; Vector space model; Topic model; Neural network; Deep learning
DOI
10.1007/978-981-97-7707-5_29
CLC number
TP18 [Theory of artificial intelligence];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Since the Turing Test was proposed in the 1950s, humanity has explored artificial intelligence with the aim of bridging the interaction gap between machines and human language. This exploration seeks to enable machines to comprehend how humans acquire, produce, and understand language, as well as the relationship between linguistic expression and the world. This paper explores the basic principles of natural language representation, the formalization of natural language, and the modeling methods of language models. It analyzes, summarizes, and compares the mainstream technologies and methods, including vector space-based, topic model-based, graph-based, and neural network-based approaches. Finally, trends and directions for improving the language understanding ability of language models are predicted and discussed.
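As an illustration of the vector space-based approach the abstract surveys (a sketch for orientation, not code from the paper itself), documents can be mapped to term-frequency vectors and compared by cosine similarity; the function and variable names below are our own:

```python
import math
from collections import Counter

def tf_vector(doc):
    """Map a whitespace-tokenized document to a sparse term-frequency vector."""
    tokens = doc.lower().split()
    counts = Counter(tokens)
    total = len(tokens)
    return {term: n / total for term, n in counts.items()}

def cosine(u, v):
    """Cosine similarity between two sparse vectors stored as dicts."""
    dot = sum(u[t] * v[t] for t in u if t in v)
    norm = math.sqrt(sum(x * x for x in u.values())) * \
           math.sqrt(sum(x * x for x in v.values()))
    return dot / norm if norm else 0.0

d1 = tf_vector("machines understand natural language")
d2 = tf_vector("language models understand natural language")
d3 = tf_vector("topic models cluster documents")
# d1 shares terms with d2 but none with d3, so cosine(d1, d2) > cosine(d1, d3)
print(cosine(d1, d2), cosine(d1, d3))
```

Topic-model and neural approaches surveyed in the paper replace these raw term counts with latent-topic or learned dense representations, but the comparison-by-vector-geometry idea is the same.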
Pages: 331-363
Page count: 33
Related papers
50 records
  • [31] Visually-Situated Natural Language Understanding with Contrastive Reading Model and Frozen Large Language Models
    Kim, Geewook
    Lee, Hodong
    Kim, Daehee
    Jung, Haeji
    Park, Sanghee
    Kim, Yoonsik
    Yun, Sangdoo
    Kim, Taeho
    Lee, Bado
    Park, Seunghyun
    2023 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2023), 2023, : 11989 - 12010
  • [32] Calibration of Natural Language Understanding Models with Venn-ABERS Predictors
    Giovannotti, Patrizio
    CONFORMAL AND PROBABILISTIC PREDICTION WITH APPLICATIONS, VOL 179, 2022, 179
  • [33] CommonsenseVIS: Visualizing and Understanding Commonsense Reasoning Capabilities of Natural Language Models
    Wang, Xingbo
    Huang, Renfei
    Jin, Zhihua
    Fang, Tianqing
    Qu, Huamin
    IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS, 2024, 30 (01) : 273 - 283
  • [34] Comparison of alignment templates and maximum entropy models for natural language understanding
    Bender, O
    Macherey, K
    Och, FJ
    Ney, H
    EACL 2003: 10TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, PROCEEDINGS OF THE CONFERENCE, 2003, : 11 - 18
  • [35] Evaluation of Sentence Embedding Models for Natural Language Understanding Problems in Russian
    Popov, Dmitry
    Pugachev, Alexander
    Svyatokum, Polina
    Svitanko, Elizaveta
    Artemova, Ekaterina
    ANALYSIS OF IMAGES, SOCIAL NETWORKS AND TEXTS, AIST 2019, 2019, 11832 : 205 - 217
  • [36] Semi-Supervised Learning of Statistical Models for Natural Language Understanding
    Zhou, Deyu
    He, Yulan
    SCIENTIFIC WORLD JOURNAL, 2014,
  • [37] Backdoor Learning of Language Models in Natural Language Processing
    University of Michigan
  • [38] Natural language processing in the era of large language models
    Zubiaga, Arkaitz
    FRONTIERS IN ARTIFICIAL INTELLIGENCE, 2024, 6
  • [39] Structuring Natural Language Requirements with Large Language Models
    Norheim, Johannes J.
    Rebentisch, Eric
    32ND INTERNATIONAL REQUIREMENTS ENGINEERING CONFERENCE WORKSHOPS, REW 2024, 2024, : 68 - 71
  • [40] Enhancing Language Representation with Constructional Information for Natural Language Understanding
    Xu, Lvxiaowei
    Wu, Jianwang
    Peng, Jiawei
    Gong, Zhilin
    Cai, Ming
    Wang, Tianxiang
    PROCEEDINGS OF THE 61ST ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, VOL 1, 2023, : 4685 - 4705