Neural Language Models and Few Shot Learning for Systematic Requirements Processing in MDSE

Cited: 4
Authors
Bertram, Vincent [1 ]
Boss, Miriam [1 ]
Kusmenko, Evgeny [1 ]
Nachmann, Imke Helene [1 ]
Rumpe, Bernhard [1 ]
Trotta, Danilo [1 ]
Wachtmeister, Louis [1 ]
Affiliations
[1] Rhein Westfal TH Aachen, Aachen, Germany
Keywords
model-driven requirements engineering; few-shot learning; natural language processing
DOI
10.1145/3567512.3567534
CLC number: TP31 [Computer Software]
Subject classification codes: 081202; 0835
Abstract
Systems engineering, in particular in the automotive domain, must cope with the massively increasing number of requirements arising during the development process. The language in which requirements are written is mostly informal and highly individual. This hinders automated processing of requirements as well as linking requirements to models. Introducing formal requirement notations into existing projects raises the challenge of translating masses of existing requirements and requires training for requirements engineers. In this paper, we derive domain-specific language constructs that help avoid ambiguities in requirements and raise their level of formality. The main contribution is the adoption and evaluation of few-shot learning with large pretrained language models for the automated translation of informal requirements into structured languages such as a requirement DSL.
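The few-shot approach described in the abstract can be illustrated with a minimal prompt-construction sketch. Note that the DSL syntax, the example requirements, and the function name `build_prompt` below are illustrative assumptions, not the paper's actual notation or implementation:

```python
# Minimal sketch of few-shot prompting for requirement formalization.
# A handful of solved (informal, formal) translation pairs is prepended
# to the unsolved informal requirement; a pretrained language model then
# completes the final "DSL:" line. The DSL shown here is invented for
# illustration and is NOT the paper's requirement DSL.

EXAMPLES = [
    ("The vehicle must brake when an obstacle is closer than 2 m.",
     "when distance(obstacle) < 2m then activate(brake)"),
    ("The headlights shall turn on if ambient light falls below 50 lux.",
     "when ambientLight < 50lux then activate(headlights)"),
]

def build_prompt(examples, new_requirement):
    """Assemble a few-shot prompt: k solved translation pairs
    followed by the unsolved informal requirement."""
    parts = ["Translate informal requirements into the requirement DSL.\n"]
    for informal, formal in examples:
        parts.append(f"Requirement: {informal}\nDSL: {formal}\n")
    parts.append(f"Requirement: {new_requirement}\nDSL:")
    return "\n".join(parts)

prompt = build_prompt(
    EXAMPLES,
    "The doors must lock when speed exceeds 10 km/h.",
)
print(prompt)
```

The completed prompt would be sent to a large pretrained language model; the model's completion of the trailing `DSL:` line is the candidate formalization.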
Pages: 260-265
Page count: 6
Related Papers
50 records in total
  • [41] Low-shot Learning in Natural Language Processing
    Xia, Congying
    Zhang, Chenwei
    Zhang, Jiawei
    Liang, Tingting
    Peng, Hao
    Yu, Philip S.
    2020 IEEE SECOND INTERNATIONAL CONFERENCE ON COGNITIVE MACHINE INTELLIGENCE (COGMI 2020), 2020, : 185 - 189
  • [42] FLamE: Few-shot Learning from Natural Language Explanations
    Zhou, Yangqiaoyu
    Zhang, Yiming
    Tan, Chenhao
    PROCEEDINGS OF THE 61ST ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, VOL 1, 2023, : 6743 - 6763
  • [43] VL-Few: Vision Language Alignment for Multimodal Few-Shot Meta Learning
    Ma, Han
    Fan, Baoyu
    Ng, Benjamin K.
    Lam, Chan-Tong
    APPLIED SCIENCES-BASEL, 2024, 14 (03):
  • [44] The Goldilocks paradigm: comparing classical machine learning, large language models, and few-shot learning for drug discovery applications
    Snyder, Scott H.
    Vignaux, Patricia A.
    Ozalp, Mustafa Kemal
    Gerlach, Jacob
    Puhl, Ana C.
    Lane, Thomas R.
    Corbett, John
    Urbina, Fabio
    Ekins, Sean
    COMMUNICATIONS CHEMISTRY, 2024, 7 (01):
  • [45] Stabilized In-Context Learning with Pre-trained Language Models for Few Shot Dialogue State Tracking
    Chen, Derek
    Qian, Kun
    Yu, Zhou
    17TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EACL 2023, 2023, : 1551 - 1564
  • [46] Investigating Prompt Learning for Chinese Few-Shot Text Classification with Pre-Trained Language Models
    Song, Chengyu
    Shao, Taihua
    Lin, Kejing
    Liu, Dengfeng
    Wang, Siyuan
    Chen, Honghui
    APPLIED SCIENCES-BASEL, 2022, 12 (21):
  • [47] Graph Neural Networks With Triple Attention for Few-Shot Learning
    Cheng, Hao
    Zhou, Joey Tianyi
    Tay, Wee Peng
    Wen, Bihan
    IEEE TRANSACTIONS ON MULTIMEDIA, 2023, 25 : 8225 - 8239
  • [48] MetaNODE: Prototype Optimization as a Neural ODE for Few-Shot Learning
    Zhang, Baoquan
    Li, Xutao
    Feng, Shanshan
    Ye, Yunming
    Ye, Rui
THIRTY-SIXTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FOURTH CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE / TWELFTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2022, : 9014 - 9021
  • [49] Local feature graph neural network for few-shot learning
    Weng, P.
    Dong, S.
    Ren, L.
    Zou, K.
    Journal of Ambient Intelligence and Humanized Computing, 2023, 14 (04) : 4343 - 4354
  • [50] Neural representational geometry underlies few-shot concept learning
    Sorscher, Ben
    Ganguli, Surya
    Sompolinsky, Haim
    PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA, 2022, 119 (43)