Are ChatGPT and other pretrained language models good parasitologists?

Citations: 13
Authors
Slapeta, Jan [1 ,2 ]
Affiliations
[1] Univ Sydney, Fac Sci, Sydney Sch Vet Sci, Sydney, NSW 2006, Australia
[2] Univ Sydney, Inst Infect Dis, Sydney, NSW 2006, Australia
Keywords
ECHINOCOCCUS;
DOI
10.1016/j.pt.2023.02.006
Chinese Library Classification
R38 [Medical Parasitology]; Q [Biological Sciences];
Discipline Codes
07 ; 0710 ; 09 ; 100103 ;
Abstract
Large language models, such as ChatGPT, will have far-reaching impacts on parasitology, including on students. Authentic experiences gained during students' training are absent from these models. This is not a weakness of the models but rather an opportunity benefiting parasitology at large.
Pages: 314-316
Page count: 3
Related Papers
50 records
  • [1] ChatGPT for good? On opportunities and challenges of large language models for education
    Kasneci, Enkelejda
    Sessler, Kathrin
    Kuechemann, Stefan
    Bannert, Maria
    Dementieva, Daryna
    Fischer, Frank
    Gasser, Urs
    Groh, Georg
    Guennemann, Stephan
    Huellermeier, Eyke
    Krusche, Stephan
    Kutyniok, Gitta
    Michaeli, Tilman
    Nerdel, Claudia
    Pfeffer, Juergen
    Poquet, Oleksandra
    Sailer, Michael
    Schmidt, Albrecht
    Seidel, Tina
    Stadler, Matthias
    Weller, Jochen
    Kuhn, Jochen
    Kasneci, Gjergji
    LEARNING AND INDIVIDUAL DIFFERENCES, 2023, 103
  • [2] A Survey of Pretrained Language Models
    Sun, Kaili
    Luo, Xudong
    Luo, Michael Y.
    KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, PT II, 2022, 13369 : 442 - 456
  • [3] The use of ChatGPT and other large language models in surgical science
    Janssen, Boris V.
    Kazemier, Geert
    Besselink, Marc G.
    BJS OPEN, 2023, 7 (02):
  • [4] Can ChatGPT Truly Overcome Other Large Language Models?
    Ray, Partha
    CANADIAN ASSOCIATION OF RADIOLOGISTS JOURNAL-JOURNAL DE L ASSOCIATION CANADIENNE DES RADIOLOGISTES, 2024, 75 (02): : 429 - 429
  • [5] Geographic Adaptation of Pretrained Language Models
    Hofmann, Valentin
    Glavas, Goran
    Ljubesic, Nikola
    Pierrehumbert, Janet B.
    Schuetze, Hinrich
    TRANSACTIONS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, 2024, 12 : 411 - 431
  • [6] Generating Datasets with Pretrained Language Models
    Schick, Timo
    Schuetze, Hinrich
    2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 6943 - 6951
  • [7] Investigating Transferability in Pretrained Language Models
    Tamkin, Alex
    Singh, Trisha
    Giovanardi, Davide
    Goodman, Noah
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020, : 1393 - 1401
  • [8] Textually Pretrained Speech Language Models
    Hassid, Michael
    Remez, Tal
    Nguyen, Tu Anh
    Gat, Itai
    Conneau, Alexis
    Kreuk, Felix
    Copet, Jade
    Defossez, Alexandre
    Synnaeve, Gabriel
    Dupoux, Emmanuel
    Schwartz, Roy
    Adi, Yossi
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [9] Discourse Probing of Pretrained Language Models
    Koto, Fajri
    Lau, Jey Han
    Baldwin, Timothy
    2021 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL-HLT 2021), 2021, : 3849 - 3864
  • [10] Unsupervised Paraphrasing with Pretrained Language Models
    Niu, Tong
    Yavuz, Semih
    Zhou, Yingbo
    Keskar, Nitish Shirish
    Wang, Huan
    Xiong, Caiming
    2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 5136 - 5150