Large Language Model-based Tools in Language Teaching to Develop Critical Thinking and Sustainable Cognitive Structures

Citations: 0
Authors
Joseph, Sindhu [1 ]
Affiliations
[1] Christ Deemed Univ, Bangalore, India
Keywords
Critical AI Literacy; English; Language Teaching; Education for Sustainable Development; Literacy
DOI
10.21659/rupkatha.v15n4.13
Chinese Library Classification (CLC)
C [Social Sciences, General]
Subject Classification Code
03; 0303
Abstract
Experts assert that Large Language Model (LLM)-based tools such as ChatGPT represent the next generation in the evolution of Artificial Intelligence and will permeate all walks of human life, including education. The current narrative is that LLM-based tools should be embedded into the education system to take advantage of their personalised, dynamic, and adaptive nature while remaining mindful of their limitations. One of the greatest limitations identified so far is that these pre-trained, transformer-based models, fine-tuned on Natural Language Processing (NLP) tasks, do not exhibit verifiable reasoning ability. As a result, the output generated by these tools is prone to ethical and factual errors and requires human oversight. This paper uses an integrative literature review to identify and synthesize Critical Digital Literacy frameworks in language teaching in light of the essential competencies and learning domains identified by the UNESCO Education for Sustainable Development directives. The Critical AI Literacy (CAIL) framework proposed in this paper would enable language teachers to adopt LLM-based tools to enhance their instructional strategies. The cognitive, affective, and conative competencies developed through the CAIL framework would empower learners to understand the manipulative nature of language and to use language to build a sustainable future.
Pages: 23