Investor's ESG tendency probed by pre-trained transformers

Cited by: 0
Authors
Li, Chao
Keeley, Alexander Ryota
Takeda, Shutaro
Seki, Daikichi
Managi, Shunsuke [1 ]
Affiliation
[1] Kyushu Univ, 744 Motooka, Nishi-ku, Fukuoka 819-0395, Japan
Keywords
data mining; ESG; investor; machine learning; natural language processing; pre-trained transformer; corporate social responsibility; premature mortality; textual analysis; sustainability; pollution
DOI
10.1002/csr.3055
Chinese Library Classification
F [Economics]
Discipline Classification Code
02
Abstract
Due to climate change and social issues, environmental, social, and governance (ESG) solutions are receiving increased attention and emphasis. As influential market leaders, investors wield significant power to persuade companies to prioritize ESG considerations. However, investors' preferences for specific ESG topics, and how those preferences are changing over time, remain elusive. Here, we build a group of large language models with 128 million parameters, named classification pre-trained transformers (CPTs), to extract investors' tendencies toward 13 ESG-related topics from their annual reports. Assisted by the CPT models, which achieve approximately 95% cross-validation accuracy, we analyze more than 3000 annual reports released by the world's 350 top investors during 2010-2021. The results indicate that although investors show the strongest tendency toward the economic aspect in their annual reports, this emphasis is gradually declining and shifting toward environmental and social aspects. Nonfinancial investors, such as corporations and holding companies, prioritize environmental and social factors, whereas financial investors pay the most attention to governance risk. Patterns also differ at the country level; for instance, investors in Japan focus more on environmental and social factors than those in other major countries. Our findings suggest that investors increasingly value sustainability in their decision-making. Different investor businesses may encounter distinct ESG challenges, necessitating individualized strategies. Companies should improve their ESG disclosures, which increasingly focus on environmental and social issues, to meet investor expectations and bolster transparency.
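The abstract describes fine-tuning ~128M-parameter classification pre-trained transformers (CPTs) to tag annual-report text with 13 ESG-related topics. Below is a minimal sketch of that kind of pipeline using Hugging Face transformers with bert-base-uncased (~110M parameters) as a stand-in backbone; the topic list, model choice, and the classify_sentence helper are illustrative assumptions, not the paper's actual CPT setup.

```python
# Hypothetical sketch of an ESG-topic classifier in the spirit of the
# paper's CPT models. The 13 topics below are assumed for illustration;
# the paper's exact label set and training data are not given here.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

ESG_TOPICS = [
    "climate", "pollution", "resources", "biodiversity",   # environmental
    "labor", "community", "health", "human_rights",        # social
    "board", "ethics", "transparency", "risk",             # governance
    "economic",                                            # economic aspect
]

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
# The classification head is randomly initialized here; it would need
# fine-tuning on labeled report sentences before its outputs mean anything.
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=len(ESG_TOPICS)
)

def classify_sentence(text: str) -> str:
    """Return the most likely ESG topic for one annual-report sentence."""
    inputs = tokenizer(text, truncation=True, max_length=512,
                       return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    return ESG_TOPICS[int(logits.argmax(dim=-1))]

# A report-level "tendency" could then be estimated as the share of
# sentences assigned to each topic, aggregated over the whole report.
print(classify_sentence("We committed to net-zero emissions by 2040."))
```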
Pages: 2051-2071
Page count: 21
Related Papers
(50 in total)
  • [41] EAPT: An encrypted traffic classification model via adversarial pre-trained transformers
    Zhan, Mingming
    Yang, Jin
    Jia, Dongqing
    Fu, Geyuan
    COMPUTER NETWORKS, 2025, 257
  • [42] Detecting Propaganda Techniques in English News Articles using Pre-trained Transformers
    Abdullah, Malak
    Altiti, Ola
    Obiedat, Rasha
    2022 13TH INTERNATIONAL CONFERENCE ON INFORMATION AND COMMUNICATION SYSTEMS (ICICS), 2022, : 301 - 308
  • [43] CopiFilter: An Auxiliary Module Adapts Pre-trained Transformers for Medical Dialogue Summarization
    Duan, Jiaxin
    Liu, Junfei
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2023, PT IV, 2023, 14257 : 99 - 114
  • [44] Math-LLMs: AI Cyberinfrastructure with Pre-trained Transformers for Math Education
    Zhang, Fan
    Li, Chenglu
    Henkel, Owen
    Xing, Wanli
    Baral, Sami
    Heffernan, Neil
    Li, Hai
INTERNATIONAL JOURNAL OF ARTIFICIAL INTELLIGENCE IN EDUCATION, 2024
  • [45] Towards a Comprehensive Understanding and Accurate Evaluation of Societal Biases in Pre-Trained Transformers
    Silva, Andrew
    Tambwekar, Pradyumna
    Gombolay, Matthew
    2021 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL-HLT 2021), 2021, : 2383 - 2389
  • [46] Harnessing Generative Pre-Trained Transformers for Construction Accident Prediction with Saliency Visualization
    Yoo, Byunghee
    Kim, Jinwoo
    Park, Seongeun
    Ahn, Changbum R.
    Oh, Taekeun
APPLIED SCIENCES-BASEL, 2024, 14 (02)
  • [47] BERT-QPP: Contextualized Pre-trained Transformers for Query Performance Prediction
    Arabzadeh, Negar
    Khodabakhsh, Maryam
    Bagheri, Ebrahim
    PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, CIKM 2021, 2021, : 2857 - 2861
  • [48] PAIR: Planning and Iterative Refinement in Pre-trained Transformers for Long Text Generation
    Hua, Xinyu
    Wang, Lu
    PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 781 - 793
  • [49] Learning to Switch off, Switch on, and Integrate Modalities in Large Pre-trained Transformers
    Duseja, Tejas
    Annervaz, K. M.
    Duggani, Jeevithiesh
    Zacharia, Shyam
    Free, Michael
    Dukkipati, Ambedkar
    2024 IEEE 7TH INTERNATIONAL CONFERENCE ON MULTIMEDIA INFORMATION PROCESSING AND RETRIEVAL, MIPR 2024, 2024, : 403 - 409
  • [50] ProdRev: A DNN framework for empowering customers using generative pre-trained transformers
    Gupta, Aakash
    Das, Nataraj
    2022 INTERNATIONAL CONFERENCE ON DECISION AID SCIENCES AND APPLICATIONS (DASA), 2022, : 895 - 899