Investor's ESG tendency probed by pre-trained transformers

Cited by: 0
Authors
Li, Chao
Keeley, Alexander Ryota
Takeda, Shutaro
Seki, Daikichi
Managi, Shunsuke [1 ]
Institution
[1] Kyushu Univ, 744 Motooka, Nishi-ku, Fukuoka 819-0395, Japan
Keywords
data mining; ESG; investor; machine learning; natural language processing; pre-trained transformer; corporate social responsibility; premature mortality; textual analysis; sustainability; pollution
DOI
10.1002/csr.3055
Chinese Library Classification
F [Economics]
Subject Classification Code
02
Abstract
Amid climate change and pressing social issues, environmental, social, and governance (ESG) solutions are receiving increased attention and emphasis. As influential market leaders, investors wield significant power to persuade companies to prioritize ESG considerations. However, investors' preferences for specific ESG topics, and changing trends in those preferences, remain elusive. Here, we build a group of large language models with 128 million parameters, named classification pre-trained transformers (CPTs), to extract investors' tendencies toward 13 ESG-related topics from their annual reports. Using the CPT models, which achieve approximately 95% cross-validation accuracy, we analyze more than 3000 annual reports released by the world's 350 top investors during 2010-2021. Results indicate that although investors' annual reports show the strongest tendency toward economic topics, that emphasis is gradually declining and shifting toward environmental and social topics. Nonfinancial investors, such as corporations and holding companies, prioritize environmental and social factors, whereas financial investors pay the most attention to governance risk. Patterns also differ at the country level; for instance, Japanese investors focus more on environmental and social factors than investors in other major countries. Our findings suggest that investors increasingly value sustainability in their decision-making. Different investor business types may face distinct ESG challenges, necessitating tailored strategies. Companies should improve their ESG disclosures, which are increasingly focused on environmental and social issues, to meet investor expectations and bolster transparency.
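The pipeline described in the abstract is, at its core, sentence-level topic classification followed by per-report aggregation. The Python sketch below illustrates that idea under stated assumptions: it presumes a Hugging Face sequence-classification checkpoint already fine-tuned on 13 ESG topic labels. The checkpoint path, sample sentences, and frequency-based tendency measure are all illustrative assumptions, not the authors' released CPT models or code.

```python
# Minimal sketch of the abstract's approach: classify report sentences into
# ESG topics with a fine-tuned transformer, then aggregate predicted labels
# into per-report topic frequencies ("tendencies").
# NOTE: "path/to/esg-topic-classifier" is a hypothetical checkpoint assumed
# to be fine-tuned on 13 ESG topic labels; it is not the paper's model.
from collections import Counter

from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="path/to/esg-topic-classifier",  # assumed fine-tuned checkpoint
)

def topic_tendency(sentences: list[str]) -> dict[str, float]:
    """Return normalized topic frequencies over a report's sentences."""
    preds = classifier(sentences, truncation=True)  # [{"label": ..., "score": ...}, ...]
    counts = Counter(p["label"] for p in preds)
    total = sum(counts.values())
    return {topic: n / total for topic, n in counts.items()}

# Toy usage with two illustrative annual-report sentences.
report_sentences = [
    "We reduced scope 1 and 2 emissions by 12% year over year.",
    "Board independence remains central to our governance framework.",
]
print(topic_tendency(report_sentences))
```

Tracking such frequencies across a 2010-2021 series of reports would yield the kind of trend the study describes; the fine-tuning step on labeled ESG sentences, which the actual CPT models would require, is omitted here.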
Pages: 2051-2071
Page count: 21