Investor's ESG tendency probed by pre-trained transformers

Cited: 0
Authors
Li, Chao
Keeley, Alexander Ryota
Takeda, Shutaro
Seki, Daikichi
Managi, Shunsuke [1 ]
Affiliations
[1] Kyushu Univ, 744 Motooka, Nishi-ku, Fukuoka 819-0395, Japan
Keywords
data mining; ESG; investor; machine learning; natural language processing; pre-trained transformer; corporate social responsibility; premature mortality; textual analysis; sustainability; pollution
DOI
10.1002/csr.3055
CLC number
F [Economics]
Subject classification code
02
Abstract
Due to climate change and mounting social concerns, environmental, social, and governance (ESG) solutions are receiving growing attention. As influential market leaders, investors wield significant power to persuade companies to prioritize ESG considerations. However, investors' preferences for specific ESG topics, and how those preferences are changing, remain elusive. Here, we build a group of large language models with 128 million parameters, named classification pre-trained transformers (CPTs), to extract investors' tendencies toward 13 ESG-related topics from their annual reports. Using the CPT models, which achieve approximately 95% cross-validation accuracy, we analyze more than 3000 annual reports released by 350 top global investors during 2010-2021. The results indicate that although investors show the strongest tendency toward the economic aspect in their annual reports, this emphasis is gradually declining and shifting toward environmental and social aspects. Nonfinancial investors, such as corporations and holding companies, prioritize environmental and social factors, whereas financial investors pay the most attention to governance risk. Patterns also differ at the country level; for instance, Japanese investors focus more on environmental and social factors than their counterparts in other major countries. Our findings suggest that investors increasingly value sustainability in their decision-making. Different investor businesses may face distinct ESG challenges, necessitating individualized strategies. Companies should improve their ESG disclosures, with greater focus on environmental and social issues, to meet investor expectations and bolster transparency.
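As a minimal sketch of the approach the abstract describes (fine-tuning a pre-trained transformer to label report text with one of 13 ESG-related topics), the following Python snippet uses the Hugging Face transformers library. The checkpoint, topic names, and helper function are illustrative assumptions, not the authors' actual CPT configuration; the model below would also need fine-tuning on labeled annual-report sentences before it could approach the reported ~95% cross-validation accuracy.

# Sketch of an ESG-topic sentence classifier built on a pre-trained
# transformer, in the spirit of the paper's CPT models. All names and
# labels here are illustrative assumptions.
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

# Hypothetical 13-topic taxonomy; the paper's exact label set is not
# reproduced here.
ESG_TOPICS = [
    "economic", "climate", "pollution", "resources", "biodiversity",
    "employees", "community", "human_rights", "product_safety",
    "board", "ethics", "transparency", "shareholder_rights",
]

# bert-base-uncased has roughly 110M parameters, close in scale to the
# ~128M reported for the CPT models.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=len(ESG_TOPICS)
)

def classify_sentence(sentence: str) -> str:
    """Assign one ESG topic to a sentence from an annual report.

    Note: the classification head above is freshly initialized, so
    outputs are arbitrary until the model is fine-tuned on labeled
    annual-report text.
    """
    inputs = tokenizer(sentence, truncation=True, max_length=512,
                       return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    return ESG_TOPICS[int(logits.argmax(dim=-1))]

print(classify_sentence(
    "We continue to reduce the carbon footprint of our portfolio."
))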
Pages: 2051-2071
Number of pages: 21