Investor's ESG tendency probed by pre-trained transformers

Cited: 0
Authors
Li, Chao
Keeley, Alexander Ryota
Takeda, Shutaro
Seki, Daikichi
Managi, Shunsuke [1 ]
Affiliations
[1] Kyushu Univ, 744 Motooka,Nishi Ku, Fukuoka 8190395, Japan
Keywords
data mining; ESG; investor; machine learning; natural language processing; pre-trained transformer; corporate social responsibility; premature mortality; textual analysis; sustainability; pollution
DOI
10.1002/csr.3055
Chinese Library Classification
F [Economics]
Discipline code
02
Abstract
Due to climate change and social issues, environmental, social, and governance (ESG) solutions are receiving increased attention and emphasis. As influential market leaders, investors wield significant power to persuade companies to prioritize ESG considerations. However, investors' preferences for specific ESG topics, and how those preferences are changing, remain elusive. Here, we build a group of large language models with 128 million parameters, named classification pre-trained transformers (CPTs), to extract investors' tendencies toward 13 ESG-related topics from their annual reports. Assisted by the CPT models, which achieve approximately 95% cross-validation accuracy, we analyze more than 3000 annual reports released by 350 of the world's top investors during 2010-2021. Results indicate that although investors show the strongest tendency toward the economic aspect in their annual reports, this emphasis is gradually declining and shifting toward environmental and social aspects. Nonfinancial investors, such as corporations and holding companies, prioritize environmental and social factors, whereas financial investors pay the most attention to governance risk. Patterns also differ at the country level; for instance, Japanese investors focus more on environmental and social factors than those in other major countries. Our findings suggest that investors increasingly value sustainability in their decision-making. Different investor businesses may encounter distinct ESG challenges, necessitating individualized strategies. Companies should improve their ESG disclosures, which increasingly focus on environmental and social issues, to meet investor expectations and bolster transparency.
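The abstract describes classifying report text by ESG topic and aggregating those classifications into a per-report "tendency" profile. The minimal sketch below illustrates only the aggregation step; the `classify_sentence` stub, its keyword lists, and the four topic labels are hypothetical stand-ins (the paper's actual CPT models and its 13 topic labels are not given in this record).

```python
from collections import Counter

# Hypothetical topic labels; the paper uses 13 ESG-related topics,
# whose exact names are not stated in the abstract.
TOPICS = ["environmental", "social", "governance", "economic"]

def classify_sentence(sentence: str) -> str:
    """Stand-in for the paper's CPT classifier: a trivial keyword
    lookup used only to make the aggregation step runnable."""
    keywords = {
        "environmental": ["climate", "emission", "energy"],
        "social": ["community", "employee", "diversity"],
        "governance": ["board", "audit", "compliance"],
    }
    lower = sentence.lower()
    for topic, words in keywords.items():
        if any(w in lower for w in words):
            return topic
    return "economic"  # default bucket in this sketch

def tendency_profile(sentences):
    """Share of sentences assigned to each topic, i.e. the kind of
    report-level tendency vector the abstract describes."""
    counts = Counter(classify_sentence(s) for s in sentences)
    total = sum(counts.values()) or 1
    return {t: counts.get(t, 0) / total for t in TOPICS}

report = [
    "We reduced carbon emissions across the portfolio.",
    "Board oversight and audit procedures were strengthened.",
    "Returns exceeded the benchmark this fiscal year.",
    "Employee wellbeing programs were expanded.",
]
print(tendency_profile(report))
```

Comparing such profiles across years, investor types, or countries would yield the trend analyses summarized in the abstract.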
Pages: 2051-2071
Page count: 21