Personality BERT: A Transformer-Based Model for Personality Detection from Textual Data

Times Cited: 8
Authors
Jain, Dipika [1]
Kumar, Akshi [2]
Beniwal, Rohit [1]
Affiliations
[1] Delhi Technological University, Department of Computer Science & Engineering, New Delhi, India
[2] Netaji Subhas University of Technology, Department of Information Technology, New Delhi, India
Keywords
Personality; BERT; Text; Classification
DOI
10.1007/978-981-19-0604-6_48
Chinese Library Classification
TP39 [Applications of Computers]
Discipline Codes
081203; 0835
Abstract
Understanding personality type can aid in understanding people's preferences and the cognitive processes associated with them. Automated personality detection can considerably help NLP experts and psychoanalysts identify the dominant or distinguishing qualities of a person. At its most basic level, personality is expressed through a person's temperament or emotional tone. Pertinent studies validate linguistic cues in written and spoken text as a coherent and consistent means of assessing and interpreting personality. With the proliferation of social media applications, the psycholinguistic markers in users' online posts can facilitate comprehending variations in personality. Transformer models have emerged as a new generation of NLP models and are already being applied to a wide array of NLP use cases. This research puts forward a transformer-based model for personality detection from textual data. The proposed Personality BERT is a textual modality-specific deep neural model that fine-tunes pretrained Bidirectional Encoder Representations from Transformers (BERT) for the personality classification task. Kaggle's MBTI dataset is used to evaluate and validate the proposed model. An F1 score of 0.6945 is reported.
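The sketch below illustrates the general approach the abstract describes: fine-tuning a pretrained BERT encoder with a classification head on Kaggle's MBTI dataset using the Hugging Face transformers library. It is a minimal illustration, not the authors' implementation; the file name, column names, and hyperparameters are assumptions.

```python
# Minimal sketch (not the paper's exact code): fine-tune BERT for 16-way MBTI
# personality classification. File name, columns, and hyperparameters are assumed.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import LabelEncoder
from datasets import Dataset
from transformers import (BertTokenizerFast, BertForSequenceClassification,
                          Trainer, TrainingArguments)

# Kaggle's MBTI dataset is commonly distributed as a CSV with a 16-class
# `type` label and a `posts` text column (file name assumed here).
df = pd.read_csv("mbti_1.csv")
labels = LabelEncoder().fit_transform(df["type"]).tolist()
train_texts, test_texts, train_y, test_y = train_test_split(
    df["posts"].tolist(), labels, test_size=0.2, random_state=42)

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")

def tokenize(batch):
    # Truncate/pad each post bundle to BERT's 512-token limit.
    return tokenizer(batch["text"], truncation=True,
                     padding="max_length", max_length=512)

train_ds = Dataset.from_dict({"text": train_texts, "label": train_y}).map(tokenize, batched=True)
test_ds = Dataset.from_dict({"text": test_texts, "label": test_y}).map(tokenize, batched=True)

# A single softmax classification head over the 16 MBTI types on top of BERT.
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=16)

args = TrainingArguments(output_dir="personality-bert", num_train_epochs=3,
                         per_device_train_batch_size=8, learning_rate=2e-5)
Trainer(model=model, args=args, train_dataset=train_ds,
        eval_dataset=test_ds).train()
```

In this setup the 16 MBTI types are treated as a single multi-class label; an alternative design, also common in the personality-detection literature, is to train four binary classifiers, one per MBTI dichotomy.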
Pages: 515-522
Number of Pages: 8