Factors related to user perceptions of artificial intelligence (AI)-based content moderation on social media

Cited: 2
Authors
Wang, Sai [1 ]
Affiliations
[1] Hong Kong Baptist Univ, Sch Commun, Dept Interact Media, Kowloon, 5 Hereford Rd, Hong Kong, Peoples R China
Keywords
Content moderation; Algorithms; Artificial intelligence; Social media; Misinformation; VALUE PREDISPOSITIONS; PUBLIC PERCEPTIONS; SUPPORT; NANOTECHNOLOGY; TRUST; SCIENCE;
DOI
10.1016/j.chb.2023.107971
Chinese Library Classification (CLC)
B84 [Psychology];
Discipline codes
04; 0402;
Abstract
Artificial intelligence (AI)-based moderation systems are increasingly used by social media companies to identify and remove inappropriate user-generated content (e.g., misinformation) on their platforms. Previous research on AI moderation has focused primarily on situational and technological factors in predicting users' perceptions, while little is known about the role of individual characteristics. To bridge this gap, this study examined whether and how familiarity, political ideology, and algorithm acceptance are related to perceptions of AI moderation. Analyzing survey data from a nationally representative panel in the United States (N = 4562), we found that individuals who were more familiar with AI moderation expressed less favorable perceptions of it. Those who identified as liberals were more likely to view AI moderation positively than those who identified as conservatives, and higher algorithm acceptance was associated with more favorable perceptions. Moreover, trust in AI moderation significantly mediated the relationship between these three individual characteristics (familiarity, political ideology, and algorithm acceptance) and perceptions. The findings enrich the current understanding of user responses to AI moderation and offer practical implications for policymakers and designers.
Pages: 9
Related Papers (50 total)
  • [1] A Survey of Artificial Intelligence Techniques for User Perceptions' Extraction from Social Media Data
    Shaikh, Sarang
    Yayilgan, Sule Yildirim
    Zoto, Erjon
    Abomhara, Mohamed
    INTELLIGENT COMPUTING, VOL 2, 2022, 507 : 627 - 655
  • [2] Commercial Versus Volunteer: Comparing User Perceptions of Toxicity and Transparency in Content Moderation Across Social Media Platforms
    Cook, Christine L.
    Patel, Aashka
    Wohn, Donghee Yvette
    FRONTIERS IN HUMAN DYNAMICS, 2021, 3
  • [3] Personalizing Content Moderation on Social Media: User Perspectives on Moderation Choices, Interface Design, and Labor
    Jhaver, S.
    Zhang, A. Q.
    Chen, Q. Z.
    Natarajan, N.
    Wang, R.
    Zhang, A. X.
    Proceedings of the ACM on Human-Computer Interaction, 2023, 7 (CSCW2)
  • [4] Censored, suspended, shadowbanned: User interpretations of content moderation on social media platforms
    West, Sarah Myers
    NEW MEDIA & SOCIETY, 2018, 20 (11) : 4366 - 4383
  • [5] Automatic content moderation on social media
    Karabulut, Dogus
    Ozcinar, Cagri
    Anbarjafari, Gholamreza
    MULTIMEDIA TOOLS AND APPLICATIONS, 2023, 82 (03) : 4439 - 4463
  • [6] Content moderation and advertising in social media platforms
    Madio, Leonardo
    Quinn, Martin
    JOURNAL OF ECONOMICS & MANAGEMENT STRATEGY, 2024,
  • [7] Artificial Intelligence (AI) in Advertising: Understanding and Schematizing the Behaviors of Social Media Users
    Argan, Metin
    Dinc, Halime
    Kaya, Sabri
    Argan, Mehpare Tokay
    ADCAIJ-ADVANCES IN DISTRIBUTED COMPUTING AND ARTIFICIAL INTELLIGENCE JOURNAL, 2022, 11 (03): : 331 - 348
  • [8] Perceptions of AI Ethics on Social Media
    Ocal, Ayse
    2023 IEEE INTERNATIONAL SYMPOSIUM ON ETHICS IN ENGINEERING, SCIENCE, AND TECHNOLOGY, ETHICS, 2023,
  • [9] A Unified Generative Artificial Intelligence Approach for Converting Social Media Content
    Bonde, Lossan
    Dembele, Severin
    2024 7TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE, BIG DATA, COMPUTING AND DATA COMMUNICATION SYSTEMS, ICABCD 2024, 2024,