Assessing Algorithmic Biases for Musical Version Identification

Cited by: 0
Authors
Yesiler, Furkan [1 ]
Miron, Marius [1 ]
Serra, Joan [2 ]
Gomez, Emilia [3 ]
Affiliations
[1] Pompeu Fabra Univ, Mus Technol Grp, Barcelona, Spain
[2] Dolby Labs, Barcelona, Spain
[3] European Commiss, Joint Res Ctr, Seville, Spain
Funding
European Union Horizon 2020;
Keywords
information retrieval; version identification; algorithmic bias;
DOI
10.1145/3488560.3498397
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Version identification (VI) systems now offer accurate and scalable solutions for detecting different renditions of a musical composition, allowing the use of these systems in industrial applications and throughout the wider music ecosystem. Such use can have an important impact on various stakeholders regarding recognition and financial benefits, including how royalties are circulated for digital rights management. In this work, we take a step toward acknowledging this impact and consider VI systems as socio-technical systems rather than isolated technologies. We propose a framework for quantifying performance disparities across 5 systems and 6 relevant side attributes: gender, popularity, country, language, year, and prevalence. We also consider 3 main stakeholders for this particular information retrieval use case: the performing artists of query tracks, those of reference (original) tracks, and the composers. By categorizing the recordings in our dataset using such attributes and stakeholders, we analyze whether the considered VI systems show any implicit biases. We find signs of disparities in identification performance for most of the groups we include in our analyses. We also find that learning- and rule-based systems behave differently for some attributes, which suggests an additional dimension to consider along with accuracy and scalability when evaluating VI systems. Lastly, we share our dataset to encourage VI researchers to take these aspects into account while building new systems.
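The abstract describes the evaluation idea at a high level: group query recordings by a side attribute (e.g., gender or language) and compare identification performance across the resulting groups. The minimal Python sketch below illustrates that kind of per-group comparison using mean average precision and the gap between the best- and worst-served groups; it is not the authors' framework or code, and the function names and toy data are hypothetical.

```python
# Minimal sketch (not the paper's code): per-group performance disparities for
# a version-identification system. Assumes each query comes with a ranked list
# of binary relevance judgments and a side-attribute label (e.g., gender or
# language of the query artist); all names and data here are hypothetical.
from collections import defaultdict

def average_precision(ranked_relevance):
    """AP for one query, given ranked 0/1 relevance judgments."""
    hits, precisions = 0, []
    for rank, rel in enumerate(ranked_relevance, start=1):
        if rel:
            hits += 1
            precisions.append(hits / rank)
    return sum(precisions) / max(hits, 1)

def per_group_map(queries):
    """queries: iterable of (group_label, ranked_relevance) pairs."""
    ap_by_group = defaultdict(list)
    for group, ranked_relevance in queries:
        ap_by_group[group].append(average_precision(ranked_relevance))
    return {g: sum(aps) / len(aps) for g, aps in ap_by_group.items()}

# Toy example: two attribute groups served with different quality.
queries = [
    ("group_A", [1, 0, 1, 0, 0]),
    ("group_A", [1, 1, 0, 0, 0]),
    ("group_B", [0, 0, 1, 0, 1]),
    ("group_B", [0, 1, 0, 0, 0]),
]
maps = per_group_map(queries)
disparity = max(maps.values()) - min(maps.values())
print(maps, "MAP gap:", round(disparity, 3))
```

In the paper itself, such comparisons are made per attribute and per system over a shared dataset; the gap above only shows the general shape of the analysis.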
Pages: 1284 - 1290
Number of pages: 7
Related Papers
50 records in total
  • [41] Ethical perspective in artificial intelligence: decoding discriminatory biases in algorithmic decisions
    Charbonneau, Sandrine
    PHARES-REVUE PHILOSOPHIQUE ETUDIANTE DE L'UNIVERSITE LAVAL, 2019, 19 (02) : 53 - 72
  • [42] An adversarial training framework for mitigating algorithmic biases in clinical machine learning
    Yang, Jenny
    Soltan, Andrew A. S.
    Eyre, David W.
    Yang, Yang
    Clifton, David A.
    NPJ DIGITAL MEDICINE, 2023, 6 (01)
  • [44] Applying preference biases to conjunctive and disjunctive version spaces
    Smirnov, EN
    van den Herik, HJ
    ARTIFICIAL INTELLIGENCE: METHODOLOGY, SYSTEMS, APPLICATIONS, PROCEEDINGS, 2000, 1904 : 321 - 330
  • [45] Büchner 'Woyzeck' - Musical version by Robert Wilson
    Kalb, J
    THEATER, 2003, 33 (02) : 92 - 96
  • [46] 'JEFF WAYNE'S MUSICAL VERSION OF THE WAR OF THE WORLDS'
    [Anonymous]
    TLS-THE TIMES LITERARY SUPPLEMENT, 1978, (3983) : 888 - 888
  • [47] Concurrent musical pitch height biases judgment of visual brightness
    Hong, You Jeong
    Choi, Ahyeon
    Lee, Chae-Eun
    Cho, Woojae
    Yoon, Sumin
    Lee, Kyogu
    PSYCHOLOGY OF MUSIC, 2024,
  • [48] EFFECTS OF AUTHORITY FIGURE BIASES ON CHANGING JUDGMENTS OF MUSICAL EVENTS
    RADOCY, RE
    JOURNAL OF RESEARCH IN MUSIC EDUCATION, 1976, 24 (03) : 119 - 128
  • [49] Assessing the accuracy of algorithmic haplotype inference.
    Windemuth, A
    Salisbury, BA
    Judson, RS
    Stephens, JC
    AMERICAN JOURNAL OF HUMAN GENETICS, 2002, 71 (04) : 569 - 569
  • [50] Algorithmic management: Assessing the impacts of AI at work
    Kelly-Lyth, Aislinn
    Thomas, Anna
    EUROPEAN LABOUR LAW JOURNAL, 2023, 14 (02) : 230 - 252