The clustered Mallows model

Cited by: 0
Authors
Piancastelli, Luiza S. C. [1 ]
Friel, Nial [1 ,2 ]
Affiliations
[1] Univ Coll Dublin, Sch Math & Stat, Belfield, Ireland
[2] Insight Ctr Data Analyt, Dublin, Ireland
Funding
Science Foundation Ireland
Keywords
Mallows model; Ranking data; Bayesian learning; Clustering; Rank aggregation; Ranking; Inference
DOI
10.1007/s11222-024-10555-w
Chinese Library Classification
TP301 [Theory and Methods]
Discipline code
081202
Abstract
Rankings represent preferences that arise from situations where assessors arrange items, for example, in decreasing order of utility. Orderings of the item set are permutations ($\pi$) that reflect strict preferences. However, strict preference relations can be unrealistic for real data. Common traits among items can justify equal ranks, and assessors may attach different importance to the decisions that form $\pi$. In large item sets, assessors might prioritise certain items, rank others low, and express indifference towards the remaining ones. Rank aggregation may involve decisive judgments in some parts and ambiguity in others. In this paper, we extend the famous Mallows (Biometrika 44:114-130, 1957) model (MM) to accommodate item indifference. Grouping similar items motivates the proposed Clustered Mallows Model (CMM), an MM counterpart for tied ranks with ties learned from the data. The CMM provides the flexibility to combine strictness and indifference, describing rank collections as ordered clusters. Bayesian inference for the CMM is a doubly-intractable problem since the model's normalising constant is unavailable. We overcome this with a version of the exchange algorithm (Murray et al. in Proceedings of the 22nd Annual Conference on Uncertainty in Artificial Intelligence (UAI-06), 2006) and provide a pseudo-likelihood approximation as a computationally cheaper alternative. Analysis of two real-world ranking datasets is presented, showcasing the practical application of the CMM and highlighting scenarios where it offers advantages over alternative models.
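The abstract notes that posterior inference for the CMM is doubly intractable and is handled with a version of the exchange algorithm of Murray et al. (2006). The sketch below only illustrates that general idea on the simpler standard Mallows model with Kendall distance and a fixed identity consensus, not the CMM itself; the Exp(1) prior, the random-walk step size, and the function names (`sample_mallows`, `kendall_distance`, `exchange_sampler`) are illustrative assumptions rather than the paper's implementation.

```python
import numpy as np


def kendall_distance(ordering, reference):
    """Number of item pairs placed in opposite relative order in the two orderings."""
    n = len(ordering)
    pos = np.empty(n, dtype=int)
    pos[ordering] = np.arange(n)          # position of each item in `ordering`
    ref_pos = np.empty(n, dtype=int)
    ref_pos[reference] = np.arange(n)     # position of each item in `reference`
    return sum(
        1
        for a in range(n)
        for b in range(a + 1, n)
        if (pos[a] - pos[b]) * (ref_pos[a] - ref_pos[b]) < 0
    )


def sample_mallows(theta, n_items, rng):
    """Exact draw from the Mallows model with Kendall distance and identity
    consensus, via the repeated-insertion construction."""
    ordering = []
    for i in range(n_items):
        # Placing item i at slot j among the current i+1 slots creates (i - j)
        # discordances with the smaller items already inserted.
        weights = np.exp(-theta * (i - np.arange(i + 1)))
        j = rng.choice(i + 1, p=weights / weights.sum())
        ordering.insert(j, i)
    return np.array(ordering)


def exchange_sampler(data, n_iter=5000, step=0.25, seed=0):
    """Exchange algorithm targeting the posterior of the dispersion theta under
    an assumed Exp(1) prior; the intractable normalising constant cancels."""
    rng = np.random.default_rng(seed)
    n_items = len(data[0])
    identity = np.arange(n_items)
    s_obs = sum(kendall_distance(x, identity) for x in data)
    theta, draws = 1.0, []
    for _ in range(n_iter):
        theta_prop = theta + step * rng.normal()
        if theta_prop > 0.0:
            # One auxiliary ranking per observation, drawn exactly at theta_prop.
            s_aux = sum(
                kendall_distance(sample_mallows(theta_prop, n_items, rng), identity)
                for _ in data
            )
            # Log acceptance ratio: the auxiliary likelihood terms swap parameters,
            # so Z(theta) and Z(theta_prop) cancel; the last term is the Exp(1)
            # prior log-ratio.
            log_alpha = (theta - theta_prop) * (s_obs - s_aux) + (theta - theta_prop)
            if np.log(rng.uniform()) < log_alpha:
                theta = theta_prop
        draws.append(theta)
    return np.array(draws)


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    rankings = [sample_mallows(theta=0.7, n_items=8, rng=rng) for _ in range(30)]
    posterior = exchange_sampler(rankings, n_iter=2000)
    print("posterior mean of theta:", posterior[500:].mean())
```

In the paper's setting the unnormalised kernel and the auxiliary-data simulator would be those of the CMM rather than the plain Mallows model; this sketch only mirrors the structure of the exchange update in which the normalising constants cancel.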
Pages: 21
Related papers
50 records in total
  • [1] Mallows and generalized Mallows model for matchings
    Irurozki, Ekhine
    Calvo, Borja
    Lozano, Jose A.
    BERNOULLI, 2019, 25 (02) : 1160 - 1188
  • [2] On Optimality of Mallows Model Averaging
    Peng, Jingfu
    Li, Yang
    Yang, Yuhong
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2024,
  • [3] Identity testing for Mallows model
    Busa-Fekete, Robert
    Fotakis, Dimitris
    Szorenyi, Balazs
    Zampetakis, Manolis
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [4] On the sparsity of Mallows model averaging estimator
    Feng, Yang
    Liu, Qingfeng
    Okui, Ryo
    ECONOMICS LETTERS, 2020, 187
  • [5] The Postdoc Problem under the Mallows Model
    Liu, Xujun
    Milenkovic, Olgica
    2021 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2021, : 3214 - 3219
  • [6] Thermodynamic limit for the Mallows model on Sn
    Starr, Shannon
    JOURNAL OF MATHEMATICAL PHYSICS, 2009, 50 (09)
  • [7] Assortment Optimization Under the Mallows model
    Desir, Antoine
    Goyal, Vineet
    Jagabathula, Srikanth
    Segev, Danny
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 29 (NIPS 2016), 2016, 29
  • [8] Optimal Learning for Mallows Block Model
    Busa-Fekete, Robert
    Fotakis, Dimitris
    Szorenyi, Balazs
    Zampetakis, Manolis
    CONFERENCE ON LEARNING THEORY, VOL 99, 2019, 99
  • [9] Corrected Mallows criterion for model averaging
    Liao, Jun
    Zou, Guohua
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2020, 144
  • [10] On A Mallows-type Model For (Ranked) Choices
    Feng, Yifan
    Tang, Yuxuan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,