On the Impact of Data Sampling on Hyper-parameter Optimisation of Recommendation Algorithms

Cited by: 2
Authors
Montanari, Matteo [1 ]
Bernardis, Cesare [1 ]
Cremonesi, Paolo [1 ]
Affiliations
[1] Politecnico di Milano, Milan, Italy
Keywords
Recommender Systems; Optimisation; Hyper-parameter; Sampling
DOI
10.1145/3477314.3507158
CLC Number
TP39 [Computer Applications]
Subject Classification Numbers
081203; 0835
Abstract
Hyper-parameter optimisation (HPO) is a fundamental task that must be performed to achieve the highest accuracy a recommendation algorithm can provide. In the recent past, with the growth of dataset sizes, the amount of resources and time needed to perform the optimisation has increased dramatically. Sampling the data used during the HPO procedure reduces the required resources, but it also affects the accuracy metric scores. In this paper, we study the effects of optimising the hyper-parameters through a random search while sampling the users in a dataset. The results of our experiments show that sampling reduces the time needed to conduct HPO, but it also influences the accuracy of the best configuration found by HPO in different ways, depending on the algorithm being optimised and the dataset selected.
Pages: 1399 - 1402
Number of pages: 4
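
Illustrative sketch (not from the paper): the abstract describes a random-search HPO procedure run on a user-sampled subset of the interaction data. The minimal, self-contained Python sketch below shows one way such a procedure could look. The helper names (sample_users, evaluate_config), the item-kNN style search space (k, shrink) and the sampling rate are illustrative assumptions, not the authors' actual experimental setup.

# Sketch, under the assumptions above: random-search HPO where only the
# interactions of a randomly sampled fraction of users are used during
# the optimisation.
import random

import numpy as np


def sample_users(interactions, user_sample_rate, rng):
    """Keep only the interactions of a random fraction of users."""
    users = np.unique(interactions[:, 0])
    n_keep = max(1, int(len(users) * user_sample_rate))
    kept = set(rng.choice(users, size=n_keep, replace=False))
    mask = np.array([u in kept for u in interactions[:, 0]])
    return interactions[mask]


def evaluate_config(train, config):
    """Placeholder for 'train recommender + measure validation accuracy'.

    A real study would fit e.g. an item-kNN or matrix-factorisation model
    here and return a ranking metric such as NDCG or MAP; a random score
    is returned only so the sketch runs end-to-end.
    """
    return random.random()


def random_search_hpo(interactions, n_trials=50, user_sample_rate=1.0, seed=42):
    rng = np.random.default_rng(seed)
    train = sample_users(interactions, user_sample_rate, rng)

    best_score, best_config = -np.inf, None
    for _ in range(n_trials):
        # Draw one configuration uniformly at random from the search space.
        config = {
            "k": int(rng.integers(5, 800)),        # neighbourhood size
            "shrink": int(rng.integers(0, 1000)),  # similarity shrinkage
        }
        score = evaluate_config(train, config)
        if score > best_score:
            best_score, best_config = score, config
    return best_config, best_score


if __name__ == "__main__":
    # Toy (user, item) interaction pairs standing in for a real dataset.
    toy = np.array([[u, i] for u in range(100) for i in range(u % 7)])
    cfg, score = random_search_hpo(toy, n_trials=20, user_sample_rate=0.2)
    print(cfg, score)

Lowering user_sample_rate below 1.0 is what trades optimisation time against the quality of the configuration the search returns, which is the trade-off the paper investigates.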