Adaptive nonparametric regression with the K-nearest neighbour fused lasso

Cited by: 16
Authors
Padilla, Oscar Hernan Madrid [1 ]
Sharpnack, James [2 ]
Chen, Yanzhen [3 ]
Witten, Daniela M. [4 ]
Affiliations
[1] Univ Calif Los Angeles, Dept Stat, 520 Portola Plaza, Los Angeles, CA 90095 USA
[2] Univ Calif Davis, Dept Stat, One Shields Ave, Davis, CA 95616 USA
[3] Hong Kong Univ Sci & Technol, Dept Informat Syst Business Stat & Operat Managem, Kowloon, Clear Water Bay, Hong Kong, Peoples R China
[4] Univ Washington, Dept Stat, Seattle, WA 98195 USA
Funding
US National Science Foundation; US National Institutes of Health
Keywords
Fused lasso; Local adaptivity; Manifold adaptivity; Nonparametric regression; Total variation; Minimization; Framework; Algorithm; Path
DOI
10.1093/biomet/asz071
Chinese Library Classification
Q [Biological Sciences]
Subject Classification Codes
07; 0710; 09
Abstract
The fused lasso, also known as total-variation denoising, is a locally adaptive function estimator over a regular grid of design points. In this article, we extend the fused lasso to settings in which the points do not occur on a regular grid, leading to a method for nonparametric regression. This approach, which we call the K-nearest-neighbours fused lasso, involves computing the K-nearest-neighbours graph of the design points and then performing the fused lasso over this graph. We show that this procedure has a number of theoretical advantages over competing methods: specifically, it inherits local adaptivity from its connection to the fused lasso, and it inherits manifold adaptivity from its connection to the K-nearest-neighbours approach. In a simulation study and an application to flu data, we show that excellent results are obtained. For completeness, we also study an estimator that makes use of an ε-graph rather than a K-nearest-neighbours graph and contrast it with the K-nearest-neighbours fused lasso.
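The abstract's two-step recipe (build the K-nearest-neighbours graph of the design points, then solve the fused lasso over that graph) can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the helper names `knn_graph_edges` and `knn_fused_lasso` are hypothetical, and a plain subgradient solver stands in for the exact graph fused lasso algorithms the paper relies on.

```python
import numpy as np

def knn_graph_edges(X, k):
    """Symmetric K-nearest-neighbour edge set of the design points X (brute force)."""
    n = X.shape[0]
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(D, np.inf)                  # exclude self-neighbours
    edges = set()
    for i in range(n):
        for j in np.argsort(D[i])[:k]:
            edges.add((min(i, int(j)), max(i, int(j))))
    return np.array(sorted(edges))

def knn_fused_lasso(y, edges, lam, n_iter=3000, lr0=0.1):
    """Subgradient descent (a simple stand-in solver) on the K-NN fused lasso
    objective  (1/2)||y - theta||^2 + lam * sum_{(i,j) in E} |theta_i - theta_j|."""
    theta = y.astype(float).copy()
    i, j = edges[:, 0], edges[:, 1]
    best, best_obj = theta.copy(), np.inf
    for t in range(n_iter):
        s = np.sign(theta[i] - theta[j])         # subgradient of the edge penalties
        grad = theta - y
        np.add.at(grad, i, lam * s)
        np.add.at(grad, j, -lam * s)
        theta = theta - (lr0 / np.sqrt(t + 1.0)) * grad   # diminishing step size
        obj = 0.5 * np.sum((theta - y) ** 2) + lam * np.sum(np.abs(theta[i] - theta[j]))
        if obj < best_obj:                        # keep the best iterate seen
            best, best_obj = theta.copy(), obj
    return best

# Example: noisy piecewise-constant signal over random 2-d design points.
rng = np.random.default_rng(0)
X = rng.uniform(size=(80, 2))
f = np.where(X[:, 0] < 0.5, 0.0, 2.0)            # true regression function
y = f + 0.3 * rng.normal(size=80)
edges = knn_graph_edges(X, k=5)
theta_hat = knn_fused_lasso(y, edges, lam=0.5)
```

The estimate `theta_hat` is piecewise smooth over the K-NN graph: differences across edges are shrunk toward zero, which is the source of the local adaptivity the abstract describes.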
Pages: 293–310
Page count: 18
Related Papers
50 records in total
  • [1] k-Nearest Neighbour method in functional nonparametric regression
    Burba, Florent
    Ferraty, Frederic
    Vieu, Philippe
    JOURNAL OF NONPARAMETRIC STATISTICS, 2009, 21 (04) : 453 - 469
  • [2] Adaptive K-nearest neighbour algorithm for WiFi fingerprint positioning
    Oh, Jongtaek
    Kim, Jisu
    ICT EXPRESS, 2018, 4 (02): : 91 - 94
  • [3] ASYMPTOTIC DISTRIBUTION OF ROBUST k-NEAREST NEIGHBOUR ESTIMATOR FOR FUNCTIONAL NONPARAMETRIC MODELS
    Attouch, Mohammed Kadi
    Benchikh, Tawfik
    MATEMATICKI VESNIK, 2012, 64 (04): : 275 - 285
  • [4] Size of wallet estimation: Application of K-nearest neighbour and quantile regression
    Jhamtani, Aashish
    Mehta, Ritu
    Singh, Sanjeet
    IIMB MANAGEMENT REVIEW, 2021, 33 (03) : 184 - 190
  • [5] Balanced k-nearest neighbour imputation
    Hasler, Caren
    Tille, Yves
    STATISTICS, 2016, 50 (06) : 1310 - 1331
  • [6] k-Nearest Neighbour Classifiers - A Tutorial
    Cunningham, Padraig
    Delany, Sarah Jane
    ACM COMPUTING SURVEYS, 2021, 54 (06)
  • [7] Learning K-Nearest Neighbour Regression for Noisy Dataset with Application in Indoor Localization
    Sheikholeslami, Nima
    Valaee, Shahrokh
    2021 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM), 2021,
  • [8] A binary neural k-nearest neighbour technique
    Victoria J. Hodge
    Jim Austin
    Knowledge and Information Systems, 2005, 8 : 276 - 291
  • [9] A stacking weighted k-Nearest neighbour with thresholding
    Rastin, Niloofar
    Taheri, Mohammad
    Jahromi, Mansoor Zolghadri
    INFORMATION SCIENCES, 2021, 571 : 605 - 622
  • [10] Exact bagging with k-nearest neighbour classifiers
    Caprile, B
    Merler, S
    Furlanello, C
    Jurman, G
    MULTIPLE CLASSIFIER SYSTEMS, PROCEEDINGS, 2004, 3077 : 72 - 81