Adaptive nonparametric regression with the K-nearest neighbour fused lasso

Cited by: 16
Authors
Padilla, Oscar Hernan Madrid [1]
Sharpnack, James [2 ]
Chen, Yanzhen [3 ]
Witten, Daniela M. [4 ]
Affiliations
[1] Univ Calif Los Angeles, Dept Stat, 520 Portola Plaza, Los Angeles, CA 90095 USA
[2] Univ Calif Davis, Dept Stat, One Shields Ave, Davis, CA 95616 USA
[3] Hong Kong Univ Sci & Technol, Dept Informat Syst Business Stat & Operat Managem, Kowloon, Clear Water Bay, Hong Kong, Peoples R China
[4] Univ Washington, Dept Stat, Seattle, WA 98195 USA
Funding
US National Science Foundation; US National Institutes of Health;
Keywords
Fused lasso; Local adaptivity; Manifold adaptivity; Nonparametric regression; Total variation; MINIMIZATION; FRAMEWORK; ALGORITHM; PATH;
DOI
10.1093/biomet/asz071
Chinese Library Classification (CLC)
Q [Biological Sciences];
Discipline codes
07; 0710; 09;
Abstract
The fused lasso, also known as total-variation denoising, is a locally adaptive function estimator over a regular grid of design points. In this article, we extend the fused lasso to settings in which the points do not occur on a regular grid, leading to a method for nonparametric regression. This approach, which we call the K-nearest-neighbours fused lasso, involves computing the K-nearest-neighbours graph of the design points and then performing the fused lasso over this graph. We show that this procedure has a number of theoretical advantages over competing methods: specifically, it inherits local adaptivity from its connection to the fused lasso, and it inherits manifold adaptivity from its connection to the K-nearest-neighbours approach. In a simulation study and an application to flu data, the method yields excellent results. For completeness, we also study an estimator that uses an ε-graph rather than a K-nearest-neighbours graph, and contrast it with the K-nearest-neighbours fused lasso.
Pages: 293-310
Page count: 18
Related papers
50 in total
  • [41] An assessment of three variance estimators for the k-nearest neighbour technique
    Magnussen, Steen
    SILVA FENNICA, 2013, 47 (01)
  • [42] An improved k-nearest neighbour method to diagnose breast cancer
    Li, Qingbo
    Li, Wenjie
    Zhang, Jialin
    Xu, Zhi
    ANALYST, 2018, 143 (12) : 2807 - 2811
  • [43] Benchmarking k-nearest neighbour imputation with homogeneous Likert data
    Jonsson, Per
    Wohlin, Claes
    EMPIRICAL SOFTWARE ENGINEERING, 2006, 11 (03) : 463 - 489
  • [44] Rock image classification based on k-nearest neighbour voting
    Lepisto, L.
    Kunttu, I.
    Visa, A.
    IEE PROCEEDINGS-VISION IMAGE AND SIGNAL PROCESSING, 2006, 153 (04): : 475 - 482
  • [45] Encrypted Classification Using Secure K-Nearest Neighbour Computation
    Reddy, B. Pradeep Kumar
    Chatterjee, Ayantika
    SECURITY, PRIVACY, AND APPLIED CRYPTOGRAPHY ENGINEERING, SPACE 2019, 2019, 11947 : 176 - 194
  • [47] A DEPTH-BASED MODIFICATION OF THE K-NEAREST NEIGHBOUR METHOD
    Vencalek, Ondrej
    Hlubinka, Daniel
    KYBERNETIKA, 2021, 57 (01) : 15 - 37
  • [48] The k-Nearest Neighbour Join: Turbo Charging the KDD Process
    Christian Böhm
    Florian Krebs
    Knowledge and Information Systems, 2004, 6 : 728 - 749
  • [49] Improving the k-Nearest Neighbour Rule by an Evolutionary Voting Approach
    Garcia-Gutierrez, Jorge
    Mateos-Garcia, Daniel
    Riquelme-Santos, Jose C.
    HYBRID ARTIFICIAL INTELLIGENCE SYSTEMS, HAIS 2014, 2014, 8480 : 296 - 305
  • [50] Estimating individual tree growth with the k-nearest neighbour and k-most similar neighbour methods
    Sironen, S
    Kangas, A
    Maltamo, M
    Kangas, J
    SILVA FENNICA, 2001, 35 (04) : 453 - 467