Isotonic Modeling with Non-Differentiable Loss Functions with Application to Lasso Regularization

Cited by: 6
Authors
Painsky, Amichai [1]
Rosset, Saharon [1]
Affiliations
[1] Tel Aviv Univ, Sch Math Sci, IL-6997801 Ramat Aviv, Israel
Keywords
Isotonic regression; nonparametric regression; regularization path; GIRP; convex optimization; regression; freedom
DOI
10.1109/TPAMI.2015.2441063
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
In this paper, we present an algorithmic approach for fitting isotonic models under convex but non-differentiable loss functions. It generalizes the greedy non-regret approach proposed by Luss and Rosset (2014) for differentiable loss functions, incorporating the required subgradient extensions. We prove that our suggested algorithm solves the isotonic modeling problem while maintaining favorable computational and statistical properties. As our suggested algorithm may be used for any non-differentiable loss function, we focus on isotonic modeling for either regression or two-class classification with the appropriate log-likelihood loss and a lasso penalty on the fitted values. This combination allows us to maintain the non-parametric nature of isotonic modeling while controlling model complexity through regularization. We demonstrate the efficiency and usefulness of this approach on both synthetic and real-world data. An implementation of our suggested solution is publicly available from the first author's website (https://sites.google.com/site/amichaipainsky/software).
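For orientation, the underlying unregularized problem (fit non-decreasing values minimizing a loss to the observations) can be sketched with the classical pool-adjacent-violators algorithm (PAVA) for squared loss. This is only a minimal illustration of the isotonic fitting problem, not the paper's GIRP-based algorithm, which handles non-differentiable losses and the lasso penalty:

```python
def isotonic_fit(y):
    """Return the non-decreasing fit minimizing sum((y_i - f_i)^2).

    Minimal pool-adjacent-violators (PAVA) sketch for squared loss;
    NOT the paper's method, which also covers non-differentiable
    losses and lasso-regularized fitted values.
    """
    # Each block pools a run of observations; store [sum, count].
    blocks = []
    for v in y:
        blocks.append([float(v), 1])
        # Merge adjacent blocks while their means violate monotonicity
        # (compare means via cross-multiplication to avoid division).
        while len(blocks) > 1 and blocks[-2][0] * blocks[-1][1] > blocks[-1][0] * blocks[-2][1]:
            s, c = blocks.pop()
            blocks[-1][0] += s
            blocks[-1][1] += c
    # Expand each pooled block back to its member positions.
    fit = []
    for s, c in blocks:
        fit.extend([s / c] * c)
    return fit
```

For example, `isotonic_fit([1, 3, 2])` pools the violating pair (3, 2) into their mean, yielding `[1.0, 2.5, 2.5]`; an already monotone input is returned unchanged.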
Pages: 308-321
Page count: 14