Robust non-convex least squares loss function for regression with outliers

Cited by: 54
Authors
Wang, Kuaini [1 ]
Zhong, Ping [1 ]
Affiliations
[1] China Agr Univ, Coll Sci, Beijing 100083, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Least squares support vector regression; Robust; Iterative strategy; Loss function; DC program; SUPPORT VECTOR MACHINE; TIME-SERIES PREDICTION; STATISTICAL COMPARISONS; ALGORITHM; CLASSIFIERS; PARAMETERS; STRATEGY; MODELS; PSO;
DOI
10.1016/j.knosys.2014.08.003
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper, we propose a robust scheme for least squares support vector regression (LS-SVR), termed RLS-SVR, which employs a non-convex least squares loss function to overcome LS-SVR's sensitivity to outliers. The non-convex loss assigns a constant penalty to any large outlier. The proposed loss function can be expressed as a difference of convex functions (DC), so the resulting optimization problem is a DC program, which can be solved by the Concave-Convex Procedure (CCCP). RLS-SVR builds the regression function iteratively, solving a set of linear equations at each iteration. The proposed RLS-SVR includes the classical LS-SVR as a special case. Numerical experiments on both artificial and benchmark datasets confirm the promising performance of the proposed algorithm. (C) 2014 Elsevier B.V. All rights reserved.
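The abstract does not give the exact loss or the kernelized dual formulation used in the paper, so the following is only a minimal sketch of the general idea: a truncated squared loss (an assumed, illustrative choice) written as a difference of convex functions and minimized by CCCP, where each outer step reduces to one linear system. The linear primal model, the truncation threshold s, and the regularizer lam are assumptions for illustration, not the paper's RLS-SVR.

```python
import numpy as np

def robust_ls_regression(X, y, lam=1.0, s=2.0, max_iter=50, tol=1e-6):
    """CCCP sketch for min_w  sum_i min(r_i^2, s^2) + lam * ||w||^2,
    with residuals r_i = y_i - x_i^T w (illustrative, not the paper's loss).

    DC decomposition: min(r^2, s^2) = r^2 - max(r^2 - s^2, 0).
    The concave part -max(r^2 - s^2, 0) is linearized at the current iterate,
    so each CCCP step solves a single regularized least squares system.
    """
    n, d = X.shape
    w = np.zeros(d)
    A = X.T @ X + lam * np.eye(d)            # Hessian of the convex part (fixed)
    for _ in range(max_iter):
        r = y - X @ w                         # current residuals
        outliers = np.abs(r) > s              # points in the flat region of the loss
        # Linearizing the concave part removes the pull of flagged outliers:
        # (X^T X + lam I) w = X^T y - X_O^T r_O
        b = X.T @ y - X[outliers].T @ r[outliers]
        w_new = np.linalg.solve(A, b)
        if np.linalg.norm(w_new - w) < tol:
            return w_new
        w = w_new
    return w
```

With s set very large no point is ever flagged, and every step reduces to ordinary ridge regression, mirroring the statement that the robust scheme contains the classical (non-robust) least squares formulation as a special case.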
Pages: 290-302
Page count: 13