Improving the speed of support vector regression using regularized least square regression

Cited: 0
Authors
Pirmard S.S. [1]
Forghani Y. [2]
Affiliations
[1] Computer Department, Imam Reza International University, Mashhad
[2] Computer Department, Islamic Azad University, Mashhad Branch, Mashhad
Source
Ingenierie des Systemes d'Information | 2020, Vol. 25, No. 4
Keywords
Function estimation; Regularized least square (RLS); Runtime; ε-insensitive support vector regression (ε-SVR)
DOI
10.18280/isi.250404
Abstract
The Regularized Least Square (RLS) method is one of the fastest function estimation methods, but it is highly sensitive to noise. In contrast, ε-insensitive Support Vector Regression (ε-SVR) is robust to noise but has a poor runtime. ε-SVR assumes that the noise level is at most ε; the center of a tube of radius ε, determined so that the training data lie inside the tube, serves as the estimated function, which makes the method robust to such noisy data. In this paper, to improve the runtime of ε-SVR, an initial estimated function is first obtained using the RLS method. Then, unlike the ε-SVR model, which uses all the data to determine both the lower and upper limits of the tube, the proposed method uses the initial estimated function to determine the tube and the final estimated function. Strictly speaking, the data below and above the initial estimated function are used to estimate the upper and lower limits of the tube, respectively. The number of model constraints, and consequently the model runtime, is thus reduced. Experiments carried out on 15 benchmark data sets confirm that the proposed method is faster than ε-SVR, ε-TSVR and pair v-SVR, and that its accuracy is comparable with that of ε-SVR, ε-TSVR and pair v-SVR. © 2020 International Information and Engineering Technology Association. All rights reserved.
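The two-stage idea in the abstract can be illustrated with a minimal sketch: first fit a fast closed-form RLS (ridge) estimate, then split the training points by which side of that initial estimate they fall on, so that each tube boundary would later be fitted from only about half of the constraints. This is my own simplified illustration (basis, variable names, and data are hypothetical), not the authors' implementation.

```python
import numpy as np

# Stage 0: synthetic 1-D regression data (illustrative only).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=200)
y = np.sin(X) + rng.normal(0.0, 0.1, size=200)

# Stage 1: RLS (ridge regression) on a simple polynomial basis,
# solved in closed form: w = (Phi^T Phi + lam*I)^{-1} Phi^T y.
Phi = np.vander(X, N=6)                     # polynomial features, an assumption
lam = 1e-2                                  # regularization strength
w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)
f0 = Phi @ w                                # initial estimated function

# Stage 2: partition the data by the side of the initial estimate.
# Each tube limit is then estimated from one subset only, so each
# sub-problem carries roughly half the constraints of full ε-SVR.
above = y >= f0
below = ~above
print("points above:", int(above.sum()), "points below:", int(below.sum()))
```

Since solving a quadratic program scales super-linearly in the number of constraints, halving the constraints per sub-problem is what yields the claimed speedup over standard ε-SVR.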
Pages: 427-435
Page count: 8