Fitting a least squares piecewise linear continuous curve in two dimensions

Cited by: 7
Authors
Kundu, S. [1]
Ubhaya, V. A. [2]
Affiliations
[1] Louisiana State Univ, Dept Comp Sci, Baton Rouge, LA 70803 USA
[2] N Dakota State Univ, Dept Comp Sci & Operat Res, Fargo, ND 58105 USA
Keywords
least squares regression; nonlinear regression; piecewise linear continuous curve; convexity; optimization; algorithms; complexity;
DOI
10.1016/S0898-1221(00)00337-0
Chinese Library Classification
O29 [Applied Mathematics];
Discipline Code
070104;
Abstract
An optimal piecewise linear continuous fit to a given set of $n$ data points $D = \{(x_i, y_i) : 1 \le i \le n\}$ in two dimensions consists of a continuous curve defined by $k$ linear segments $\{L_1, L_2, \dots, L_k\}$ which minimizes a weighted least squares error function with weight $w_i$ at $(x_i, y_i)$, where $k \ge 1$ is a given integer. A key difficulty here is the fact that the linear segment $L_j$, which approximates a subset of consecutive data points $D_j \subset D$ in an optimal solution, is not necessarily an optimal fit in itself for the points $D_j$. We solve the problem for the special case $k = 2$ by showing that an optimal solution essentially consists of two least squares linear regression lines in which the weight $w_j$ of some data point $(x_j, y_j)$ is split into the weights $\lambda w_j$ and $(1 - \lambda) w_j$, $0 \le \lambda \le 1$, for the computation of these lines. This gives an algorithm of worst-case complexity $O(n)$ for finding an optimal solution for the case $k = 2$. (C) 2001 Elsevier Science Ltd. All rights reserved.
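The abstract's weight-splitting idea for $k = 2$ can be illustrated with a short sketch. The Python code below is my own illustration, not the authors' method: it brute-forces over a candidate shared data point $j$ and a coarse grid of $\lambda$ values, fits two weighted least squares lines with point $j$'s weight split as $\lambda w_j$ and $(1 - \lambda) w_j$, and keeps the combination with the smallest total weighted squared error. The function names (weighted_ls_line, best_two_piece_fit) and the grid search are assumptions for illustration only; the paper's $O(n)$ algorithm determines the split point and $\lambda$ directly rather than by exhaustive search.

```python
# Illustrative sketch of the weight-splitting construction (not the paper's O(n) algorithm):
# point j contributes weight lam*w[j] to the left regression line and (1-lam)*w[j] to the right.

def weighted_ls_line(x, y, w):
    """Weighted least squares line y = a + b*x minimizing sum_i w_i (y_i - a - b*x_i)^2."""
    sw = sum(w)
    xbar = sum(wi * xi for wi, xi in zip(w, x)) / sw
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sw
    sxx = sum(wi * (xi - xbar) ** 2 for wi, xi in zip(w, x))
    sxy = sum(wi * (xi - xbar) * (yi - ybar) for wi, xi, yi in zip(w, x, y))
    b = sxy / sxx if sxx > 0 else 0.0   # degenerate case: all effective weight on one x
    a = ybar - b * xbar
    return a, b

def weighted_sse(x, y, w, a, b):
    """Weighted sum of squared residuals against the line y = a + b*x."""
    return sum(wi * (yi - (a + b * xi)) ** 2 for wi, xi, yi in zip(w, x, y))

def best_two_piece_fit(x, y, w, lam_grid=None):
    """Search over the shared point j and the weight split lam in [0, 1] (grid search)."""
    n = len(x)
    if lam_grid is None:
        lam_grid = [i / 20.0 for i in range(21)]
    best = None
    for j in range(1, n - 1):                    # point j is shared by both pieces
        for lam in lam_grid:
            wl = list(w[: j + 1]); wl[-1] = lam * w[j]          # left piece weights
            wr = list(w[j:]);      wr[0] = (1.0 - lam) * w[j]   # right piece weights
            a1, b1 = weighted_ls_line(x[: j + 1], y[: j + 1], wl)
            a2, b2 = weighted_ls_line(x[j:], y[j:], wr)
            err = (weighted_sse(x[: j + 1], y[: j + 1], wl, a1, b1)
                   + weighted_sse(x[j:], y[j:], wr, a2, b2))
            if best is None or err < best[0]:
                best = (err, j, lam, (a1, b1), (a2, b2))
    return best

if __name__ == "__main__":
    # Hypothetical data that roughly rises then falls, with unit weights.
    x = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
    y = [0.1, 1.0, 2.1, 2.9, 2.2, 1.4]
    w = [1.0] * len(x)
    err, j, lam, line1, line2 = best_two_piece_fit(x, y, w)
    print(err, j, lam, line1, line2)
```

Note that the $\lambda$ grid is a discretization, so this search only approximates the optimal split the paper characterizes exactly; it is meant only to make the construction in the abstract concrete.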
Pages: 1033-1041
Page count: 9