Core Concepts
This article presents a generalized formulation of reweighted least squares approximation in which the solution can be expressed as a convex combination of certain interpolants. The authors also provide a general strategy for iteratively updating the weights according to the approximation error, and they apply it to the spline fitting problem, allowing sharp features to be preserved in the final model.
Abstract
The article has two main parts:
Theoretical part:
The authors present a generalized formulation for reweighted least squares approximations, proving that, whenever the solution is sought in a finite-dimensional vector space, it can be expressed as a convex combination of certain interpolants.
They show that this formulation encompasses various function spaces, such as polynomial spaces and spline spaces.
The authors derive consequences of this interpolatory formulation, including pointwise error bounds and an analysis of how the weights influence the approximation.
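A small numerical sketch of this kind of interpolatory result, for the special case of weighted least squares over a polynomial space (the specific construction below, via the Cauchy-Binet formula, is a classical identity and not necessarily the authors' exact derivation): the weighted fit equals a convex combination of the interpolants through all subsets of the data of size equal to the space dimension, with nonnegative mixing coefficients built from the weights and squared subset determinants.

```python
import itertools
import numpy as np

# Illustrative setup: fit a quadratic (n = 3 basis functions) to m = 5
# weighted data points, then verify that the weighted least squares
# solution is a convex combination of the interpolants through all
# 3-point subsets of the data (a Cauchy-Binet / Cramer argument).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 5)
y = rng.normal(size=5)
w = np.array([1.0, 2.0, 0.5, 1.5, 1.0])          # positive data weights

A = np.vander(x, 3, increasing=True)              # m x n design matrix
W = np.diag(w)
c_ls = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)  # weighted LS coefficients

combo = np.zeros(3)
total = 0.0
for S in itertools.combinations(range(5), 3):
    AS = A[list(S), :]
    # Nonnegative mixing coefficient: product of weights on the subset
    # times the squared determinant of the subset design matrix.
    lam = np.prod(w[list(S)]) * np.linalg.det(AS) ** 2
    c_S = np.linalg.solve(AS, y[list(S)])         # interpolant through S
    combo += lam * c_S
    total += lam
combo /= total                                    # convex combination

assert np.allclose(combo, c_ls)
```

The mixing coefficients sum to det(A^T W A), so after normalization they are a genuine convex combination; this also makes visible how increasing one weight pulls the fit toward the interpolants passing through that data point.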
Practical part:
The authors focus on spline models and introduce the concept of markers, which flag important features (type I markers) and noisy data or outliers (type II markers) in the input data.
They propose a reweighted least squares algorithm that iteratively updates the weights based on the approximation error, preserving the type I markers and downweighting the type II markers.
The algorithm is extended to an adaptive spline fitting scheme, where the spline space is iteratively refined in regions with high approximation error.
Numerical experiments are provided to demonstrate the performance of the proposed fitting schemes for curve and surface approximation, including adaptive spline constructions.