The theory and application of penalized methods or Reproducing Kernel Hilbert Spaces made easy

11/08/2011
by Nancy Heckman, et al.

The popular cubic smoothing spline estimate of a regression function arises as the minimizer of the penalized sum of squares ∑_j (Y_j − μ(t_j))² + λ∫_a^b [μ″(t)]² dt, where the data are (t_j, Y_j), j = 1, …, n. The minimization is taken over an infinite-dimensional function space, the space of all functions with square-integrable second derivatives, but the calculations can be carried out in a finite-dimensional space. This reduction from minimizing over an infinite-dimensional space to minimizing over a finite-dimensional one occurs for more general objective functions: the data may be related to the function μ in another way, the sum of squares may be replaced by a more suitable expression, or the penalty, ∫_a^b [μ″(t)]² dt, might take a different form. This paper reviews the Reproducing Kernel Hilbert Space structure that provides a finite-dimensional solution to a general minimization problem. Particular attention is paid to penalties based on linear differential operators. In this case, one can sometimes easily calculate the minimizer explicitly, using Green's functions.
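As a minimal sketch of the objective described above, SciPy's `make_smoothing_spline` computes the cubic smoothing spline minimizing exactly the penalized sum of squares ∑_j (Y_j − μ(t_j))² + λ∫ [μ″(t)]² dt. The data and the choice of λ below are illustrative, not taken from the paper:

```python
import numpy as np
from scipy.interpolate import make_smoothing_spline

# Illustrative data (t_j, Y_j): a noisy sine curve on [0, 1].
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal(t.size)

# The returned spline minimizes
#   sum_j (Y_j - mu(t_j))^2 + lam * integral of mu''(t)^2 dt
# over all functions with square-integrable second derivatives,
# yet it is computed in a finite-dimensional B-spline basis.
lam = 1e-4  # illustrative smoothing parameter; in practice chosen by GCV/CV
spline = make_smoothing_spline(t, y, lam=lam)
mu_hat = spline(t)  # fitted values at the design points

# As lam grows, the penalty dominates and the fit flattens toward the
# least-squares straight line (whose second derivative is zero).
rough_fit = make_smoothing_spline(t, y, lam=1e6)(t)
```

The key point the abstract makes is visible here: although the minimization is posed over an infinite-dimensional function space, the solver only ever works with a finite number of spline coefficients.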
