# Smoothing Spline

Let $$\Phi$$ be the basis matrix with entries $$\Phi(i,j) = f_j(x_i)$$. The objective function can then be written as

$$
(y-\Phi\beta)^T(y-\Phi\beta)+\lambda\beta^T\Omega_\Phi\beta,
$$

where the penalty matrix $$\Omega_\Phi$$ has entries

$$
\Omega_\Phi(i,j)=\int f_i^{((k+1)/2)}(t)\, f_j^{((k+1)/2)}(t)\, dt,
$$

with $$f^{((k+1)/2)}$$ denoting the derivative of order $$(k+1)/2$$ of a degree-$$k$$ spline (the second derivative in the cubic case, $$k=3$$). By simple calculus, the coefficients $$\hat{\beta}$$ that minimize this objective are

$$
\hat{\beta} = (\Phi^T\Phi+\lambda\Omega_\Phi)^{-1}\Phi^T y.
$$

The predicted values are therefore a linear function of the observed values:

$$
\hat{y} = \Phi(\Phi^T\Phi+\lambda\Omega_\Phi)^{-1}\Phi^T y = S_\lambda y.
$$

The degrees of freedom of a smoothing spline are defined as the trace of the smoother matrix:

$$
\mathrm{trace}(S_\lambda) = S_\lambda(1,1) + S_\lambda(2,2) + \cdots + S_\lambda(n,n).
$$
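As a small numeric sketch of these formulas (not the text's R setup): the example below uses a monomial basis and an identity matrix as a stand-in for $$\Omega_\Phi$$, purely for illustration, and computes $$\hat{\beta}$$, the smoother matrix $$S_\lambda$$, and its trace.

```python
import numpy as np

# Illustrative data (assumed, not from the text).
np.random.seed(0)
n = 30
x = np.linspace(0, 1, n)
y = np.sin(2 * np.pi * x) + 0.1 * np.random.randn(n)

# Basis matrix Phi(i, j) = f_j(x_i); here f_j are simple monomials.
Phi = np.vander(x, 6, increasing=True)
# Stand-in penalty matrix Omega (identity for illustration; a real
# smoothing spline uses the integrated-derivative penalty above).
Omega = np.eye(6)
lam = 1.0

# beta_hat = (Phi^T Phi + lambda * Omega)^{-1} Phi^T y
A = np.linalg.solve(Phi.T @ Phi + lam * Omega, Phi.T)
beta_hat = A @ y

# Smoother matrix S_lambda, so that y_hat = S @ y, and its trace
# (the effective degrees of freedom).
S = Phi @ A
df = np.trace(S)
```

With a nonzero penalty the effective degrees of freedom fall strictly below the number of basis functions (here 6), which is how $$\lambda$$ controls model complexity.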

### Choosing the regularization parameter $$\lambda$$ by Cross-Validation

Because the smoothing spline is a linear smoother, leave-one-out cross-validation (LOOCV) can be computed from a single fit:

$$
RSS_{LOOCV}(\lambda) = \frac{1}{n}\sum_{i=1}^n\Big(y_i-\hat{f}^{(-i)}(x_i)\Big)^2 = \frac{1}{n}\sum_{i=1}^n\Big(\frac{y_i-\hat{f}(x_i)}{1-S_\lambda(i,i)}\Big)^2,
$$

where $$\hat{f}^{(-i)}$$ denotes the fit computed with the $$i$$-th observation left out.
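The identity above can be checked numerically. The sketch below (assumed setup, using a ridge-type penalized polynomial fit as a stand-in linear smoother) computes LOOCV both by the one-fit shortcut and by brute-force refitting; for a penalized least-squares smoother the two agree exactly.

```python
import numpy as np

# Illustrative data and penalized basis fit (assumed, not from the text).
np.random.seed(1)
n = 20
x = np.linspace(0, 1, n)
y = np.sin(2 * np.pi * x) + 0.1 * np.random.randn(n)
Phi = np.vander(x, 5, increasing=True)
Omega = np.eye(5)   # stand-in penalty matrix
lam = 0.1

def fit(Phi_tr, y_tr):
    """Penalized least-squares coefficients on training data."""
    return np.linalg.solve(Phi_tr.T @ Phi_tr + lam * Omega, Phi_tr.T @ y_tr)

# Full fit: smoother matrix S_lambda and ordinary residuals.
S = Phi @ np.linalg.solve(Phi.T @ Phi + lam * Omega, Phi.T)
resid = y - S @ y

# Shortcut: no refitting, just rescale each residual by 1 - S_ii.
loocv_shortcut = np.mean((resid / (1 - np.diag(S))) ** 2)

# Brute force: refit n times, leaving out one observation each time.
errs = []
for i in range(n):
    mask = np.arange(n) != i
    beta = fit(Phi[mask], y[mask])
    errs.append((y[i] - Phi[i] @ beta) ** 2)
loocv_brute = np.mean(errs)
```

The shortcut turns an $$O(n)$$-refits procedure into a single fit plus a rescaling of residuals, which is why LOOCV is cheap for linear smoothers.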

```r
library(MASS)
help(mcycle)
x <- mcycle$times
y <- mcycle$accel
help(smooth.spline)
n <- length(unique(x))    # number of distinct design points
gcv <- numeric(100)
df <- seq(2, n, l = 100)  # candidate degrees of freedom
for (i in 1:100) {
  fit <- smooth.spline(x, y, df = df[i])
  gcv[i] <- fit$cv.crit   # GCV criterion (the default when cv = FALSE)
}
```