Consider the regression model with data $(Y_i, x_i)_{i=1}^{n}$ (real-valued response $Y_i$ and univariate predictor $x_i$) $$ Y_i = f(x_i) + \epsilon_i $$ with $\epsilon_i$ iid, $E(\epsilon_i) = 0$ and $\operatorname{Var}(\epsilon_i) = \sigma^2$.
Suppose the true underlying function $f$ is three-times differentiable and is taken from an appropriate Sobolev space, i.e. $ W_2^{v} = \{f : f \text{ is } (v-1)\text{-times continuously differentiable and } \int_a^b \left[f^{(v)}(x)\right]^2 dx < \infty \}$ with $v=3$.
Consider a smoothing spline solving the penalized least-squares problem with smoothing parameter $\lambda \geq 0$ $$ g_r(\lambda) = \operatorname*{argmin}_{f \in W_2^{r}} \frac{1}{n} \sum_{i=1}^{n} \left[Y_i - f(x_i)\right]^2 + \lambda \int \left[f^{(r)}(x)\right]^2 dx. $$
I am interested in the asymptotic behaviour, as the sample size $n \to \infty$, of a cubic smoothing spline when $v > r = 2$. In particular, at which mean-squared-error rate does the cubic smoothing spline converge, given that it is only twice differentiable, i.e. of Sobolev order $r = 2$, while the truth lies in $W_2^3$?
To make this more precise: from Speckman (1985) I know that the minimax convergence rate for linear smoothers over $W_2^{v}$ is $O(n^{-2v/(2v+1)})$. Does the cubic smoothing spline attain the rate $O(n^{-6/7})$ (even though its order $r = 2$ is lower than $v = 3$), or only the slower rate $O(n^{-4/5})$?
The question is basically: if the order of the smoothing spline is chosen too low, does it still converge at the optimal rate, here $O(n^{-6/7})$, or at a slower one?
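To get a rough empirical feel for the rate, one can simulate data from a smooth $f$, fit cubic smoothing splines at increasing sample sizes, and regress $\log(\text{MSE})$ on $\log(n)$. A minimal sketch, with the caveats that `scipy.interpolate.UnivariateSpline` enforces a smoothing *condition* (residual sum of squares $\le s$) rather than the exact penalized criterion above, and that the choice $s = n\sigma^2$ is a common heuristic I am assuming here, not a theoretically optimal $\lambda_n$ sequence:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)

def mse_at(n, reps=20, sigma=0.3):
    """Monte Carlo estimate of the mean squared error of a cubic
    smoothing spline at sample size n, averaged over a fixed grid."""
    xg = np.linspace(0.0, 1.0, 400)          # evaluation grid
    f = lambda x: np.sin(2 * np.pi * x)      # smooth truth (certainly in W_2^3)
    errs = []
    for _ in range(reps):
        x = np.sort(rng.uniform(0.0, 1.0, n))
        y = f(x) + rng.normal(0.0, sigma, n)
        # k=3 -> cubic spline; s = n * sigma^2 is a heuristic smoothing level
        sp = UnivariateSpline(x, y, k=3, s=n * sigma**2)
        errs.append(np.mean((sp(xg) - f(xg)) ** 2))
    return float(np.mean(errs))

ns = np.array([200, 400, 800, 1600])
mses = np.array([mse_at(n) for n in ns])

# Empirical rate: slope of log(MSE) against log(n).
# A rate of n^{-4/5} would give a slope near -0.8, n^{-6/7} near -0.86.
slope = np.polyfit(np.log(ns), np.log(mses), 1)[0]
print("empirical log-log slope:", slope)
```

Of course such a finite-sample slope cannot distinguish $-4/5$ from $-6/7$ reliably; it only checks that the decay is in the right ballpark. The theoretical question about the exact exponent remains.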