Lim sup of the mean squared error ratio of two Gaussian processes


I have been struggling with how to prove this theorem, which I came across in statistics, specifically in spatial statistics. I have seen a proof that I understand, but I would like to try a different approach (I will post a link below). Let me describe the setup. Assume that $Z(x)$ has mean zero and is observed; then the best linear unbiased predictor (BLUP) at an (unobserved) location $x_*$ is $\hat{Z}(x_{*}) = c^{*T}C^{-1}Z(x)$, where $Z(x) = (Z(x_1),\dots,Z(x_n))'$, $C_{ij} = K(x_i, x_j)$, and $c^{*}_i = K(x_i, x_{*})$.
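To make the BLUP formula concrete, here is a minimal numerical sketch; the squared-exponential kernel `K` and the sample points are my own illustrative choices, not part of the question:

```python
import numpy as np

# Sketch of the BLUP from the question: Z_hat(x_*) = c*' C^{-1} Z(x).
# The squared-exponential kernel below is a hypothetical choice for
# illustration only; any positive-definite kernel K would do.
def K(x, y, length_scale=1.0):
    """Hypothetical stationary covariance kernel K(x, y)."""
    return np.exp(-0.5 * (x - y) ** 2 / length_scale ** 2)

def blup(x_obs, z_obs, x_star):
    """Best linear unbiased predictor at x_star from observations z_obs."""
    C = K(x_obs[:, None], x_obs[None, :])     # C_ij = K(x_i, x_j)
    c_star = K(x_obs, x_star)                 # c*_i = K(x_i, x_*)
    return c_star @ np.linalg.solve(C, z_obs) # c*' C^{-1} Z(x)
```

With no nugget, this predictor interpolates: at an observed location it reproduces the observation exactly, since $c^{*}$ is then a column of $C$.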

Suppose $G(0, V_{1} = C_{1} + \sigma^{2}I)$ and $G(0, V_{0} = C_{0} + \sigma^{2}I)$ are two Gaussian measures for a random field on a domain $D$. We need to show that, for the best linear unbiased predictor $\hat{Z}(x_{*}) = c^{*T}C^{-1}Z(x)$,

as $n \to \infty$,

$$\limsup_{n \to \infty} \frac{\mathrm{MSE}_1(\hat{Z}(x_{*}))}{\mathrm{MSE}_0(\hat{Z}(x_{*}))} = 1.$$

Consider the following hints to help prove this:

  1. $C_1$ can be written as $C_1 = EDE'$, its SVD (equivalently its eigendecomposition, since $C_1$ is symmetric), where $E$ is orthogonal: $EE' = I$.

  2. $C_0$ can be written as $C_0 = ULU'$, its SVD, where $U$ is orthogonal: $UU' = I$.
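Hints 1 and 2 can be checked numerically: for a symmetric positive semi-definite covariance, the SVD coincides with the eigendecomposition. The $4 \times 4$ example matrix below is arbitrary, not from the question:

```python
import numpy as np

# Sketch of hint 1: a symmetric PSD covariance C1 factors as C1 = E D E'
# with E orthogonal. The example matrix is arbitrary.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
C1 = A @ A.T                          # symmetric PSD covariance

d, E = np.linalg.eigh(C1)             # C1 = E diag(d) E'
D = np.diag(d)

assert np.allclose(E @ D @ E.T, C1)   # reconstruction C1 = EDE'
assert np.allclose(E @ E.T, np.eye(4))  # E is orthogonal: EE' = I
```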

I want to write $\mathrm{MSE}_1(\hat{Z}(x_{*})) = E\bigl(Z(x_{*}) - \hat{Z}(x_{*})\bigr)^2$ and proceed using $\hat{Z}(x_{*})$ together with hints 1 and 2. Thank you.
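One way to start that expansion: treat the predictor weights as a fixed vector $\lambda = C^{-1}c^{*}$, so $\hat{Z}(x_{*}) = \lambda' Z(x)$, and expand by bilinearity of covariance under each measure. Here $K_j$ and $c^{*}_j$ denote the kernel and cross-covariance vector under measure $j \in \{0, 1\}$; this subscript notation is my own, not from the question:

```latex
\mathrm{MSE}_j(\hat{Z}(x_{*}))
  = E_j\bigl(Z(x_{*}) - \lambda' Z(x)\bigr)^2
  = K_j(x_{*}, x_{*}) - 2\,\lambda' c^{*}_j + \lambda' V_j \lambda .
```

Substituting the hint decompositions $C_1 = EDE'$ and $C_0 = ULU'$ into $V_j = C_j + \sigma^2 I$ then expresses numerator and denominator of the MSE ratio in terms of the spectra $D$ and $L$.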

The proof I have seen: https://arxiv.org/pdf/1108.1851.pdf