Bound on variance of random process when signal is known


I am reading this paper (link to a Nature paper, may not be accessible) and I came across the following passage. I have very little experience in probability theory and could not find much help in standard textbooks.

> it follows from the law of total variance in probability theory, that for a random process $x_t$ and a signal $u^t_0$, the variance of $x_t$ is lower bounded by the estimation error $E[(x_t − \hat{x}^t)^2]$, where the conditional expectation $\hat{x} = E\{x_t \mid u^t_0\}$ is also the minimum mean squared error estimator of $x_t$ given $u_t^0$.
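For reference, the law of total variance the paper invokes states (writing $u_0^t$ for the conditioning signal, since the quote's sub/superscripts are inconsistent):
$$\mathrm{var}(x_t)=E\big[\mathrm{var}(x_t \mid u_0^t)\big]+\mathrm{var}\big(E[x_t \mid u_0^t]\big)\geqslant E\big[\mathrm{var}(x_t \mid u_0^t)\big]=E\big[(x_t-\hat{x}^t)^2\big],$$
where the inequality holds because both terms are nonnegative, and the last equality is just the definition of the conditional variance with $\hat{x}^t = E[x_t \mid u_0^t]$.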

Even a link to a reference that establishes this result would be very helpful.


Best Answer

Let us write $x$, $\hat x$ and $u$ for your $x_t$, $\hat x^t$ (or is it $\hat x$?) and $u_0^t$ (or is it $u_t^0$?), respectively.

By definition of conditional expectation, $x-\hat x$ is orthogonal to the space $L^2(u)$ of square-integrable random variables measurable with respect to $u$. Since $\hat x$ is in $L^2(u)$, and constants are in $L^2(u)$ as well, the difference $\hat x-E(x)$ is also in $L^2(u)$, hence $x-\hat x$ and $\hat x-E(x)$ are orthogonal. By Pythagoras's theorem, $$\mathrm{var}(x)=E((x-E(x))^2)=E((x-\hat x)^2)+E((\hat x-E(x))^2)\geqslant E((x-\hat x)^2).$$

To sum up, $E((x-\hat x)^2)$ realizes the minimum of $E((x-z)^2)$ over every $z$ in $L^2(u)$, while $\mathrm{var}(x)$ realizes the minimum of $E((x-z)^2)$ over every constant $z$; since every constant belongs to $L^2(u)$, the former minimum is at most the latter.
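As a quick numerical sanity check (my own addition, not from the answer), the decomposition can be verified by Monte Carlo on a toy Gaussian model in which the conditional expectation is known in closed form: $x = u + w$ with independent $u \sim N(0,1)$ and $w \sim N(0, 0.25)$, so that $\hat x = E[x \mid u] = u$.

```python
import numpy as np

# Toy model (an assumption for illustration): x = u + w with
# u ~ N(0, 1) and independent noise w ~ N(0, 0.25), so E[x | u] = u.
rng = np.random.default_rng(0)
n = 1_000_000
u = rng.normal(0.0, 1.0, n)
w = rng.normal(0.0, 0.5, n)
x = u + w
x_hat = u  # conditional expectation E[x | u] in this model

var_x = x.var()                     # ~ 1.25 in this model
mmse = np.mean((x - x_hat) ** 2)    # E[(x - x_hat)^2], ~ 0.25 here
var_hat = x_hat.var()               # var(x_hat), ~ 1.0 here

# Decomposition var(x) = E[(x - x_hat)^2] + var(x_hat), up to
# Monte Carlo error, and the lower bound var(x) >= E[(x - x_hat)^2].
print(var_x, mmse + var_hat)
print(var_x >= mmse)
```

With a million samples the two sides of the decomposition agree to a few decimal places, and the variance visibly dominates the mean squared estimation error.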