Finding the $MSE$ of the optimal estimator


Background of the question:
We know that $X$ is a continuous random variable with $P\left(-1\le X\le 1\right)=1$
and a bounded density $f_X\left(x\right)<\infty $.
We define $X(n)=\cos(\pi n X)$
and write $E[X]=\mu$, $Var[X]=\sigma ^2$.
We found in previous sections:
$E\left[X\left(n\right)\right]=\int _{-1}^1f_X\left(x\right)\cos\left(\pi nx\right)dx$
$X(n)$ is not wide-sense stationary (W.S.S.)
We had another random variable $Z\sim N(0,1)$, statistically independent of $X$, and defined $Y=\begin{cases}a\left|X\right|&Z\ge 0\\ b\left|X\right|&Z<0\end{cases}$
We found:
$\hat{Y}_{opt}\left(X\right)=\left|X\right|\left(\frac{a+b}{2}\right)$
$MSE\left\{\hat{Y}_{opt}\left(X\right)\right\}=E\left[\left(\hat{Y}_{opt}\left(X\right)-Y\right)^2\right]=\left(\sigma ^2+\mu ^2\right)\left(\frac{a-b}{2}\right)^2$
Showing how I reached the MSE:
$E\left[\left(\hat{Y}_{opt}\left(X\right)-Y\right)^2\right]=E\left[\left(\left|X\right|\left(\frac{a+b}{2}\right)-Y\right)^2\right]=\left(\frac{a+b}{2}\right)^2E\left[X^2\right]-\left(a+b\right)E\left[Y\left|X\right|\right]+E\left[Y^2\right]$
Since $Z$ is independent of $X$ and $P\left(Z\ge 0\right)=P\left(Z<0\right)=\frac{1}{2}$, we get $E\left[Y\left|X\right|\right]=\frac{a+b}{2}E\left[X^2\right]$ and $E\left[Y^2\right]=\frac{a^2+b^2}{2}E\left[X^2\right]$, so
$E\left[\left(\hat{Y}_{opt}\left(X\right)-Y\right)^2\right]=E\left[X^2\right]\left(\frac{a^2+b^2}{2}-\frac{\left(a+b\right)^2}{4}\right)=\left(\sigma ^2+\mu ^2\right)\left(\frac{a-b}{2}\right)^2$
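As a sanity check (not part of the proof), here is a quick Monte Carlo sketch in Python, assuming for illustration that $X$ is uniform on $[-1,1]$ (so $\mu=0$, $\sigma^2=1/3$) and picking arbitrary values $a=2$, $b=0.5$:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000
a, b = 2.0, 0.5

# Illustrative assumption: X uniform on [-1, 1], so mu = 0, sigma^2 = 1/3.
X = rng.uniform(-1.0, 1.0, N)
Z = rng.standard_normal(N)            # Z ~ N(0,1), independent of X
Y = np.where(Z >= 0, a * np.abs(X), b * np.abs(X))

Y_hat = np.abs(X) * (a + b) / 2       # optimal estimator |X|(a+b)/2
mse_empirical = np.mean((Y_hat - Y) ** 2)

mu, sigma2 = 0.0, 1.0 / 3.0
mse_theory = (sigma2 + mu**2) * ((a - b) / 2) ** 2

print(mse_empirical, mse_theory)      # the two values should be close
```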
Regarding how I proved the optimal estimator, I did it the same way as in the question where I got help here: Expected value of $E\left[Y|X\right]$ with 3 random variables $X$ and $Y$ and $Z$ and the $MSE$ of it.
Just without the dependency; it is basically the same, using the law of total expectation.
My assumptions here were that $\left|X\right|^2=X^2$ and also $X\cdot\left|X\right|=X^2$.
Thanks to Fred Li I realized the second assumption is wrong, so I will now fix that part of the expectation, since $X\cdot\left|X\right|\ne X^2$ in general.

What I need to prove:
We need to prove that the best estimator of $Y$ from the samples $\left\{X\left(n\right)\right\}_{n=0,1,\dots}$ is linear, i.e., of the form
$\sum _{n=0}^{\infty }a_nX\left(n\right)$ for some coefficients $\left\{a_n\right\}_{n=0,1,\dots}$
We need to find the $a_n$ as functions of $f_X(x)$, and find the MSE of $\hat{Y}_{opt}\left(X\right)$ in terms of $a$, $b$, $\mu$, $\sigma$.

My try:
since $\hat{Y}_{opt}\left(X\right)=\left|X\right|\left(\frac{a+b}{2}\right)$
We can define $g(X)$ such that:
$g\left(X\right):=\left|X\right|\left(\frac{a+b}{2}\right)$
We see that $g(X)$ is even and continuous on $[-1,1]$, so it has a Fourier series expansion with cosine terms only.
For that reason, the Fourier series of $g(X)$, $\frac{a_0}{2}+\sum _{n=1}^{\infty }a_n\cos\left(\pi nX\right)$, converges to $g(X)$ in $L^2$.
We see that this estimator is a function of $X$.
Thus, all we have to do is show that it achieves the same $MSE$ as the optimal estimator.
Which means basically:
$$E\left[\left(\hat{Y\:}_{_{lin}}\left(X\right)-\hat{Y\:}_{opt}\left(X\right)\right)^2\right]=0$$
Problem: This is where I am stuck; I do not really know how to prove it, although it should hold.
Finding the Fourier coefficients $a_n$ is not hard, it is just like computing ordinary Fourier coefficients, but my problem is with the $MSE$.
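To illustrate the coefficient computation, here is a small numerical sketch, taking $g(x)=\left|x\right|\frac{a+b}{2}$ with the arbitrary choice $a=2$, $b=0.5$, and approximating $a_n=\int_{-1}^1 g(x)\cos(\pi n x)\,dx$ by the trapezoidal rule; the squared $L^2$ error of the partial sums shrinks as $N$ grows:

```python
import numpy as np

a, b = 2.0, 0.5
x = np.linspace(-1.0, 1.0, 20001)
dx = x[1] - x[0]
g = np.abs(x) * (a + b) / 2           # g(x) = |x|(a+b)/2, even on [-1, 1]

def integrate(y):
    # simple trapezoidal rule on the fixed grid
    return float(np.sum((y[:-1] + y[1:]) / 2) * dx)

def coeff(n):
    # a_n = integral_{-1}^{1} g(x) cos(pi n x) dx
    return integrate(g * np.cos(np.pi * n * x))

def partial_sum(N):
    # a_0/2 + sum_{n=1}^{N} a_n cos(pi n x)
    s = coeff(0) / 2 * np.ones_like(x)
    for n in range(1, N + 1):
        s += coeff(n) * np.cos(np.pi * n * x)
    return s

for N in (1, 5, 50):
    l2_err = integrate((g - partial_sum(N)) ** 2)   # squared L2 error on [-1,1]
    print(N, l2_err)
```

For odd $n$ the coefficients match the known cosine series of $\left|x\right|$, scaled by $\frac{a+b}{2}$, i.e. $a_n=-\frac{a+b}{2}\cdot\frac{4}{n^2\pi^2}$.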

Any ideas will be appreciated!!

On BEST ANSWER

Correct me if I am wrong, but the following claim seems to hold true. For any square-integrable even function $g(x)>0$ defined on $[-1,1]$, its Fourier series converges in $L^2$ by the Riesz-Fischer theorem:
$$\lim_{N\to\infty} \int_{-1}^1\Big(g(x) - \sum_{n=0}^Na_n \cos(n\pi x) \Big)^2dx= 0. $$
Then, let $X$ denote a random variable with a continuous density function $f_X(x)<K$ defined on $[-1,1]$. The boundedness of $f_X(x)$ directly leads to the following result, which hopefully should solve your problem:
$$ E\left[\Big(g(X) - \sum_{n=0}^\infty a_n \cos(n\pi X) \Big)^2\right] =\lim_{N\to\infty} E\left[\Big(g(X) - \sum_{n=0}^Na_n \cos(n\pi X) \Big)^2\right]$$
$$=\lim_{N\to\infty} \int_{-1}^1\Big(g(x) - \sum_{n=0}^N a_n \cos(n\pi x) \Big)^2f_X(x)\,dx\leq \lim_{N\to\infty} K\int_{-1}^1\Big(g(x) - \sum_{n=0}^N a_n \cos(n\pi x) \Big)^2dx=0. $$
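One can also see this limit numerically. The sketch below assumes an example bounded density $f_X(x)=\frac{1+x}{2}$ on $[-1,1]$ (so $K=1$), samples $X$ by inverse-CDF, and uses the closed-form cosine coefficients of $\left|x\right|$ to show the empirical mean-square error between $g(X)$ and the partial sums shrinking:

```python
import numpy as np

rng = np.random.default_rng(1)
a, b = 2.0, 0.5
M = 500_000

# Example bounded density on [-1, 1]: f_X(x) = (1 + x)/2, so K = 1.
# Inverse-CDF sampling: F(x) = (1 + x)^2 / 4  =>  X = 2*sqrt(U) - 1.
U = rng.uniform(0.0, 1.0, M)
X = 2.0 * np.sqrt(U) - 1.0
g = np.abs(X) * (a + b) / 2

def S(N, x):
    # partial cosine series of g, via the closed-form coefficients of |x|:
    # |x| = 1/2 + sum_n 2((-1)^n - 1)/(n pi)^2 * cos(n pi x)
    s = (a + b) / 4 * np.ones_like(x)           # the a_0/2 term
    for n in range(1, N + 1):
        c = (a + b) / 2 * 2 * ((-1) ** n - 1) / (n * np.pi) ** 2
        s += c * np.cos(np.pi * n * x)
    return s

for N in (1, 5, 50):
    print(N, np.mean((g - S(N, X)) ** 2))       # shrinks toward 0 as N grows
```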