Density estimation with orthogonal series


I found this problem in the article http://www.yaroslavvb.com/papers/watson-density.pdf. In this article the probability density function is written as \begin{equation} f(x) = \sum_{m=0}^{\infty}\alpha_{m}\phi_{m}(x), \end{equation} and the estimator is \begin{equation} f^{\ast}_{n}(x) = \sum_{m=0}^{\infty} \lambda_{m}a_{m}\phi_{m}(x), \end{equation} where \begin{equation} a_{m} = \frac{1}{n}\sum_{k=1}^{n}\phi_{m}(x_k). \end{equation} Here $\{\phi_{m}(x)\}$ is an orthonormal basis. According to the article, \begin{equation} E \int (f(x)-f^{\ast}_{n}(x))^2\,dx= \sum_{m=0}^{\infty}E(\alpha_{m}-\lambda_{m}a_{m})^2=\sum_{m=0}^{\infty}\left\{\alpha_{m}^2(1-\lambda_{m})^2+\frac{1}{n}\lambda^2_{m}\text{var}(\phi_{m}(x))\right\}. \end{equation} I have been trying to prove this. My approach is \begin{equation*} \begin{split} E\int (f(x)-f^{\ast}_{n}(x))^2\,dx & = \int E(f(x)-f^{\ast}_{n}(x))^2\,dx\quad \text{[by Fubini]}\\ & = \int E\left[\sum_{m=0}^{\infty}(\alpha_{m}-\lambda_{m}a_{m})\phi_{m}(x)\right]^2dx\\ & = \int E\left(\sum_{m=0}^{\infty}(\alpha_{m}-\lambda_{m}a_{m})^2\phi_{m}(x)^2+2 \sum_{0\le m<j}(\alpha_{m}-\lambda_{m}a_{m})(\alpha_{j}-\lambda_{j}a_{j})\phi_{m}(x)\phi_{j}(x)\right)dx\\ & = E\left(\sum_{m=0}^{\infty}(\alpha_{m}-\lambda_{m}a_{m})^2\int\phi_{m}(x)^2\,dx+2 \sum_{0\le m<j}(\alpha_{m}-\lambda_{m}a_{m})(\alpha_{j}-\lambda_{j}a_{j})\int\phi_{m}(x)\phi_{j}(x)\,dx\right)\\ & = \sum_{m=0}^{\infty}E(\alpha_{m}-\lambda_{m}a_{m})^2, \end{split} \end{equation*} where the last step uses orthonormality: $\int\phi_{m}(x)^2\,dx=1$ and $\int\phi_{m}(x)\phi_{j}(x)\,dx=0$ for $m\neq j$. Note that $\alpha_{m}=\int f(x)\phi_{m}(x)\,dx=E\phi_{m}(x)$. Now, \begin{equation} \begin{split} E(\alpha_m-\lambda_m a_m)^2 & = E(\alpha_m-\alpha_m \lambda_m+\alpha_m \lambda_m-\lambda_m a_m)^2\\ & = E[\alpha_{m}(1-\lambda_m)+\lambda_m(\alpha_m-a_m)]^2\\ & = E[\alpha_{m}^2(1-\lambda_m)^2+2\alpha_{m}(1-\lambda_m)\lambda_m(\alpha_m-a_m)+\lambda_m^2(\alpha_m-a_m)^2]\\ & = \alpha_{m}^2(1-\lambda_m)^2+ 2\alpha_{m}(1-\lambda_m)\lambda_m E(\alpha_m-a_m)+\lambda_m^2E(\alpha_m-a_m)^2. 
\end{split} \end{equation} Since \begin{equation} \begin{split} E(\alpha_{m}-a_{m}) & = \alpha_{m}-Ea_{m}\\ & = \alpha_{m}-\frac{1}{n}\sum_{k=1}^{n}E\phi_{m}(x_k)\\ & = \alpha_{m}-\frac{1}{n}\sum_{k=1}^{n}\alpha_{m}\\ & = \alpha_{m}-\alpha_{m}=0, \end{split} \end{equation} the cross term vanishes, and therefore \begin{equation} \sum_{m=0}^{\infty} E(\alpha_{m}-\lambda_{m}a_{m})^2=\sum_{m=0}^{\infty}\left\{\alpha_{m}^2(1-\lambda_{m})^2+\lambda_{m}^2E(\alpha_{m}-a_{m})^2\right\}. \end{equation}
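As a quick numerical sanity check of the unbiasedness $Ea_m=\alpha_m$ used above, here is a Monte Carlo sketch. The density $f(x)=2x$ on $[0,1]$ and the cosine basis $\phi_0(x)=1$, $\phi_m(x)=\sqrt{2}\cos(m\pi x)$ are my own illustrative choices, not from the article:

```python
import numpy as np

# Illustrative setup (not from the article): density f(x) = 2x on [0, 1],
# cosine basis phi_m(x) = sqrt(2) * cos(m * pi * x) for m >= 1.
# Exact coefficient: alpha_m = int_0^1 2x * sqrt(2) cos(m pi x) dx
#                            = 2*sqrt(2) * ((-1)**m - 1) / (m * pi)**2.
rng = np.random.default_rng(0)
m = 1
alpha_m = 2 * np.sqrt(2) * ((-1) ** m - 1) / (m * np.pi) ** 2

n, reps = 1000, 500
# Inverse-CDF sampling: X = sqrt(U) has density 2x on [0, 1].
x = np.sqrt(rng.uniform(size=(reps, n)))
# One realization of a_m = (1/n) sum_k phi_m(x_k) per replication.
a_m = np.sqrt(2) * np.cos(m * np.pi * x).mean(axis=1)

print(alpha_m, a_m.mean())  # the two should agree closely, since Ea_m = alpha_m
```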

I was wondering: is there any way to prove \begin{equation} E(\alpha_{m}-a_{m})^2=\frac{1}{n}\text{var}(\phi_{m}(x))? \end{equation}
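I suspect the following works, assuming the $x_k$ are i.i.d. with density $f$ (a sketch, using the unbiasedness $Ea_m=\alpha_m$ shown above):
\begin{equation*}
\begin{split}
E(\alpha_{m}-a_{m})^2 & = \text{var}(a_{m}) \quad [\text{since } Ea_{m}=\alpha_{m}]\\
& = \text{var}\left(\frac{1}{n}\sum_{k=1}^{n}\phi_{m}(x_k)\right)\\
& = \frac{1}{n^2}\sum_{k=1}^{n}\text{var}(\phi_{m}(x_k)) \quad [\text{by independence}]\\
& = \frac{1}{n}\text{var}(\phi_{m}(x)),
\end{split}
\end{equation*}
the last step because the $x_k$ are identically distributed.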