How to minimize the expectation?


Given random variables $X_0, X_1, \ldots, X_n$ with means $m_0, m_1, \ldots, m_n$ and finite second moments, I want to prove that the numbers $a_i = \frac{\det \Lambda_{i0}}{\det \Lambda_{00}}$ minimise the expectation

$$\mathsf E [(X_0-m_0)+a_1(X_1-m_1)+\ldots+a_n(X_n-m_n)]^2, $$

where $\det\Lambda_{ij}$ denotes the $(i,j)$ cofactor of the covariance matrix $\Lambda=[\lambda_{ij}]_{i,j=0,\ldots,n}$, i.e. $(-1)^{i+j}$ times the determinant of the matrix obtained from $\Lambda$ by deleting row $i$ and column $j$.

I tried to play with Lagrange multipliers but the computations are terrible. This is usually presented as a fact without proof.
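Before proving it, the claim can be sanity-checked numerically: the following Python sketch (the random covariance matrix `Lam` and the helper `cofactor` are my own illustrative constructions, not from the question) computes the $a_i$ from cofactor ratios and compares them with the solution of the first-order conditions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a random (n+1) x (n+1) positive-definite
# covariance matrix Lam for (X_0, ..., X_n).
n = 3
A = rng.standard_normal((n + 1, n + 1))
Lam = A @ A.T + (n + 1) * np.eye(n + 1)  # shift keeps it well-conditioned

def cofactor(M, i, j):
    """(i, j) cofactor: signed determinant of M with row i and column j deleted."""
    minor = np.delete(np.delete(M, i, axis=0), j, axis=1)
    return (-1) ** (i + j) * np.linalg.det(minor)

# Claimed minimiser: a_i = det(Lambda_{i0}) / det(Lambda_{00}), i = 1..n
a = np.array([cofactor(Lam, i, 0) / cofactor(Lam, 0, 0) for i in range(1, n + 1)])

# Independent check: the first-order conditions read
#   sum_{i=1}^n lambda_{ij} a_i = -lambda_{0j},   j = 1..n
a_direct = np.linalg.solve(Lam[1:, 1:], -Lam[1:, 0])
assert np.allclose(a, a_direct)

# The quadratic f(a) = lambda_00 + 2 <lambda_0., a> + a^T Lam_sub a
# should not decrease under a random perturbation of a.
def f(v):
    return Lam[0, 0] + 2 * Lam[0, 1:] @ v + v @ Lam[1:, 1:] @ v

v = a + 0.1 * rng.standard_normal(n)
assert f(a) <= f(v)
```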

Accepted answer:

Define $$f(a_1,\dots,a_n):=\mathsf E \left[\left(X_0-m_0\right)+a_1\left(X_1-m_1\right)+\ldots+a_n\left(X_n-m_n\right)\right]^2;$$ expanding the square and using $\lambda_{i,j}=\operatorname{Cov}(X_i,X_j)$, this can be rewritten as $$f(a_1,\dots,a_n)=\lambda_{0,0}+2\sum_{j=1}^n\lambda_{0,j}a_j+\sum_{i,j=1}^n\lambda_{i,j}a_ia_j. $$
We have to find the critical points of $f$. Notice that $$\frac{\partial f}{\partial a_n}=2\lambda_{0,n} +2a_n\lambda_{n,n} +2\sum_{i=1}^{n-1}\lambda_{i,n}a_i,$$ and by symmetry, $$\frac{\partial f}{\partial a_j} =2\lambda_{0,j}+2 \sum_{i=1}^n\lambda_{i,j}a_i.$$ Hence the critical points $(a_1,\dots,a_n)$ have to satisfy $$\sum_{i=1}^n\lambda_{i,j}a_i=-\lambda_{0,j},\qquad j=1,\dots,n,$$ i.e. the last $n$ coordinates of the vector $\Lambda(1,a_1,\dots,a_n)^T$ vanish (the coordinate $0$ need not vanish; it equals the minimal value of $f$). When the covariance matrix $\Lambda$ is invertible, solving this linear system by Cramer's rule gives the wanted expression for the $a_i$.
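For instance, in the simplest case $n=1$ the system reduces to a single equation, which makes the sign convention explicit:
$$\lambda_{0,1}+a_1\lambda_{1,1}=0 \quad\Longrightarrow\quad a_1=-\frac{\lambda_{0,1}}{\lambda_{1,1}}=\frac{(-1)^{1+0}\lambda_{0,1}}{\lambda_{1,1}}=\frac{\det\Lambda_{10}}{\det\Lambda_{00}},$$
where $\det\Lambda_{10}$ and $\det\Lambda_{00}$ are the cofactors of the $2\times 2$ matrix $\Lambda$.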

In order to check that $(a_1,\dots,a_n)$ is indeed a minimum, one can check that the Hessian matrix of $f$ is positive definite: it is constant and equal to $2[\lambda_{i,j}]_{i,j=1}^n$, twice the covariance matrix of $(X_1,\dots,X_n)$, which is positive definite whenever $\Lambda$ is.
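The Hessian argument can also be checked numerically; this sketch (reusing the same kind of illustrative random covariance matrix as above, my own construction) verifies positive definiteness via a Cholesky factorisation and the eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical positive-definite covariance matrix, as in the question's setup.
n = 3
A = rng.standard_normal((n + 1, n + 1))
Lam = A @ A.T + (n + 1) * np.eye(n + 1)

# Hessian of f in (a_1, ..., a_n): constant, twice the lower-right submatrix.
H = 2 * Lam[1:, 1:]

# Positive definiteness: Cholesky succeeds only for PD matrices,
# and all eigenvalues of a PD matrix are strictly positive.
np.linalg.cholesky(H)  # raises LinAlgError if H is not positive definite
assert np.all(np.linalg.eigvalsh(H) > 0)
```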