A question about the variance-covariance matrix.


If $X$ is a vector of random variables such that no element of $X$ is a linear combination of the remaining elements [i.e., there do not exist $a\ (\neq 0)$ and $b$ such that $a'X=b$ for all values of $X=x$], then $Var(X)$ is a positive-definite matrix.

This is a theorem from George A. F. Seber's book *Linear Regression Analysis*. The proof is as follows:

For any vector $c$ we have $c'\,Var[X]\,c = Var[c'X]\ge 0$, with equality if and only if $c'X$ is a constant.
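The identity $c'\,Var[X]\,c = Var[c'X]$ that the proof relies on can be checked numerically. The following is a minimal sketch (not part of Seber's proof), using simulated data and an arbitrarily chosen vector $c=(2,-3)'$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two independent standard normal variables, stacked into a vector X
# (each row is one component, each column one sample).
X = rng.standard_normal((2, 50_000))

c = np.array([2.0, -3.0])

# Quadratic form c' Var[X] c, using the sample covariance matrix.
quad_form = c @ np.cov(X) @ c

# Variance of the scalar random variable c'X, from the same samples
# (ddof=1 to match np.cov's unbiased normalization).
var_cX = np.var(c @ X, ddof=1)

print(quad_form, var_cX)  # equal up to floating-point rounding
```

The two numbers agree exactly (up to rounding), because the identity is algebraic and holds for sample moments as well as population moments.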

My question is: are the two statements 'no element of $X$ is a linear combination of the remaining elements' and 'there do not exist $a\ (\neq 0)$ and $b$ such that $a'X=b$ for all values of $X=x$' in the theorem equivalent?


Actually, no. The second statement says that no linear combination of the components of $X$ is a constant random variable (not zero, nor any other constant). This is the same as saying that the components of $X$, together with the random variable that equals $1$ with probability $1$, are linearly independent.

For instance, if $Y$ has, say, a standard normal distribution, then $Y$ and $Y+1$ are linearly independent ($Y$ is not a linear combination of $Y+1$, that is, a scalar multiple, nor is the converse true). But the set $\{Y,\,Y+1,\,1\}$ is in fact linearly dependent, since, for instance, $Y=1\cdot (Y+1)+(-1)\cdot 1$.

In any case, if $\vec X=(Y,Y+1)$, then $Var(\vec X)$ is positive semi-definite but not positive definite. Indeed, $$Var(Y)=Var(Y+1)=\sigma^2,$$ and also $$Cov(Y,Y+1)=Cov(Y,Y)=Var(Y)=\sigma^2.$$

So $$Var(\vec X)=\left(\begin{matrix}\sigma^2 &\sigma^2 \\ \sigma^2 &\sigma^2 \\\end{matrix}\right).$$
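A quick numerical check of this matrix (a sketch, not part of the original answer): simulate $Y$, form $\vec X=(Y,Y+1)$, and verify that the sample covariance matrix is singular, with the vector $c=(1,-1)'$ witnessing the failure of positive definiteness:

```python
import numpy as np

rng = np.random.default_rng(0)

# Y ~ N(0, 1), and X = (Y, Y + 1): the second component is the first
# plus a constant, so X carries only one source of randomness.
y = rng.standard_normal(100_000)
X = np.vstack([y, y + 1])

# Sample covariance matrix; all four entries estimate Var(Y) = sigma^2 = 1,
# matching the matrix [[sigma^2, sigma^2], [sigma^2, sigma^2]] above.
S = np.cov(X)

# c = (1, -1) gives c'X = Y - (Y + 1) = -1, a constant, so
# c' Var(X) c = Var(-1) = 0: semi-definite, but not definite.
c = np.array([1.0, -1.0])
print(S)
print(c @ S @ c)  # 0 up to floating-point rounding
```

One eigenvalue of $S$ is (numerically) zero, confirming that the matrix is singular and hence not positive definite.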

In the end, what the property means is that positive definiteness of the covariance matrix is equivalent to the distribution of $\vec X$ not being entirely contained in any line, plane, or hyperplane (whether or not it passes through the origin).