Variance of quadratic forms: possibly wrong formula in "Linear Regression Analysis" by George A. F. Seber and Alan J. Lee?


I have an example where I tried to use the formula on page 10 of "Linear Regression Analysis" by George A. F. Seber and Alan J. Lee.

They state the following: let $X$ be a vector of independent random variables with mean $\Theta = E(X)$ and common central moments $\mu_r = E\big[(X_i - \Theta_i)^r\big]$. Then the variance of the quadratic form $X^T A X$ is given as

$Var(X^T A X) = (\mu_4 - 3 \mu_2^2)\,a^T a + 2\mu_2^2 \operatorname{tr}(A^2) + 4\mu_2\,\Theta^T A^2 \Theta + 4 \mu_3\,\Theta^T A a$,

where $A$ is a symmetric matrix and $a$ is a column vector of diagonal elements of $A$.
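To make the bookkeeping explicit, here is a small sketch of the formula in plain Python (my own helper, not from the book; the function name and argument names are my choices):

```python
def quad_form_variance(A, theta, mu2, mu3, mu4):
    """Var(X^T A X) for X with independent components, E(X) = theta,
    and common central moments mu2, mu3, mu4, per the quoted formula."""
    n = len(A)
    a = [A[i][i] for i in range(n)]                       # diagonal of A
    ata = sum(x * x for x in a)                           # a^T a
    tr_A2 = sum(A[i][j] * A[j][i]                         # tr(A^2)
                for i in range(n) for j in range(n))
    At = [sum(A[i][j] * theta[j] for j in range(n))       # A * theta
          for i in range(n)]
    tA2t = sum(x * x for x in At)                         # theta^T A^2 theta (A symmetric)
    tAa = sum(theta[i] * A[i][j] * a[j]                   # theta^T A a
              for i in range(n) for j in range(n))
    return ((mu4 - 3 * mu2**2) * ata + 2 * mu2**2 * tr_A2
            + 4 * mu2 * tA2t + 4 * mu3 * tAa)
```

For $X_i \sim N(0,1)$ we have $\mu_2 = 1$, $\mu_3 = 0$, $\mu_4 = 3$, so the first and last terms vanish and the formula reduces to $2\operatorname{tr}(A^2) + 4\,\Theta^T A^2 \Theta$. With the matrix and mean vectors from the example below, this reproduces the values 72 and 64 reported further down.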

I tried this on the following example with two different mean vectors:

Let $A = \begin{pmatrix} 1 & -1 & 0 & 0 & 0 & 0 \\ -1 & 2 & -1 & 0 & 0 & 0\\ 0 & -1 & 2 & -1 & 0 & 0\\ 0 & 0 & -1 & 2 & -1 & 0\\ 0 & 0 & 0 & -1 & 2 & -1\\ 0 & 0 & 0 & 0 & -1 & 1\\ \end{pmatrix}$

$a = \begin{pmatrix} 1 \\ 2 \\ 2 \\ 2 \\ 2 \\ 1 \\ \end{pmatrix}$

$X$ with the $X_i$ independent and normally distributed with $E(X_i) = 0$ and $Var(X_i) = 1$.

$ \Theta_1 = \begin{pmatrix} 0 \\ 0 \\ 1 \\ 1 \\ 2 \\ 2 \\ \end{pmatrix}$ and $ \Theta_2 = \begin{pmatrix} 0 \\ 1 \\ 2 \\ 2 \\ 2 \\ 2 \\ \end{pmatrix}$

Therefore, $E(X + \Theta_1) = \Theta_1$ and $E(X + \Theta_2) = \Theta_2$.

We have that $\Theta_1^T A \Theta_1 = 2 = \Theta_2^T A \Theta_2$, so I would expect the variances to be equal as well, i.e.,

$Var((X + \Theta_1)^T A (X + \Theta_1) ) = Var((X + \Theta_2)^T A (X + \Theta_2))$, but in this example I get

$Var((X + \Theta_1)^T A (X + \Theta_1) ) = 72$ and $Var((X + \Theta_2)^T A (X + \Theta_2)) = 64$.
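For what it's worth, here is a quick Monte Carlo sanity check I ran against these two numbers (my own sketch; the function name, sample size, and seed are arbitrary choices):

```python
import random

def simulate_variance(A, theta, n_samples=100_000, seed=0):
    """Sample variance of (X + theta)^T A (X + theta) with X_i ~ N(0, 1) i.i.d."""
    rng = random.Random(seed)
    n = len(A)
    vals = []
    for _ in range(n_samples):
        # Draw Y = X + theta with standard normal X_i, then form Y^T A Y.
        y = [rng.gauss(0.0, 1.0) + theta[i] for i in range(n)]
        vals.append(sum(y[i] * A[i][j] * y[j]
                        for i in range(n) for j in range(n)))
    mean = sum(vals) / n_samples
    return sum((v - mean) ** 2 for v in vals) / (n_samples - 1)
```

In my runs the estimates land near 72 for $\Theta_1$ and near 64 for $\Theta_2$, i.e., they agree with what the formula gives me.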

Does anyone know what I did wrong here?