Show $E\left(\mathbf{X}_i \otimes \mathbf{u}_i\right)=\mathbf{0}$ implies $E\left(\mathbf{X}_i^{\top}\mathbf{G}\mathbf{u}_i\right)=\mathbf{0}$


Let $\mathbf{X}_i$ be a $G \times K$ random matrix and $\mathbf{u}_i$ a $G \times 1$ random vector, and suppose we observe a sample of each for $i=1,\ldots,N$.

Suppose the following condition holds: \begin{equation} E\left(\mathbf{X}_i \otimes \mathbf{u}_i\right)=\mathbf{0} \end{equation} where "$\otimes$" denotes the Kronecker Product.

Now show that, for an arbitrary $G \times G$ symmetric matrix $\mathbf{G}$, the first condition implies \begin{equation} E\left(\mathbf{X}_i^{\top}\mathbf{G}\mathbf{u}_i\right)=\mathbf{0}. \end{equation}


The first condition (the one involving the Kronecker product) essentially says that every element of $\mathbf{X}_i$ is uncorrelated with every element of $\mathbf{u}_i$. The argument is then: because of the first condition, any linear combination of the elements of $\mathbf{X}_i$ is uncorrelated with $\mathbf{u}_i$. This is the explanation given, and supposedly it is trivial, but I'm trying to see it mathematically.
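As a sanity check on the claim, here is a quick NumPy simulation (the setup is my own illustration: entries of $\mathbf X_i$ and $\mathbf u_i$ are drawn as independent standard normals, so $E(\mathbf X_i \otimes \mathbf u_i)=\mathbf 0$ holds by construction, and the dimensions $G=3$, $K=2$ are arbitrary). The sample average of $\mathbf X_i^\top \mathbf G \mathbf u_i$ should be close to the $K \times 1$ zero vector:

```python
import numpy as np

rng = np.random.default_rng(42)
G_dim, K, N = 3, 2, 200_000   # illustrative sizes and sample size

# Draw X_i and u_i independently with zero-mean entries,
# so E(X_i kron u_i) = 0 holds by construction.
X = rng.normal(size=(N, G_dim, K))
u = rng.normal(size=(N, G_dim, 1))
Gm = rng.normal(size=(G_dim, G_dim))   # one fixed, arbitrary G x G matrix

# Sample average of X_i' G u_i over i = 1..N; a K x 1 vector
avg = np.mean(np.transpose(X, (0, 2, 1)) @ Gm @ u, axis=0)
print(avg)   # close to the zero vector for large N
```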

Thanks for your help!



BEST ANSWER

I think it can be helpful to leave the matrix world temporarily to see what it means in terms of scalar variables. Let's see if you agree...

Let $\mathbf X_i$ consist of elements $x_i^{(j, k)}$ and $\mathbf u_i$ of elements $u_i^{(l)}$, where $j=1, \dots, G$, $k=1, \dots, K$ and $l=1, \dots, G$. The $G^2\times K$ matrix $E(\mathbf X_i \otimes \mathbf u_i)$ consists of the elements $E(x_i^{(j, k)}u_i^{(l)})$, so the condition is equivalent to postulating $E(x_i^{(j, k)}u_i^{(l)})=0$ for all $j$, $k$ and $l$.

If we denote the elements of $\mathbf G$ by $g^{(m, l)}$, the product $\mathbf G \mathbf u_i$ is $G\times 1$ with $m$-th element $\sum_{l=1}^G g^{(m, l)}u_i^{(l)}$. Then $$ E(\mathbf X_i'\mathbf G \mathbf u_i)=E\left(\begin{pmatrix}x_i^{(1, 1)} & x_i^{(2, 1)} & \dots & x_i^{(G, 1)}\\ x_i^{(1, 2)} & x_i^{(2, 2)} & \dots & x_i^{(G, 2)}\\ \vdots & \vdots & \ddots & \vdots\\ x_i^{(1, K)} & x_i^{(2, K)} & \dots & x_i^{(G, K)} \end{pmatrix}\begin{pmatrix}\sum_{l=1}^G g^{(1, l)}u_i^{(l)}\\\sum_{l=1}^G g^{(2, l)}u_i^{(l)} \\ \vdots \\ \sum_{l=1}^G g^{(G, l)}u_i^{(l)}\end{pmatrix}\right)\\ =E\left(\begin{pmatrix}\sum_{j=1}^G\sum_{l=1}^G g^{(j, l)}x_i^{(j, 1)}u_i^{(l)}\\\sum_{j=1}^G\sum_{l=1}^G g^{(j, l)}x_i^{(j, 2)}u_i^{(l)} \\ \vdots \\ \sum_{j=1}^G\sum_{l=1}^G g^{(j, l)}x_i^{(j, K)}u_i^{(l)}\end{pmatrix}\right)\\ =\begin{pmatrix}\sum_{j=1}^G\sum_{l=1}^G g^{(j, l)}E(x_i^{(j, 1)}u_i^{(l)})\\\sum_{j=1}^G\sum_{l=1}^G g^{(j, l)}E(x_i^{(j, 2)}u_i^{(l)}) \\ \vdots \\ \sum_{j=1}^G\sum_{l=1}^G g^{(j, l)}E(x_i^{(j, K)}u_i^{(l)})\end{pmatrix}=\boldsymbol{0} $$ Every entry is a linear combination of products $x_i^{(j, k)}u_i^{(l)}$, each of which has zero expectation by assumption. I think this shows the "linear combination" part of the answer quite well.
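The entry-wise algebra above can be checked numerically. Here is a minimal NumPy sketch (the dimensions $G=3$, $K=2$ and the random draws are my own choices) confirming that the $k$-th entry of $\mathbf X_i^\top \mathbf G \mathbf u_i$ equals the double sum $\sum_{j}\sum_{l} g^{(j, l)}x_i^{(j, k)}u_i^{(l)}$:

```python
import numpy as np

rng = np.random.default_rng(0)
G_dim, K = 3, 2                       # illustrative sizes
X = rng.normal(size=(G_dim, K))       # plays the role of X_i
u = rng.normal(size=(G_dim, 1))       # plays the role of u_i
Gm = rng.normal(size=(G_dim, G_dim))  # arbitrary G x G matrix

# Matrix expression: X' G u, a K x 1 vector
lhs = X.T @ Gm @ u

# Entry-wise double sum: sum_j sum_l g^{(j,l)} x^{(j,k)} u^{(l)}
rhs = np.array([[sum(Gm[j, l] * X[j, k] * u[l, 0]
                     for j in range(G_dim) for l in range(G_dim))]
                for k in range(K)])

print(np.allclose(lhs, rhs))   # the two agree entry by entry
```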


OK, so in $E(\mathbf X_i \otimes \mathbf u_i)=\mathbf 0$ the zero on the right is a $G^2 \times K$ matrix. In this case, use the identity $\mathrm{vec}(\mathbf X_i^\top \mathbf G \mathbf u_i) = (\mathbf u_i^\top \otimes \mathbf X_i^\top)\mathrm{vec}(\mathbf G)$. Now $\mathbf u_i^\top \otimes \mathbf X_i^\top = (\mathbf u_i \otimes \mathbf X_i)^\top$, and $\mathbf u_i \otimes \mathbf X_i$ contains exactly the same products $x_i^{(j,k)}u_i^{(l)}$ as $\mathbf X_i \otimes \mathbf u_i$, only permuted. So $E(\mathbf X_i \otimes \mathbf u_i)=\mathbf 0$ gives $E(\mathbf u_i^\top \otimes \mathbf X_i^\top)=\mathbf 0$, and since $\mathbf G$ is nonrandom, $$E\big(\mathrm{vec}(\mathbf X_i^\top \mathbf G \mathbf u_i)\big) = E(\mathbf u_i^\top \otimes \mathbf X_i^\top)\,\mathrm{vec}(\mathbf G)=\mathbf 0$$ for all $\mathbf G$.
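The vec identity itself is easy to verify numerically; a small NumPy sketch (dimensions again arbitrary, and `vec` is a hypothetical helper implementing the column-stacking operator):

```python
import numpy as np

rng = np.random.default_rng(1)
G_dim, K = 3, 2
X = rng.normal(size=(G_dim, K))       # X_i
u = rng.normal(size=(G_dim, 1))       # u_i
Gm = rng.normal(size=(G_dim, G_dim))  # arbitrary G x G matrix

def vec(A):
    # stack the columns of A into one column (column-major vec)
    return A.reshape(-1, 1, order="F")

# vec(X' G u) = (u' kron X') vec(G), a special case of
# vec(ABC) = (C' kron A) vec(B)
lhs = vec(X.T @ Gm @ u)
rhs = np.kron(u.T, X.T) @ vec(Gm)
print(np.allclose(lhs, rhs))   # the identity holds
```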