Checking Exercise 3.3.3 from R. Vershynin's *High-Dimensional Probability*
Let $G$ be an $m \times n$ Gaussian random matrix, i.e. the entries of $G$ are independent $N(0,1)$ random variables. Let $u \in \mathbb{R}^n$ be a fixed unit vector. Then $Gu \sim N(0, I_m)$.
I went about it by multiplying the matrix $G$ by the vector $u$, which yields the $m \times 1$ vector $$ Gu = \begin{pmatrix} G_{11}u_1 + \cdots + G_{1n}u_n \\ \vdots \\ G_{m1}u_1 + \cdots + G_{mn}u_n \end{pmatrix}. $$
I can show that the vector $Gu$ has expected value zero, since $G_{ij} \sim N(0,1)$ and each $u_j$ is a constant. However, I'm struggling to show that the covariance matrix of $Gu$ is the $m \times m$ identity: in each row I end up with a sum of the variances of the $G_{ij}$ weighted by the squared entries $u_j^2$, and I don't see how to conclude. Can someone help me with this?
First, note that $(Gu)_{i}=\sum_{j}G_{ij}u_{j}$, so each coordinate of $Gu$ is a linear combination of independent Gaussians; hence $Gu$ is a jointly Gaussian vector, and it suffices to check that its mean is zero and its covariance is $I_m$. Denoting by $\delta_{ij}$ the Kronecker delta, and using independence of the entries in the form $\mathbb{E}[G_{ij}G_{i'j'}]=\delta_{ii'}\delta_{jj'}$, we have \begin{multline*} \mathbb{E}\left[(Gu)_{i}(Gu)_{i^{\prime}}\right]=\mathbb{E}\left[\left(\sum_{j}G_{ij}u_{j}\right)\left(\sum_{j}G_{i^{\prime}j}u_{j}\right)\right]=\mathbb{E}\left[\sum_{jj^{\prime}}G_{ij}G_{i^{\prime}j^{\prime}}u_{j}u_{j^{\prime}}\right]\\ =\sum_{jj^{\prime}}\mathbb{E}\left[G_{ij}G_{i^{\prime}j^{\prime}}\right]u_{j}u_{j^{\prime}}=\sum_{jj^{\prime}}\delta_{ii^{\prime}}\delta_{jj^{\prime}}u_{j}u_{j^{\prime}}=\delta_{ii^{\prime}}\sum_{j}u_{j}^{2}=\delta_{ii^{\prime}}\Vert u\Vert^{2}=\delta_{ii^{\prime}} \end{multline*} as desired.
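As a sanity check (not part of the proof), here is a small Monte Carlo sketch in NumPy: it draws many independent Gaussian matrices $G$, forms $Gu$ for a fixed unit vector $u$, and checks that the sample mean is near $0$ and the sample covariance is near $I_m$. The dimensions, seed, sample size, and choice of $u$ are all arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 5, 8
trials = 200_000

# Fixed unit vector u (any unit vector works; this one is arbitrary).
u = rng.standard_normal(n)
u /= np.linalg.norm(u)

# Draw `trials` independent m-by-n Gaussian matrices and compute Gu for each.
G = rng.standard_normal((trials, m, n))
samples = G @ u  # shape (trials, m): each row is one realization of Gu

mean = samples.mean(axis=0)           # should be close to the zero vector
cov = np.cov(samples, rowvar=False)   # should be close to the m-by-m identity

print(np.round(mean, 2))
print(np.round(cov, 2))
```

With $2 \times 10^5$ samples the entrywise sampling error is on the order of $1/\sqrt{\text{trials}} \approx 0.002$, so the printed mean and covariance should match $0$ and $I_5$ to two decimal places.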