Expected value of random matrix multiplication


Let $\bf{A}\in \mathbb{C}^{M\times N}$ and $\bf{B}\in \mathbb{C}^{N\times T}$ be independent random matrices whose entries are drawn i.i.d. from the complex Gaussian distributions $\mathcal{CN}(0, \xi^2)$ and $\mathcal{CN}(0, \sigma^2)$, respectively.

How can we evaluate $\mathbb{E}\{\bf{B}^*\bf{A}^*\bf{A}\bf{B}\}$?

(I guess the result is $\mathbb{E}\{\bf{B}^*\bf{A}^*\bf{A}\bf{B}\}=\xi^2\sigma^2MN\bf{I}_T$, but I failed to prove it. Here $(\cdot)^*$ denotes the conjugate transpose, and $\bf{I}_T$ is the identity matrix of size T.)

Thanks.

Not a complete answer

What follows is not a real proof, but I suspect that it's the gist of one.

And it's really YOUR proof, not mine, based on your explanation in a comment. I've ignored the complex-ness of things here, because I can't believe that it actually adds any interesting variation.

Step 1: Normalize so that the variances are both 1, so I don't have to write Greek letters. By this I mean: let $U$ be a matrix with i.i.d. $\mathcal{CN}(0, 1)$ entries and write $A = \xi U$. By linearity of expectation, you can pull $\xi^2$ out of the thing you're working with, once you rewrite it in terms of $U$. Do the same stunt for $B$. So now we've reduced the problem to one where the variances are 1.
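Concretely, the pull-out in Step 1 looks like this (my notation, writing $A = \xi U$ and $B = \sigma V$ with $U, V$ having unit-variance entries):

$$\mathbb{E}\{B'A'AB\} = \mathbb{E}\{(\sigma V)'(\xi U)'(\xi U)(\sigma V)\} = \xi^2\sigma^2\,\mathbb{E}\{V'U'UV\},$$

so it suffices to show $\mathbb{E}\{V'U'UV\} = MN\,I_T$.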

Step 2: Observe that in your $B'A'AB$ product (I'm using primes instead of superscript stars because I'm still lazy), the $(i,s)$ entry is a sum of products of the form $b_{ji}a_{nj}a_{np}b_{ps}$ (note the primes flip the index order on the first two factors). To compute the expectation of any individual such product, you need to compute a quadruple integral over the domains of the $a$s and the $b$s. By Fubini, you can rewrite that as an integral over the domains of the $a$s (for each possible point in the domains of the $b$s), and then integrate those results over the domains of the $b$s.

When we're doing the $a$ integral, the outer terms are constants, so let's just look at the expectation of $a_{nj}a_{np}$. As you observed in your comment, that expectation is zero unless the two factors are the same entry, i.e., unless $j = p$, in which case it equals the variance, which is $1$ after Step 1. Outside that case, we'll be integrating something involving $b$s multiplied by zero, so the overall expected value of all those terms is zero. In the surviving case $j = p$, we end up integrating something that looks like $b_{ji}b_{js}$. That integral is likewise zero unless $i = s$, i.e., unless it's a diagonal entry of your $B'A'AB$ matrix, in which case it's $1$.

Counting the surviving terms: $n$ runs over $1,\dots,M$ and $j$ over $1,\dots,N$, so each diagonal entry has expectation $MN$, which is exactly your guess once you put the $\xi^2\sigma^2$ back in. Short form: Fubini, Einstein summation, linearity of expectation, applied repeatedly.
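For what it's worth, the same constant drops out of a shorter route via independence and the tower rule (a sketch, assuming you're comfortable conditioning on $B$):

$$\mathbb{E}\{B'A'AB\} = \mathbb{E}_B\{B'\,\mathbb{E}_A\{A'A\}\,B\} = \mathbb{E}_B\{B'(M\xi^2 I_N)B\} = M\xi^2\,\mathbb{E}\{B'B\} = \xi^2\sigma^2 MN\, I_T,$$

using $\mathbb{E}\{A'A\} = M\xi^2 I_N$ and $\mathbb{E}\{B'B\} = N\sigma^2 I_T$, each of which is the same diagonal-matching argument applied to a single matrix.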

It's true that I'm assuming the hypotheses of Fubini's theorem are satisfied here, and I don't actually recall what they are offhand...but I'll bet they are. So that's a detail for you to check as well.
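And a quick Monte Carlo check (my own sketch, with arbitrarily chosen dimensions and variances that are not part of the original problem) agrees with the conjectured $\xi^2\sigma^2 MN\,\bf{I}_T$:

```python
import numpy as np

# Monte Carlo check of E{B* A* A B} = xi^2 sigma^2 M N I_T.
# M, N, T, xi2, sigma2 are arbitrary illustrative choices.
rng = np.random.default_rng(0)
M, N, T = 4, 3, 2
xi2, sigma2 = 2.0, 0.5      # variances of the CN entries
trials = 200_000

# CN(0, v) has real and imaginary parts that are each N(0, v/2).
A = np.sqrt(xi2 / 2) * (rng.standard_normal((trials, M, N))
                        + 1j * rng.standard_normal((trials, M, N)))
B = np.sqrt(sigma2 / 2) * (rng.standard_normal((trials, N, T))
                           + 1j * rng.standard_normal((trials, N, T)))

# Batched B* A* A B, averaged over the trials.
est = (B.conj().transpose(0, 2, 1)
       @ A.conj().transpose(0, 2, 1) @ A @ B).mean(axis=0)

print(est)  # should be close to xi2 * sigma2 * M * N * I_T = 12 * I_2
```

Here `est` should land within Monte Carlo error of $12\,I_2$ for these particular choices.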