Why $E[d^TMd] = tr(\Sigma M)+tr(\mu\mu^T M)$ if $d$ is normal $(\mu,\Sigma)$?


I am trying to derive the equation below,

$E[d^TMd] = tr(\Sigma M)+tr(\mu\mu^T M)$

where $d \sim \phi(\mu,\Sigma)$,

$\phi(\cdot,\cdot)$ denotes the Gaussian distribution,

$M$ is an $n \times n$ matrix, and $d$ is an $n \times 1$ vector.

Since $d^TMd$ is a scalar, $tr(d^TMd) = d^TMd$, and by the cyclic property of the trace, $tr(d^TMd) = tr(dd^TM)$,

so $E[d^TMd] = E[tr(d^TMd)] = E[tr(dd^TM)]$

and I am stuck.

Best answer:

Recall that $E[dd^T]=\Sigma+\mu\mu^T$; this follows from rearranging $\Sigma = E[(d-\mu)(d-\mu)^T] = E[dd^T] - \mu\mu^T$. Then, since the trace is linear, it commutes with expectation, and it's just

$E[tr(dd^TM)] = \\ tr(E[dd^TM]) =\\ tr(E[dd^T]M) =\\ tr((\Sigma + \mu\mu^T)M) = \\ tr(\Sigma M + \mu\mu^T M) = \\ tr(\Sigma M) + tr(\mu\mu^TM)$
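The identity is easy to sanity-check numerically. The sketch below (with an arbitrary mean, covariance, and $M$ of my own choosing, not from the question) compares a Monte Carlo estimate of $E[d^TMd]$ against $tr(\Sigma M)+tr(\mu\mu^T M)$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
mu = np.array([1.0, -2.0, 0.5])
A = rng.standard_normal((n, n))
Sigma = A @ A.T + n * np.eye(n)   # symmetric positive definite covariance
M = rng.standard_normal((n, n))   # M need not be symmetric

# Closed form: tr(Sigma M) + tr(mu mu^T M)
closed_form = np.trace(Sigma @ M) + np.trace(np.outer(mu, mu) @ M)

# Monte Carlo estimate of E[d^T M d] over many draws d ~ N(mu, Sigma)
d = rng.multivariate_normal(mu, Sigma, size=500_000)
mc = np.einsum('ij,jk,ik->i', d, M, d).mean()

print(closed_form, mc)  # the two values should agree closely
```

With this many samples the Monte Carlo mean and the closed form typically agree to within a fraction of a percent of the scale of the quantity.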