Joint Probability Factorization Into Product of Independents Via Eigenvectors


I am having a little trouble understanding an equation from Bishop's Pattern Recognition and Machine Learning. The equation in question, (2.56), is shown below:

(2.56): $$ p(\mathbf{y}) = p(\mathbf{x})\,|\mathbf{J}| = \prod_{j=1}^{D} \frac{1}{(2 \pi \lambda_j)^{1/2}} \exp\left\{ -\frac{y_j^2}{2 \lambda_j} \right\} $$

is derived by substituting the following identities into the Multivariate Gaussian:

(2.55): $$ |\boldsymbol{\Sigma}|^{1/2} = \prod_{j=1}^{D} \lambda_j^{1/2} $$

(2.50) and (2.51): $$ \Delta^2 = \sum_{i=1}^{D} \frac{y_i^2}{\lambda_i}, \qquad y_i = \mathbf{u}_i^{\mathrm{T}} (\mathbf{x} - \boldsymbol{\mu}) $$

Here the $\lambda_i$ are the eigenvalues of the covariance matrix and the $\mathbf{u}_i$ are the corresponding eigenvectors. I understand how plugging in (2.51) expands the equation into a product of independent distributions, but what confuses me is the denominator of (2.56). Shouldn't it be a product of all the eigenvalues, as defined by (2.55)? Or perhaps I am reading one of the equations wrong? I would love to get the correct understanding of this passage, as I've been puzzling over it all morning. Thanks!


1 Answer

BEST ANSWER

It is a product of all of the eigenvalues, as you have suggested. Equation (2.56) reads
$$ p(\textbf{y}) = \prod_{j=1}^N \frac{1}{(2 \pi \lambda_j)^{1/2}} \exp\left\{ -\frac{y_j^2}{2 \lambda_j} \right\}, $$
which, collecting the $(2\pi\lambda_j)^{1/2}$ factors together, is the same as
$$ p(\textbf{y}) = \frac{1}{(2 \pi)^{N/2} \left( \prod_{j=1}^N \lambda_j^{1/2} \right)} \exp\left\{ -\sum_{j=1}^N \frac{y_j^2}{2 \lambda_j} \right\}. $$
So the denominator does contain the product of all the eigenvalues' square roots, which by (2.55) is exactly $|\boldsymbol{\Sigma}|^{1/2}$.
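If it helps, here is a quick numerical sanity check (my own sketch, not from Bishop, assuming NumPy): build a random covariance matrix, eigendecompose it, and confirm that $|\boldsymbol{\Sigma}|^{1/2}$ equals the product of the $\lambda_j^{1/2}$, and that the product of 1-D Gaussians in the rotated coordinates $y_j = \mathbf{u}_j^{\mathrm{T}}(\mathbf{x}-\boldsymbol{\mu})$ reproduces the joint density.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 4

# Random symmetric positive-definite covariance matrix.
A = rng.standard_normal((N, N))
Sigma = A @ A.T + N * np.eye(N)

# Eigendecomposition: Sigma = U diag(lam) U^T, columns of U are eigenvectors.
lam, U = np.linalg.eigh(Sigma)

# Determinant identity (2.55): |Sigma|^{1/2} = prod_j lambda_j^{1/2}.
assert np.isclose(np.linalg.det(Sigma) ** 0.5, np.prod(lam ** 0.5))

# Evaluate the multivariate Gaussian at a random point x with mean mu.
mu = rng.standard_normal(N)
x = rng.standard_normal(N)
diff = x - mu
p_x = np.exp(-0.5 * diff @ np.linalg.inv(Sigma) @ diff) / (
    (2 * np.pi) ** (N / 2) * np.linalg.det(Sigma) ** 0.5
)

# Rotate into eigencoordinates, y_j = u_j^T (x - mu), and take the
# product of N independent 1-D Gaussians as in (2.56).
y = U.T @ diff
p_y = np.prod(np.exp(-y ** 2 / (2 * lam)) / np.sqrt(2 * np.pi * lam))

assert np.isclose(p_x, p_y)
print("product of 1-D Gaussians matches the joint density")
```

Both assertions pass, confirming that the factorized form and the original multivariate Gaussian agree, with the full eigenvalue product sitting in the denominator.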