I found a few threads about this but none of them answered my question.
I am supposed to show that if you have jointly Gaussian random variables $X_1, X_2$ that satisfy $$E(X_1)E(X_2) = E(X_1 X_2),$$ which is our definition of being uncorrelated, then they are also independent.
Most answers use the fact that in that case the covariance matrix of the multivariate normal distribution has to be diagonal, but I don't see the relationship between $$E(X_1)E(X_2) = E(X_1 X_2)$$ and this diagonality.
By the way: I know that this would solve the problem, since in that case we would have $f_{(X_1,X_2)} = f_{X_1}f_{X_2}$, which would mean that they are independent; it is the first step that I still don't understand.
If $f(x_1,\ldots,x_n)$ is a joint Gaussian density such that $X_1,\ldots,X_n$ are pairwise uncorrelated, then their covariance matrix $\Sigma$ is diagonal. Expanding the exponent $-\frac{1}{2}(x-\mu)^T \Sigma^{-1} (x-\mu)$, and noting that $\Sigma$ is diagonal (and thus $\Sigma^{-1}$ is too, with entries $1/\sigma_i^2$), gives $-\frac{1}{2}\sum_i (x_i - \mu_i)^2/\sigma_i^2$. Since $e^{a+b} = e^a e^b$, the joint density factors into a product of one-dimensional Gaussian densities with means $\mu_i$ and variances $\sigma_i^2$. Thus, uncorrelated and jointly Gaussian implies independent.
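To make the factorization concrete, here is the general argument specialized to your two-variable case, with $\Sigma = \operatorname{diag}(\sigma_1^2, \sigma_2^2)$, so that $\det \Sigma = \sigma_1^2 \sigma_2^2$ and the normalizing constant splits along with the exponential:
$$f_{(X_1,X_2)}(x_1,x_2) = \frac{1}{2\pi\sigma_1\sigma_2}\exp\!\left(-\frac{(x_1-\mu_1)^2}{2\sigma_1^2} - \frac{(x_2-\mu_2)^2}{2\sigma_2^2}\right) = \underbrace{\frac{1}{\sqrt{2\pi}\,\sigma_1}e^{-(x_1-\mu_1)^2/(2\sigma_1^2)}}_{f_{X_1}(x_1)} \cdot \underbrace{\frac{1}{\sqrt{2\pi}\,\sigma_2}e^{-(x_2-\mu_2)^2/(2\sigma_2^2)}}_{f_{X_2}(x_2)}.$$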
Reversing this proof shows that independent Gaussian RVs form a joint Gaussian distribution in which the different RVs are uncorrelated.
Note the definition of $X_i, X_j$ being uncorrelated is $E[(X_i - \mu_i)(X_j - \mu_j)] = 0$. This is precisely the $(i,j)$-th element of the covariance matrix for $i \neq j$; for $i = j$, we simply get the variances on the diagonal.
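This is the bridge to your condition: expanding the product and using linearity of expectation,
$$\Sigma_{ij} = E[(X_i - \mu_i)(X_j - \mu_j)] = E[X_i X_j] - E[X_i]E[X_j],$$
so $E(X_1)E(X_2) = E(X_1 X_2)$ says exactly that the off-diagonal entries $\Sigma_{12} = \Sigma_{21}$ vanish, i.e., that $\Sigma$ is diagonal.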