Theorem: Let $X \in \mathbb{R}^d$ be a Gaussian vector and $\Gamma_X$ its covariance matrix. Then $X_1,\dots,X_d$ are independent if and only if $\Gamma_X$ is a diagonal matrix (i.e. the variables are pairwise uncorrelated).
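To convince myself the claim is at least plausible, I ran a quick simulation (just a sketch; the nonlinear test functions are an arbitrary choice of mine): with a diagonal covariance matrix, even nonlinear functions of different coordinates come out uncorrelated, as independence would predict.

```python
import numpy as np

rng = np.random.default_rng(42)

# Gaussian vector with a diagonal covariance matrix
n = 1_000_000
X = rng.multivariate_normal(mean=[0.0, 0.0], cov=np.diag([1.0, 4.0]), size=n)

# Independence would give E[f(X_1) g(X_2)] = E[f(X_1)] E[g(X_2)] for any f, g;
# test it with the (arbitrarily chosen) nonlinear functions f(t) = t^2, g(t) = |t|.
f = X[:, 0] ** 2
g = np.abs(X[:, 1])
print(np.mean(f * g) - np.mean(f) * np.mean(g))  # close to 0
```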
My attempt: One direction is easy: if $X_1,\dots,X_d$ are independent, then the off-diagonal entries of $\Gamma_X$, being covariances of independent variables, all equal $0$, so $\Gamma_X$ is diagonal.
What troubles me is the other direction. I thought about starting with a vector of $d$ independent standard normal variables and then scaling their variances by pre-multiplying the vector by a matrix, which preserves the property of being a Gaussian vector. (Notice that WLOG we can assume the vector has entries with mean $0$.) However, this doesn't look very rigorous, since I end up with two vectors giving the same covariance matrix, with only one of them known to have independent entries.
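To make the construction I have in mind concrete, here is a quick numpy sketch (the matrix $A$ below is a hypothetical choice, any matrix would do): starting from a vector $Z$ of independent standard normals, $X = AZ$ is again Gaussian, with covariance matrix $AA^T$.

```python
import numpy as np

rng = np.random.default_rng(0)

d, n = 3, 200_000
A = np.array([[2.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [0.0, 3.0, 1.0]])  # hypothetical illustrative matrix

Z = rng.standard_normal((d, n))  # columns are vectors of d independent N(0,1)'s
X = A @ Z                        # still Gaussian, covariance A A^T

print(np.cov(X))   # empirical covariance of X, approximately A @ A.T
print(A @ A.T)
```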
For the other direction, assume $\Gamma$ is diagonal with diagonal entries $\sigma_1^2,\dots,\sigma_d^2$, and WLOG that the means are zero and $\Gamma$ is invertible (so that the joint density exists). The joint density of $\bf X$ is proportional to $$ \exp\left(-\frac12 \mathbf{x}^T {\Gamma}^{-1}\mathbf{x}\right)\tag1 $$ Since $\Gamma$ is diagonal, $\Gamma^{-1}$ is diagonal with entries $1/\sigma_i^2$, so (1) equals $$ \exp\left(-\frac12\sum_{i=1}^d x_i^2/\sigma_i^2\right)=\prod_{i=1}^d\exp\left(-\frac12 x_i^2/\sigma_i^2\right)\tag2 $$ Comparing (1) and (2), the joint density factors into a product of $d$ individual marginal normal densities, so $X_1,\dots,X_d$ are independent.
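As a numerical check of the factorization in (1)–(2) (a sketch using scipy, with arbitrarily chosen variances): for a diagonal $\Gamma$, the joint normal density evaluated at a few test points coincides with the product of the univariate marginal densities.

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

rng = np.random.default_rng(1)

sigma2 = np.array([0.5, 2.0, 4.0])   # arbitrary diagonal entries of Gamma
Gamma = np.diag(sigma2)

x = rng.standard_normal((5, 3))      # a few test points

joint = multivariate_normal(mean=np.zeros(3), cov=Gamma).pdf(x)
marginals = np.prod(norm.pdf(x, loc=0.0, scale=np.sqrt(sigma2)), axis=1)

print(np.allclose(joint, marginals))  # True: the joint density factors
```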
As for the easy direction: independent variables are always uncorrelated. If $X_i$ is independent of $X_j$ then $$ E(X_iX_j)=E(X_i)E(X_j) $$ which rearranges to $\operatorname{cov}(X_i,X_j):=E(X_iX_j)-E(X_i)E(X_j)=0$.
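Numerically (again just a sketch, with two arbitrarily chosen independent samples), this identity shows up as an empirical covariance near zero:

```python
import numpy as np

rng = np.random.default_rng(2)

n = 1_000_000
xi = rng.normal(loc=1.0, scale=2.0, size=n)    # arbitrary independent samples
xj = rng.normal(loc=-3.0, scale=0.5, size=n)

# cov(X_i, X_j) = E(X_i X_j) - E(X_i) E(X_j), close to 0 under independence
print(np.mean(xi * xj) - np.mean(xi) * np.mean(xj))
```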