I have a set $X = \left\{ X_i \mid i \in \{1, \dots, n\} \right\}$, where each $X_i$ is a random variable.
Does $\forall i \in \{1, \dots, n\},\ X_i$ following a normal distribution imply that $\exists\, b \in \mathbb{R}$ such that $\operatorname{Cov}(X) = b I_n \Leftrightarrow$ the $X_i$ are i.i.d.? (Here $\operatorname{Cov}(X)$ denotes the covariance matrix of the vector $(X_1, \dots, X_n)$.)
If so, what other commonly used distributions have this property?
I know that if $x$ and $y$ are two independent random variables, then their covariance is zero, but the converse is not true in general.
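For concreteness, here is a minimal NumPy sketch (the construction and variable names are my own, not from the question) of the standard counterexample: take $x \sim \mathcal{N}(0,1)$ and an independent random sign $s = \pm 1$, and set $y = s\,x$. Then $y \sim \mathcal{N}(0,1)$ and $\operatorname{Cov}(x, y) = \mathbb{E}[s]\,\mathbb{E}[x^2] = 0$, yet $|y| = |x|$, so $x$ and $y$ are dependent.

```python
import numpy as np

# Standard counterexample (sketch): x ~ N(0,1), s = +/-1 an independent
# random sign, y = s*x.  Both marginals are N(0,1) and Cov(x, y) = 0,
# but |y| == |x|, so x and y are not independent.
rng = np.random.default_rng(0)
n = 1_000_000

x = rng.standard_normal(n)
s = rng.choice([-1.0, 1.0], size=n)
y = s * x

print("sample Cov(x, y):", np.cov(x, y)[0, 1])        # close to 0
print("sample Corr(|x|, |y|):",
      np.corrcoef(np.abs(x), np.abs(y))[0, 1])        # exactly 1, since |y| == |x|
```

Note that in this example each marginal is normal, but the pair $(x, y)$ is not jointly normal, which is why zero covariance fails to imply independence here.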