Multiplication of normally distributed variables by an orthogonal matrix


Let $A\in\mathbb{R}^{n\times n}$ be an orthogonal matrix and let $X_1,\ldots, X_n\sim N(0,\sigma^2)$ be independent. I want to show that $Y_1,\ldots,Y_n$ are again $N(0,\sigma^2)$ distributed and independent, where $Y=AX$.

So $$Y_i=\sum_{j=1}^n a_{ij}X_j$$ is $$N\Big(0,\sum_{j=1}^n a^2_{ij}\cdot\sigma^2\Big)$$ distributed, but $a_i^Ta_i=\sum_{j=1}^n a^2_{ij}=1$, since the $i$-th row $(a_{i1},\ldots, a_{in})$ of $A$ is a unit vector. Is this right so far?
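As a quick numerical sanity check of this step (a sketch using NumPy; the random orthogonal matrix built here via a QR decomposition is purely illustrative and not part of the question):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Build a random orthogonal matrix A as the Q factor of a QR decomposition.
A, _ = np.linalg.qr(rng.standard_normal((n, n)))

# Each row (a_i1, ..., a_in) of a square orthogonal matrix is a unit vector,
# so Var(Y_i) = sum_j a_ij^2 * sigma^2 = sigma^2.
row_norms_sq = np.sum(A**2, axis=1)
print(np.allclose(row_norms_sq, 1.0))  # True
```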

Thanks in advance!

Best answer:

The pdf of $X=(X_1,X_2,\ldots,X_n)$ is

$$f_X(x)=\frac{1}{(\sigma\sqrt{2\pi})^n} e^{-\frac{1}{2\sigma^2}\sum x_i^2}\quad,\,x\in\mathbb R^n$$

You are transforming $X\to (Y_1,\ldots,Y_n)=Y$ such that $Y=AX$ where $A$ is orthogonal.

Since $A$ is invertible, the map $x\mapsto y=Ax$ is a bijection, so $$x\in\mathbb R^n\implies y\in\mathbb R^n$$

Moreover, $$\sum_{i=1}^n y_i^2=y^T y=x^T (A^T A)x=x^T x=\sum_{i=1}^n x_i^2$$
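The norm-preservation step can be checked numerically as well (again a sketch; `np.linalg.qr` is used only to supply some orthogonal $A$):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
A, _ = np.linalg.qr(rng.standard_normal((n, n)))  # random orthogonal A

x = rng.standard_normal(n)
y = A @ x

# A^T A = I, so y^T y = x^T (A^T A) x = x^T x.
print(np.allclose(y @ y, x @ x))  # True
```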

By the change-of-variables formula, and since $|\det(A^{-1})|=|\det A|^{-1}=1$ for orthogonal $A$, the pdf of $Y$ is

$$f_Y(y)=f_X(A^{-1}y)|\det(A^{-1})|=\frac{1}{(\sigma\sqrt{2\pi})^n} e^{-\frac{1}{2\sigma^2}\sum y_i^2}\quad,\,y\in\mathbb R^n$$

This is exactly the joint pdf of $n$ independent $N(0,\sigma^2)$ variables, which proves the claim.
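To close the loop, here is a Monte Carlo sketch (all names and parameters illustrative) checking that the sample covariance of $Y=AX$ is close to $\sigma^2 I$: the components have the claimed variance, and since they are jointly Gaussian, zero correlation implies independence.

```python
import numpy as np

rng = np.random.default_rng(2)
n, sigma, m = 3, 2.0, 200_000

A, _ = np.linalg.qr(rng.standard_normal((n, n)))  # random orthogonal A
X = sigma * rng.standard_normal((m, n))           # m iid samples of (X_1, ..., X_n)
Y = X @ A.T                                       # apply Y = A X to each sample (row)

# The sample covariance of Y should approximate sigma^2 * I: unit-norm rows of A
# give Var(Y_i) = sigma^2, and orthogonal rows give Cov(Y_i, Y_j) = 0 for i != j.
cov = np.cov(Y, rowvar=False)
print(np.round(cov, 2))
```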