Projection of a Gaussian distribution along a vector


Can anyone help me understand how to compute the projection of a 2D Gaussian distribution along a vector? I intuitively expect the projection to be a 1D Gaussian, but I want to be sure. Can someone show me a proof, or direct me to one, that projecting a 2D Gaussian along a vector yields a 1D Gaussian?

E.g., consider a Gaussian $\mathbf{X} \sim N(\mu,\Sigma)$ where $\mu = [3,2]^T$ and $\Sigma = \begin{bmatrix} 4 & 0 \\ 0 & 7 \end{bmatrix}$. What is the projection along the vector $v = 2i + 4j$?

Any help would be much appreciated!! Thanks



See https://en.wikipedia.org/wiki/Multivariate_normal_distribution, where it is stated that a distribution is multivariate normal if and only if every linear combination of its components is normally distributed. If I understand correctly, your "projection" defines exactly such a linear combination of the variables, so it is indeed (univariate) normal. Let me know if you meant something else by "projection", though.
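An empirical sanity check (a sketch, not a proof): sample from the 2D Gaussian in the question, project along $v$, and verify that the projected samples have the moments of a Gaussian (skewness $\approx 0$, excess kurtosis $\approx 0$). The variable names here are my own, not from the question.

```python
import numpy as np

# Parameters from the question.
mu = np.array([3.0, 2.0])
Sigma = np.array([[4.0, 0.0],
                  [0.0, 7.0]])
v = np.array([2.0, 4.0])          # direction v = 2i + 4j

rng = np.random.default_rng(0)
x = rng.multivariate_normal(mu, Sigma, size=200_000)  # shape (200000, 2)
y = x @ v                                             # projections v^T x

# For a Gaussian, skewness is 0 and excess kurtosis is 0.
z = (y - y.mean()) / y.std()
skew = np.mean(z**3)
excess_kurt = np.mean(z**4) - 3.0
print(skew, excess_kurt)          # both should be close to 0
```

Matching the first few moments is of course only suggestive; the if-and-only-if characterization above is what actually settles it.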


Let $\mathbf{x}\sim\mathcal{N}(\mu_x, \Sigma_x)$ be an $n$-dimensional Gaussian random vector, and let $y=\mathbf{v}^\top\mathbf{x}$ for some $\mathbf{v}\in\Bbb{R}^n$. Then
$$
y\sim\mathcal{N}(\mu_y, \sigma_y^2),
$$
where $\mu_y=\mathbf{v}^\top\mu_x$ and $\sigma_y^2=\mathbf{v}^\top\Sigma_x\mathbf{v}$. Indeed,
$$
\mu_y=\Bbb{E}[y]=\Bbb{E}[\mathbf{v}^\top\mathbf{x}]=\mathbf{v}^\top\Bbb{E}[\mathbf{x}]=\mathbf{v}^\top\mu_x
$$
and, using that $\mathbf{v}^\top(\mathbf{x}-\mu_x)$ is a scalar,
$$
\begin{aligned}
\sigma_y^2 &= \Bbb{E}[(y-\mu_y)^2]
= \Bbb{E}[(\mathbf{v}^\top\mathbf{x}-\mathbf{v}^\top\mu_x)^2]
= \Bbb{E}[(\mathbf{v}^\top(\mathbf{x}-\mu_x))^2] \\
&= \Bbb{E}[\mathbf{v}^\top(\mathbf{x}-\mu_x)\,\mathbf{v}^\top(\mathbf{x}-\mu_x)]
= \Bbb{E}[\mathbf{v}^\top(\mathbf{x}-\mu_x)(\mathbf{x}-\mu_x)^\top\mathbf{v}] \\
&= \mathbf{v}^\top\Bbb{E}[(\mathbf{x}-\mu_x)(\mathbf{x}-\mu_x)^\top]\mathbf{v}
= \mathbf{v}^\top\Sigma_x\mathbf{v}.
\end{aligned}
$$
(That $y$ is Gaussian at all, and not merely a variable with these two moments, follows from the linear-combination characterization of multivariate normality mentioned in the other answer.)
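Applying these formulas to the numbers in the question (a quick numerical sketch; variable names are my own), together with a Monte Carlo cross-check of the derivation:

```python
import numpy as np

mu_x = np.array([3.0, 2.0])
Sigma_x = np.array([[4.0, 0.0],
                    [0.0, 7.0]])
v = np.array([2.0, 4.0])

# Closed-form parameters from the derivation above.
mu_y = v @ mu_x            # v^T mu_x   = 2*3 + 4*2           = 14
var_y = v @ Sigma_x @ v    # v^T Sigma v = 2^2*4 + 4^2*7      = 128

# Monte Carlo estimate for comparison.
rng = np.random.default_rng(1)
y = rng.multivariate_normal(mu_x, Sigma_x, size=500_000) @ v

print(mu_y, var_y)         # 14.0 128.0
print(y.mean(), y.var())   # approximately 14 and 128
```

Note that this takes the "projection" to be the scalar $\mathbf{v}^\top\mathbf{x}$; if you want the projection onto the unit direction, replace $\mathbf{v}$ with $\mathbf{v}/\lVert\mathbf{v}\rVert$, which divides $\mu_y$ by $\lVert\mathbf{v}\rVert$ and $\sigma_y^2$ by $\lVert\mathbf{v}\rVert^2$.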