I want to calculate the probability density function of the sum of two independent random variables. I know that it is given by the convolution of their densities:
$P_{U+V} (x)= (P_{U} * P_{V})(x)$
If $P_{U}$ and $P_{V}$ are Gaussian densities in one dimension, i.e. $P_{U}(x) = \mathcal{N}(\mu_{U},\sigma_{U}^{2})$ and $P_{V}(x) = \mathcal{N}(\mu_{V},\sigma_{V}^{2})$, then $P_{U+V} (x)= \mathcal{N}(\mu_{U} + \mu_{V},\sigma_{U}^{2}+\sigma_{V}^{2})$.
But what is the result if $P_{U}$ and $P_{V}$ are two-dimensional Gaussian densities? I cannot find the result of this convolution anywhere.
If $P_U=\mathcal N(M_U,C_U)$ and $P_V=\mathcal N(M_V,C_V)$ then $P_U\ast P_V=\mathcal N(M_U+M_V,C_U+C_V)$. Here the means $M$ are vectors of size $n\times 1$ and the covariances $C$ are $n\times n$ symmetric matrices.
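As a quick sanity check of this claim, here is a small NumPy sketch (the means and covariances are made-up example values, not from the question) that samples independent two-dimensional Gaussians and verifies that the empirical mean and covariance of $U+V$ approach $M_U+M_V$ and $C_U+C_V$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example parameters (chosen for illustration only).
M_U, M_V = np.array([1.0, -2.0]), np.array([0.5, 3.0])
C_U = np.array([[2.0, 0.3], [0.3, 1.0]])
C_V = np.array([[1.0, -0.4], [-0.4, 2.0]])

n = 200_000
U = rng.multivariate_normal(M_U, C_U, size=n)
V = rng.multivariate_normal(M_V, C_V, size=n)  # drawn independently of U
S = U + V

# Empirical statistics of U+V should approach M_U+M_V and C_U+C_V.
print(S.mean(axis=0))           # ≈ [1.5, 1.0]
print(np.cov(S, rowvar=False))  # ≈ [[3.0, -0.1], [-0.1, 3.0]]
```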
The shortest route might be to use the fact that $P_X=\mathcal N(M,C)$ if and only if, for every $x$ in $\mathbb R^n$, $E[\exp(\mathrm i\langle x,X\rangle)]=\exp(\mathrm i\langle x,M\rangle-\frac12\langle x,Cx\rangle)$ and the fact that $P_U\ast P_V$ is the distribution of $U+V$ when $U$ and $V$ are independent.
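Spelling out that route: for independent $U$ and $V$ the characteristic functions multiply, so

```latex
\begin{align*}
E\left[e^{\mathrm i\langle x,U+V\rangle}\right]
  &= E\left[e^{\mathrm i\langle x,U\rangle}\right]
     E\left[e^{\mathrm i\langle x,V\rangle}\right] \\
  &= e^{\mathrm i\langle x,M_U\rangle-\frac12\langle x,C_Ux\rangle}\,
     e^{\mathrm i\langle x,M_V\rangle-\frac12\langle x,C_Vx\rangle} \\
  &= e^{\mathrm i\langle x,M_U+M_V\rangle-\frac12\langle x,(C_U+C_V)x\rangle},
\end{align*}
```

which is exactly the characteristic function of $\mathcal N(M_U+M_V,\,C_U+C_V)$, and characteristic functions determine the distribution.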
Edit: Recall that if $X$ and $Y$ are independent random variables with PDFs $f_X$ and $f_Y$, respectively, then the PDF $f_Z$ of $Z=X+Y$ is given by $$ f_Z(z)=\int f_X(x)f_Y(z-x)\,\mathrm dx. $$
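This integral can also be checked numerically in the one-dimensional Gaussian case: discretize the convolution on a grid and compare it to the closed form $\mathcal N(\mu_U+\mu_V,\sigma_U^2+\sigma_V^2)$. The parameter values below are illustrative assumptions, not from the question:

```python
import numpy as np

def gauss(x, mu, sig):
    """1D Gaussian PDF N(mu, sig^2)."""
    return np.exp(-(x - mu) ** 2 / (2 * sig ** 2)) / (sig * np.sqrt(2 * np.pi))

# Hypothetical 1D parameters (illustration only).
mu_U, sig_U = 1.0, 0.8
mu_V, sig_V = -0.5, 1.2

# Symmetric grid; with an odd number of points, mode="same"
# aligns the discrete convolution output with the grid itself.
x = np.linspace(-15, 15, 4001)
dx = x[1] - x[0]
f_U = gauss(x, mu_U, sig_U)
f_V = gauss(x, mu_V, sig_V)

# Riemann-sum approximation of f_Z(z) = ∫ f_U(t) f_V(z - t) dt.
f_Z = np.convolve(f_U, f_V, mode="same") * dx

# Closed form: N(mu_U + mu_V, sig_U^2 + sig_V^2).
f_closed = gauss(x, mu_U + mu_V, np.sqrt(sig_U ** 2 + sig_V ** 2))
print(np.max(np.abs(f_Z - f_closed)))  # should be very small
```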