Let $\mathcal{N}(\mathbf{x}; \boldsymbol{\mu}, \mathsf{\Sigma})$ be a multivariate Gaussian Probability Density Function (PDF), so
\begin{equation} \mathcal{N}(\mathbf{x};\boldsymbol{\mu},\mathsf{\Sigma}) := \frac{1}{\sqrt{\det2\pi\mathsf{\Sigma}}} \exp\Big(-\frac{1}{2}(\mathbf{x} - \boldsymbol{\mu})^{\top}\mathsf{\Sigma}^{-1}(\mathbf{x} - \boldsymbol{\mu})\Big), \end{equation}
where $\mathbf{x}, \boldsymbol{\mu} \in \mathbb{R}^m$ and $\mathsf{\Sigma} \in \mathbb{R}^{m \times m}$ is Symmetric Positive Definite (SPD). Let $\mathbf{Y}_1 \in \mathbb{R}^n$ and $\mathbf{Y}_2 \in \mathbb{R}^m$ be two random vectors with the joint PDF
\begin{equation*} p_{\mathbf{Y}_{1}, \mathbf{Y}_{2}}(\mathbf{y}_{1}, \mathbf{y}_{2}) = \mathcal{N} \left( \begin{bmatrix} \mathbf{y}_1 \\ \mathbf{y}_2 \end{bmatrix} ; \begin{bmatrix} \mathbf{0}_n \\ \mathbf{0}_m \end{bmatrix} , \begin{bmatrix} \mathsf{\Sigma}_{11} & \mathsf{\Sigma}_{12} \\ \mathsf{\Sigma}_{21} & \mathsf{\Sigma}_{22} \end{bmatrix} \right). \end{equation*}
Furthermore, let $\boldsymbol{\mathcal{E}}$ be a Gaussian noise vector with PDF $p_{\boldsymbol{\mathcal{E}}}(\boldsymbol{\epsilon})=\mathcal{N}(\boldsymbol{\epsilon}; \mathbf{0}, \sigma_{\epsilon}^2\mathsf{I_n})$, which is independent of $\mathbf{Y}_1$ and $\mathbf{Y}_2$. Define $\mathbf{Z}_1=\mathbf{Y}_1+\boldsymbol{\mathcal{E}}$. The question is to find the joint PDF $p_{\mathbf{Z}_{1}, \mathbf{Y}_{2}}(\mathbf{z}_{1}, \mathbf{y}_{2})$. I have been told that the answer is
\begin{equation*} p_{\mathbf{Z}_{1}, \mathbf{Y}_{2}}(\mathbf{z}_{1}, \mathbf{y}_{2}) = \mathcal{N} \left( \begin{bmatrix} \mathbf{z}_1 \\ \mathbf{y}_2 \end{bmatrix} ; \begin{bmatrix} \mathbf{0}_n \\ \mathbf{0}_m \end{bmatrix} , \begin{bmatrix} \mathsf{\Sigma}_{11}+\sigma_{\epsilon}^2\mathsf{I}_n & \mathsf{\Sigma}_{12} \\ \mathsf{\Sigma}_{21} & \mathsf{\Sigma}_{22} \end{bmatrix} \right). \end{equation*}
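As a numerical sanity check of this claim: $(\mathbf{Z}_1, \mathbf{Y}_2)$ is a linear map of the jointly Gaussian vector $\mathbf{W} = (\mathbf{Y}_1, \mathbf{Y}_2, \boldsymbol{\mathcal{E}})$, so its covariance is $\mathsf{A}\,\mathrm{cov}(\mathbf{W})\,\mathsf{A}^{\top}$. The sketch below (with arbitrary illustrative sizes $n=2$, $m=3$ and $\sigma_{\epsilon}^2 = 0.5$, not taken from the question) confirms the block structure:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, sigma2 = 2, 3, 0.5  # illustrative sizes and noise variance

# Build a random SPD joint covariance for (Y1, Y2).
B = rng.standard_normal((n + m, n + m))
S = B @ B.T + (n + m) * np.eye(n + m)
S11, S12 = S[:n, :n], S[:n, n:]
S21, S22 = S[n:, :n], S[n:, n:]

# cov(W) is block-diagonal: (Y1, Y2) and E are independent.
C = np.zeros((2 * n + m, 2 * n + m))
C[: n + m, : n + m] = S
C[n + m :, n + m :] = sigma2 * np.eye(n)

# (Z1, Y2) = (Y1 + E, Y2) = A @ (Y1, Y2, E).
A = np.zeros((n + m, 2 * n + m))
A[:n, :n] = np.eye(n)          # Z1 picks up Y1 ...
A[:n, n + m :] = np.eye(n)     # ... plus E
A[n:, n : n + m] = np.eye(m)   # Y2 is passed through unchanged

cov_Z1Y2 = A @ C @ A.T
expected = np.block([[S11 + sigma2 * np.eye(n), S12],
                     [S21, S22]])
print(np.allclose(cov_Z1Y2, expected))  # True
```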
My Thoughts
By the marginalization property of Gaussians, we can conclude $p_{\mathbf{Y}_1}(\mathbf{y}_1)=\mathcal{N}(\mathbf{y}_1; \mathbf{0}_n,\mathsf{\Sigma}_{11})$ and $p_{\mathbf{Y}_2}(\mathbf{y}_2)=\mathcal{N}(\mathbf{y}_2; \mathbf{0}_m,\mathsf{\Sigma}_{22})$. I know that the PDF of the sum of two independent random vectors is the convolution of their PDFs, and that the convolution of two Gaussians is again a Gaussian. This implies that $\mathbf{Z}_1$ is Gaussian. We can also obtain its mean by linearity of expectation
\begin{align*} \mathbb{E}(Z_{1,i}) &= \mathbb{E}(Y_{1,i}) + \mathbb{E}(\mathcal{E}_{i}) = 0 + 0 = 0, \end{align*}
and its covariance by bilinearity of covariance and independence as
\begin{align*} \text{cov}(Z_{1,i},Z_{1,j}) &= \text{cov}(Y_{1,i},Y_{1,j}) + \text{cov}(\mathcal{E}_{i},\mathcal{E}_{j}) + \text{cov}(Y_{1,i},\mathcal{E}_{j}) + \text{cov}(\mathcal{E}_{i},Y_{1,j}) \\ &= \text{cov}(Y_{1,i},Y_{1,j}) + \sigma_{\epsilon}^2\,\delta_{ij} + 0 + 0 \\ &= (\mathsf{\Sigma}_{11})_{ij} + \sigma_{\epsilon}^2\,\delta_{ij}, \end{align*}
where $\delta_{ij}$ is the Kronecker delta. So, we have $p_{\mathbf{Z}_1}(\mathbf{z}_1)=\mathcal{N}(\mathbf{z}_1; \mathbf{0}_n,\mathsf{\Sigma}_{11}+\sigma_{\epsilon}^2\mathsf{I}_n)$. But I don't know how to calculate $p_{\mathbf{Z}_{1}, \mathbf{Y}_{2}}(\mathbf{z}_{1}, \mathbf{y}_{2})$.
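The marginal result $p_{\mathbf{Z}_1}=\mathcal{N}(\mathbf{0}_n,\mathsf{\Sigma}_{11}+\sigma_{\epsilon}^2\mathsf{I}_n)$ can be checked by simulation. A minimal sketch, with arbitrary illustrative choices $n=3$, $\sigma_{\epsilon}^2=0.25$ and a randomly generated SPD $\mathsf{\Sigma}_{11}$ (none of these come from the question), compares the empirical covariance of sampled $\mathbf{Z}_1 = \mathbf{Y}_1 + \boldsymbol{\mathcal{E}}$ against the formula:

```python
import numpy as np

rng = np.random.default_rng(1)
n, sigma2, N = 3, 0.25, 500_000  # illustrative dimension, noise variance, sample count

B = rng.standard_normal((n, n))
S11 = B @ B.T + np.eye(n)  # a random SPD covariance for Y1

# Draw Y1 ~ N(0, S11) and E ~ N(0, sigma2 * I) independently.
Y1 = rng.multivariate_normal(np.zeros(n), S11, size=N)
E = np.sqrt(sigma2) * rng.standard_normal((N, n))
Z1 = Y1 + E

# Empirical covariance should be close to S11 + sigma2 * I (up to sampling error).
emp = np.cov(Z1, rowvar=False)
print(np.abs(emp - (S11 + sigma2 * np.eye(n))).max())  # small sampling error
```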
Suppose $X:\Omega \to \mathbb{R}^n$ is a zero-mean Gaussian vector with covariance matrix $\Sigma$, and $\varepsilon:\Omega \to \mathbb{R}^m$, $m<n$, is a zero-mean Gaussian vector with covariance matrix $\Gamma$, independent of $X$. Then $Y=X+(\varepsilon,0_{n-m})$ is normally distributed with covariance matrix
$$\Sigma^*=\Sigma+\begin{bmatrix}\Gamma&0_{m\times (n-m)}\\ 0_{(n-m)\times m}&0_{(n-m)\times (n-m)}\end{bmatrix}=\Sigma +\Gamma_0.$$
To see this, let $\xi \in \mathbb{R}^n$ and calculate the moment generating function of $Y$ as follows
$$\begin{aligned} M_{Y}(\xi)=E[e^{\xi^{\top}Y}]&=E[e^{\xi^{\top}X+\xi^{\top}(\varepsilon,0_{n-m})}]\\ &=E[e^{\xi^{\top} X}]\,E[e^{\xi_1\varepsilon_1 +\dots+\xi_m \varepsilon_m}]\\ &=E[e^{\xi^{\top} X}]\,E[e^{(\xi_1,\dots,\xi_m)^{\top} \varepsilon}]\\ &=e^{\xi^{\top}\Sigma \xi/2}e^{(\xi_1,\dots,\xi_m)^{\top}\Gamma (\xi_1,\dots,\xi_m)/2}\\ &=e^{\xi^{\top}\Sigma \xi/2}e^{\xi^{\top} \Gamma_0 \xi/2}\\ &=e^{\xi^{\top}(\Sigma + \Gamma_0)\xi/2}\\ &=e^{\xi^{\top}\Sigma^*\xi/2},\end{aligned}$$
where we used the standard formula $M_{X}(\xi)=e^{\xi^{\top}\boldsymbol{\mu}+\xi^{\top}\Sigma\xi/2}$ for the moment generating function of a multivariate Gaussian, here with $\boldsymbol{\mu}=\mathbf{0}$. Since the moment generating function determines the distribution, $Y$ is indeed Gaussian with covariance $\Sigma^*$. Applying this lemma with $X=(\mathbf{Y}_1,\mathbf{Y}_2)\in\mathbb{R}^{n+m}$, $\varepsilon=\boldsymbol{\mathcal{E}}\in\mathbb{R}^{n}$, and $\Gamma=\sigma_{\epsilon}^2\mathsf{I}_n$ adds $\sigma_{\epsilon}^2\mathsf{I}_n$ to the top-left block $\mathsf{\Sigma}_{11}$ and leaves all other blocks unchanged, which is exactly the claimed joint PDF $p_{\mathbf{Z}_{1}, \mathbf{Y}_{2}}$.
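The lemma's covariance identity $\Sigma^* = \Sigma + \Gamma_0$ can also be verified directly with linear algebra: writing $Y = [\,I_n \mid P\,](X,\varepsilon)$, where $P$ embeds $\mathbb{R}^m$ into the first $m$ coordinates of $\mathbb{R}^n$, gives $\mathrm{cov}(Y) = \Sigma + P\Gamma P^{\top}$. A sketch with arbitrary sizes $n=4$, $m=2$:

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 4, 2  # illustrative dimensions with m < n

# Random SPD covariances for X and eps.
A = rng.standard_normal((n, n))
Sigma = A @ A.T + np.eye(n)
B = rng.standard_normal((m, m))
Gamma = B @ B.T + np.eye(m)

# Y = [I_n | P] @ (X, eps), with P padding eps into the first m coordinates.
P = np.vstack([np.eye(m), np.zeros((n - m, m))])
T = np.hstack([np.eye(n), P])
cov_joint = np.zeros((n + m, n + m))      # (X, eps) are independent,
cov_joint[:n, :n] = Sigma                 # so their joint covariance
cov_joint[n:, n:] = Gamma                 # is block-diagonal
cov_Y = T @ cov_joint @ T.T

# Gamma0 is Gamma padded with zeros to an n x n matrix.
Gamma0 = np.zeros((n, n))
Gamma0[:m, :m] = Gamma
print(np.allclose(cov_Y, Sigma + Gamma0))  # True
```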