Let $(\Omega,\Sigma,P)$ be a probability space and $m,n\in\mathbb{N}$. Let $X\colon (\Omega,\Sigma)\rightarrow (\mathbb{R}^m,\mathcal{B}(\mathbb{R}^m))$ be a random variable, where, as usual, $\mathcal{B}(\mathbb{R}^m)$ denotes the Borel $\sigma$-algebra. Assume that $X$ is either the zero map or a multivariate Gaussian random variable with mean vector $\mu\in \mathbb{R}^m$ and covariance matrix $K_{XX}\in \mathbb{R}^{m\times m}$. Next, let $f\colon \mathbb{R}^m\rightarrow \mathbb{R}^n$ be an $\mathbb{R}$-linear map.
Now, the composition $f\circ X$ is clearly again measurable, since $f$ is continuous and hence Borel measurable. What can be said about the distribution of the composition $f\circ X$? Is it again either the zero map or Gaussian? If so, what are its mean and covariance (in case it is Gaussian)?
As a side remark: in a paper I have read, the composition $f\circ X$ is called the pushforward of a random variable along a linear map. Is this standard terminology?
Let $n$ be a positive integer. If $Z=(Z_1,\dots, Z_n)$ is a random vector whose components have finite second moments, then its mean vector and covariance matrix are defined by: \begin{align} E[Z] &= (E[Z_1], \dots, E[Z_n])\\ K_Z &= E[(Z-E[Z])(Z-E[Z])^{\top}] \end{align}
Let $k$ be another positive integer. If $M$ is any (constant) $k \times n$ matrix and $c$ is any (constant) vector of size $k$, then the random vector $W=MZ+c$ has components with finite second moments and has mean and covariance matrix: \begin{align} E[W] &= ME[Z]+c\\ K_W &= MK_ZM^{\top} \end{align} which can be proven directly from the definitions.
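These two formulas are easy to check numerically. Below is a quick Monte Carlo sanity check (a sketch only; all the concrete matrices and vectors are made up for illustration):

```python
import numpy as np

# Monte Carlo check of E[W] = M E[Z] + c and K_W = M K_Z M^T for W = MZ + c.
# All concrete numbers here are illustrative, not from the post.
rng = np.random.default_rng(0)

mu = np.array([1.0, -2.0, 0.5])            # E[Z]
L = np.array([[1.0, 0.0, 0.0],
              [0.5, 1.0, 0.0],
              [0.2, 0.3, 1.0]])
K_Z = L @ L.T                               # a valid (positive definite) covariance

M = np.array([[2.0, -1.0, 0.0],
              [0.0,  1.0, 3.0]])            # a 2 x 3 matrix
c = np.array([10.0, -5.0])

Z = rng.multivariate_normal(mu, K_Z, size=1_000_000)  # rows are samples of Z
W = Z @ M.T + c                                       # W = MZ + c, vectorized

print(np.allclose(W.mean(axis=0), M @ mu + c, atol=0.1))   # E[W] = M E[Z] + c
print(np.allclose(np.cov(W.T), M @ K_Z @ M.T, atol=0.1))   # K_W = M K_Z M^T
```

The sample mean and sample covariance of $W$ agree with $ME[Z]+c$ and $MK_ZM^{\top}$ up to Monte Carlo error.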
The transformation $Z\mapsto MZ$ is called a linear transformation or linear function of $Z$. The transformation $Z\mapsto MZ+c$ is called an affine transformation or affine function of $Z$.
A random vector $Z$ is defined to be jointly Gaussian if and only if it has the form $Z=AG+b$ for some rectangular matrix $A$, some constant vector $b$, and some random vector $G=(G_1,\dots, G_m)$ with i.i.d. $N(0,1)$ components (for some positive integer $m$). Then $E[Z]=b$ and $K_Z=AA^{\top}$, and if $K_Z$ is invertible then $Z$ has the standard jointly Gaussian PDF. Conversely, any random vector $Z$ that has the standard jointly Gaussian PDF can be put into the standard form $Z=AG+b$ and hence is jointly Gaussian.
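One concrete way to realize this definition in the forward direction (a sketch; the particular $K$ and $b$ below are made up): given a target covariance $K$, a Cholesky factor $A$ with $K=AA^{\top}$ lets you build $Z=AG+b$ from i.i.d. $N(0,1)$ components.

```python
import numpy as np

# Build Z = AG + b with a prescribed covariance K via the Cholesky factor A.
# K and b are illustrative values, not from the post.
K = np.array([[4.0, 2.0],
              [2.0, 3.0]])
b = np.array([1.0, -1.0])
A = np.linalg.cholesky(K)        # lower triangular, satisfies A @ A.T == K

rng = np.random.default_rng(1)
G = rng.standard_normal((2, 500_000))   # i.i.d. N(0,1); columns are samples
Z = A @ G + b[:, None]                  # jointly Gaussian by the definition above

print(np.allclose(A @ A.T, K))               # K_Z = A A^T holds exactly
print(np.allclose(np.cov(Z), K, atol=0.05))  # sample covariance agrees
```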
Note that the always-zero vector counts as "jointly Gaussian" under this definition because $0=0G+0$. A more stringent definition of "jointly Gaussian" requires the covariance matrix to be invertible (so the joint PDF can be defined).
By the basic definition (which does not require an invertible covariance matrix), if $Z$ is jointly Gaussian then any affine transformation of $Z$ is also jointly Gaussian because: $$ MZ+c=M(AG+b)+c=(MA)G+(Mb+c)$$ where $MA$ is yet another rectangular matrix and $Mb+c$ is another constant vector.
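The closure argument above is pure matrix algebra; here is a minimal numerical check of the identity $M(AG+b)+c=(MA)G+(Mb+c)$ on arbitrary (made-up) data:

```python
import numpy as np

# Check M(AG + b) + c == (MA)G + (Mb + c) for random matrices and vectors.
rng = np.random.default_rng(2)
A = rng.standard_normal((3, 4))
b = rng.standard_normal(3)
M = rng.standard_normal((2, 3))
c = rng.standard_normal(2)
G = rng.standard_normal(4)

lhs = M @ (A @ G + b) + c
rhs = (M @ A) @ G + (M @ b + c)
print(np.allclose(lhs, rhs))   # the two expressions agree
```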
If you assume $Z$ has an invertible covariance matrix and you require $MZ+c$ to have an invertible covariance matrix, then you will want $M$ to have full row rank (so that $MK_ZM^{\top}$ is invertible).
For example, a corner case is when $Z$ is a $N(0,1)$ random variable and $M$ is the $2 \times 1$ matrix $M=[1;1]$. Then $MZ=[Z;Z]$ is not the all-zero vector, but it also does not have an invertible covariance matrix, because the $2 \times 2$ matrix $MM^{\top}$ (the all-ones matrix) is singular. So $MZ$ is jointly Gaussian under the basic definition, but does not satisfy the more stringent condition of having an invertible covariance matrix.
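The corner case, concretely (a small sketch):

```python
import numpy as np

# With M = [1; 1], the covariance of MZ = [Z; Z] is M M^T, the all-ones
# 2 x 2 matrix, which has rank 1 and is therefore singular.
M = np.array([[1.0],
              [1.0]])
K = M @ M.T                          # covariance of [Z; Z] when Z ~ N(0,1)
print(K)                             # the all-ones 2 x 2 matrix
print(np.linalg.matrix_rank(K))      # 1
print(np.linalg.det(K))              # 0, up to floating point
```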