Let $X_k = (X_{k, t})_{t\geq 0}$ be a Gaussian process for each $k\in [m]$, all defined on the same probability space, and let $a\in\mathbf{R}^m$. Is $Y = (Y_t)_{t\geq 0} = \left(\sum_{k = 1}^m a_k X_{k,t}\right)_{t\geq 0}$ still a Gaussian process? I am supposed to use the fact that a linear transformation of a Gaussian vector is still Gaussian.
Let $0\leq t_1 < \dots < t_n$ be real numbers. I know $(X_{k,t_l})_{l\leq n}$ is a Gaussian vector for each $k\in [m]$, and I need to show $(Y_{t_l})_{l\leq n}$ is Gaussian. I tried to write it as a linear transformation of a Gaussian vector, but I'm not sure how, since I have $m$ Gaussian vectors rather than just one.
I also tried to prove it directly from the definition: take $c\in \mathbf{R}^n$ and show that the linear combination $\sum_{l = 1}^n c_l Y_{t_l}$ is Gaussian. But I'm not sure all of the $X_{k, t_l}$'s are jointly Gaussian; I only know this for those sharing the same $k$.
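For reference, here is a sketch of the stacking idea I was aiming at, which works if one *additionally* assumes that all the $X_k$'s are jointly Gaussian (for instance, independent). Stack the $m$ vectors into one vector of length $mn$:

$$Z = (X_{1,t_1},\dots,X_{1,t_n},\,X_{2,t_1},\dots,X_{2,t_n},\dots,X_{m,t_n}) \in \mathbf{R}^{mn},$$

so that

$$(Y_{t_1},\dots,Y_{t_n}) = AZ, \qquad A = \begin{pmatrix} a_1 I_n & a_2 I_n & \cdots & a_m I_n \end{pmatrix} \in \mathbf{R}^{n\times mn},$$

where $I_n$ is the $n\times n$ identity. If $Z$ is a Gaussian vector, then $AZ$ is Gaussian as a linear transformation of it; without joint Gaussianity of $Z$, however, this step is not available, which is exactly my concern above.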
This is Lemma 2.23 of Arguin's *A First Course in Stochastic Calculus*.
It is easier to first treat two Gaussian processes, namely $X_1$ and $X_2$, so the resulting process is $Y = X_1 + X_2$ (the coefficients $a_k$ can be dropped, since multiplying a Gaussian process by a scalar keeps it Gaussian). At a fixed time, the PDF of $Y$ is the convolution $$ f_Y(y) = \int \limits_{-\infty}^\infty f_{X_1,X_2}(y-x_2,\,x_2) \, \mathrm{d}x_2.$$ Assuming the joint PDF of $(X_1, X_2)$ is jointly normal (note this is an extra assumption, which holds e.g. when the processes are independent; as the question points out, it is not automatic), one can carry out the integral and find that the resulting PDF is Gaussian. The calculation is somewhat lengthy, so it is not shown here; a characteristic function computation may be easier.
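The characteristic-function route mentioned above can be sketched as follows, again assuming $(X_{1,t}, X_{2,t})$ is jointly Gaussian at a fixed time $t$, with means $\mu_k = \mathbf{E}[X_{k,t}]$. The joint characteristic function of a Gaussian vector evaluated along the direction $(\theta,\theta)$ gives

$$\mathbf{E}\left[e^{i\theta Y_t}\right] = \mathbf{E}\left[e^{i\theta (X_{1,t}+X_{2,t})}\right] = \exp\left(i\theta(\mu_1+\mu_2) - \tfrac{1}{2}\theta^2\sigma^2\right),$$

where $\sigma^2 = \operatorname{Var}(X_{1,t}) + 2\operatorname{Cov}(X_{1,t},X_{2,t}) + \operatorname{Var}(X_{2,t})$. This is exactly the characteristic function of a $\mathcal{N}(\mu_1+\mu_2, \sigma^2)$ random variable, with no convolution integral needed.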
Next, one can proceed by induction on the number of processes $m$.
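As a quick numerical sanity check (not a proof), here is a sketch in Python using two *independent* Brownian motions, so that joint Gaussianity holds; the coefficients `a`, the time grid `t`, and the sample size are arbitrary choices. For independent processes, the covariance of $Y$ should be $\left(\sum_k a_k^2\right)\min(s,t)$.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.array([0.5, 1.0, 2.0])        # fixed times t_1 < t_2 < t_3
K = np.minimum.outer(t, t)           # Brownian covariance: Cov(B_s, B_t) = min(s, t)
m = 2
a = np.array([2.0, -1.0])            # arbitrary coefficients a_k

# Draw N independent samples of m independent Brownian motions at the fixed
# times, then form Y = a_1 X_1 + a_2 X_2 sample by sample.
N = 200_000
X = rng.multivariate_normal(np.zeros(len(t)), K, size=(N, m))  # shape (N, m, 3)
Y = np.einsum('k,nkt->nt', a, X)                               # shape (N, 3)

# For independent processes, Cov(Y_s, Y_t) = (a_1^2 + a_2^2) * min(s, t).
cov_emp = np.cov(Y.T)
cov_theory = (a @ a) * K
print(np.max(np.abs(cov_emp - cov_theory)))  # should be small (sampling error)
```

The empirical covariance matches the Gaussian prediction up to Monte Carlo error, consistent with $Y$ being a Gaussian process in the independent case.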