There are similar questions about the bivariate case, but I am interested in the general case where $n > 2$.
Let's say we have $X_1,\dots,X_n$ iid $N(0,1)$, and for $j=1,\dots,n$:
$$\sum_{i=1}^j X_i = X_1+X_2+...+ X_j =Y_j $$
First I want to find the joint distribution of Y's.
I understand that in general $Y_j \sim N(\mu_1 + \dots + \mu_j,\ \sigma^2_1 + \dots + \sigma^2_j)$ since the $X_i$'s are independent, so here $Y_j \sim N(0, j)$.
My idea is that the joint distribution is $(Y_1,\dots,Y_n) \sim MVN(\mathbf{0},\Sigma)$ with $\Sigma_{ij}=\operatorname{cov}(Y_i,Y_j)=\min(i,j)$, which for the example of $n=3$ gives:
$$ \Sigma= \begin{bmatrix} \operatorname{var}(Y_1) & \operatorname{cov}(Y_1,Y_2) & \operatorname{cov}(Y_1,Y_3) \\ \operatorname{cov}(Y_2,Y_1) & \operatorname{var}(Y_2) & \operatorname{cov}(Y_2,Y_3) \\ \operatorname{cov}(Y_3,Y_1) & \operatorname{cov}(Y_3,Y_2) & \operatorname{var}(Y_3) \\ \end{bmatrix} = \begin{bmatrix} 1 & 1 & 1 \\ 1 & 2 & 2 \\ 1 & 2 & 3 \\ \end{bmatrix} $$
i.e. $\operatorname{cov}(Y_j,Y_{j+1})=\operatorname{cov}(X_1+\dots+X_j,\ X_1+\dots+X_j+X_{j+1})=\operatorname{var}(X_1)+\dots+\operatorname{var}(X_j)=j$
because $\operatorname{cov}(X_s,X_t)=0$ for $s \neq t$ by independence.
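To convince myself of this covariance structure, I checked it by simulation (a quick sketch with numpy; the replication count and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3

# Many replications of X_1, ..., X_n ~ iid N(0, 1), one row per replication
X = rng.standard_normal((200_000, n))
Y = np.cumsum(X, axis=1)  # Y_j = X_1 + ... + X_j

# Empirical covariance matrix of (Y_1, ..., Y_n)
Sigma_hat = np.cov(Y, rowvar=False)

# Conjectured covariance matrix: Sigma_{ij} = min(i, j) (1-based indices)
idx = np.arange(1, n + 1)
Sigma = np.minimum.outer(idx, idx)

print(np.round(Sigma_hat, 2))
print(Sigma)
```

The empirical matrix lands close to $\min(i,j)$, which matches the $n=3$ matrix above.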
If I apply the formulas I've seen, then the conditional distribution of $Y_{j+1}$ given $Y_j = y_j$ would have mean
$ \mu=\mu_{j+1}+\Sigma_{j+1,j}\Sigma^{-1}_{jj}(y_j-\mu_j)$
and variance $ \sigma^2=\Sigma_{j+1,j+1}-\Sigma_{j+1,j}\Sigma_{jj}^{-1}\Sigma_{j,j+1}$.
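As a numerical sanity check of those conditioning formulas, here is a small sketch for $n=3$, conditioning $Y_{j+1}$ on $Y_j$ with $j=2$ (the choice of $j$ and the observed value $y$ are arbitrary, just for illustration):

```python
import numpy as np

n, j = 3, 2  # condition Y_{j+1} on Y_j, here Y_3 on Y_2

# Covariance matrix Sigma_{ab} = min(a, b) for (Y_1, ..., Y_n)
idx = np.arange(1, n + 1)
Sigma = np.minimum.outer(idx, idx).astype(float)

S_aa = Sigma[j, j]          # var(Y_{j+1})   (0-based: row j is Y_{j+1})
S_ab = Sigma[j, j - 1]      # cov(Y_{j+1}, Y_j)
S_bb = Sigma[j - 1, j - 1]  # var(Y_j)

y = 1.7  # an arbitrary observed value of Y_j; all means are 0 here

cond_mean = 0 + S_ab / S_bb * (y - 0)   # mu_{j+1} + Sigma_{j+1,j} Sigma_{jj}^{-1} (y - mu_j)
cond_var = S_aa - S_ab**2 / S_bb        # Sigma_{j+1,j+1} - Sigma_{j+1,j} Sigma_{jj}^{-1} Sigma_{j,j+1}

print(cond_mean, cond_var)
```

Plugging in the $\min(i,j)$ entries this way gives me concrete numbers, but I can't see the pattern symbolically.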
However, I'm having a hard time figuring out what that would look like all spelled out for this special case where $Y_j=\sum_{i=1}^j X_i $. Any pointers would be appreciated, thanks!