I need help with this question; it's been bugging me for a while and I can't figure it out.
Let $A, B \sim N(0,1)$ be independent random variables.
The solution asserts that $X(s)$ and $X(t)$ are jointly Gaussian because they are a linear transformation of the Gaussian random vector $(A,B)^T$. Why does being jointly Gaussian imply that $X(t) \mid X(s) \sim N$?
Moreover, I can't figure out how they computed the variance of $X(t) \mid X(s)$:
$$\sigma^2 = \operatorname{Var}(X(t)) - \frac{\operatorname{Cov}(X(t),X(s))^2}{\operatorname{Var}(X(s))} = \mathbb{E}[X^2(t)] - \frac{\big(\mathbb{E}[X(t)X(s)]\big)^2}{\mathbb{E}[X^2(s)]}$$ What formula is used here? From that point on I can complete the calculation, but I don't understand its starting point, or the reasoning behind the conditional distribution being Gaussian. Any help is massively appreciated!
If $X$ and $Y$ are bivariate normal with mean zero, variances $\sigma^2_X$, $\sigma^2_Y$, and covariance $\sigma_{XY}$, then they can be written as linear combinations of standard normal random variables $U$ and $V$ like so: \begin{align} X &= \frac{\sigma_{XY}}{\sigma_Y} U + \sqrt{\sigma_X^2 - \frac{\sigma_{XY}^2}{\sigma_Y^2}} V \\ Y &= \sigma_Y U \end{align} with the coefficients chosen to produce the correct variances and covariance.
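A quick numerical check of this construction (the specific values of $\sigma_X^2$, $\sigma_Y^2$, $\sigma_{XY}$ below are arbitrary example choices, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)

# Example target moments (arbitrary, for illustration only).
var_x, var_y, cov_xy = 2.0, 1.5, 0.8
sigma_y = np.sqrt(var_y)

n = 1_000_000
U = rng.standard_normal(n)
V = rng.standard_normal(n)

# The linear-combination construction above:
X = (cov_xy / sigma_y) * U + np.sqrt(var_x - cov_xy**2 / var_y) * V
Y = sigma_y * U

# Sample moments should be close to the targets.
print(np.var(X), np.var(Y), np.cov(X, Y)[0, 1])
```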
From this transformed representation, the independence of $U$ and $V$ makes the conditional distribution clear: conditioning on $Y$ is equivalent to conditioning on $U$ (since $U = Y/\sigma_Y$), which fixes the first term of $X$ and leaves the independent $V$ term untouched. Hence \begin{align} X|Y &\sim N\left(\frac{\sigma_{XY}}{\sigma_Y}U,\sigma_X^2-\frac{\sigma_{XY}^2}{\sigma_Y^2}\right) \\ &=N\left(\frac{\sigma_{XY}}{\sigma_Y^2} Y,\sigma_X^2-\frac{\sigma_{XY}^2}{\sigma_Y^2}\right) \end{align}
If you take $X=X(t)$ and $Y=X(s)$, the variance expression here matches your expression for the variance $\sigma^2$.
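If it helps to see this empirically, here is a short simulation (again with arbitrary example moments). For jointly Gaussian $(X, Y)$, the residual $X - \frac{\sigma_{XY}}{\sigma_Y^2}Y$ is independent of $Y$, and its variance is exactly the conditional variance $\sigma_X^2 - \sigma_{XY}^2/\sigma_Y^2$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Example moments (arbitrary, for illustration only).
var_x, var_y, cov_xy = 2.0, 1.5, 0.8
cov = np.array([[var_x, cov_xy],
                [cov_xy, var_y]])
X, Y = rng.multivariate_normal([0.0, 0.0], cov, size=1_000_000).T

beta = cov_xy / var_y        # slope of the conditional mean E[X|Y] = beta * Y
resid = X - beta * Y         # independent of Y in the jointly Gaussian case

# Sample variance of the residual vs. the conditional-variance formula:
print(np.var(resid), var_x - cov_xy**2 / var_y)
```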