Let:
- $X_0$ constant
- $(U_t)_t$ mutually independent and Gaussian, with $U_t \sim \mathcal{N}(0, \sigma^2)$
- $\forall t, \; X_{t+1} = X_t + U_{t+1}$
- $\overline{X_{T-1}} = \dfrac{1}{T} \sum_{t=0}^{T-1} X_t$
We are looking for:
- $X_T^{\star} = E(X_T \mid \overline{X_{T-1}})$
- its law
- $V(X_T \mid \overline{X_{T-1}})$
My attempt:
- $T \overline{X_{T-1}} = \sum_{i=0}^{T-1} X_i = T X_0 + \sum_{j=1}^{T-1} (T-j) U_j$
- $V\left( T \overline{X_{T-1}} \right) = \sum_{j=1}^{T-1} (T-j)^2 \sigma^2 = \dfrac{(T-1)T(2T-1)}{6} \sigma^2$
- $V\left( \overline{X_{T-1}} \right) = \dfrac{(T-1)(2T-1)}{6T} \sigma^2$
- since $\overline{X_{T-1}}$ is $X_0$ plus a linear combination of the $U_j$, $\overline{X_{T-1}} \sim \mathcal{N}\left( X_0 ,\ \dfrac{(T-1)(2T-1)}{6T} \sigma^2 \right)$
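The variance of $\overline{X_{T-1}}$ can be sanity-checked numerically. A minimal Monte Carlo sketch in Python, with illustrative values $T=10$, $\sigma=1$, $X_0=0$ and a sample size of my own choosing (none of these come from the question):

```python
import numpy as np

# Monte Carlo check of Var(Xbar_{T-1}) = (T-1)(2T-1)/(6T) * sigma^2.
# T, sigma, X0 and n are illustrative choices, not part of the question.
rng = np.random.default_rng(0)
T, sigma, X0, n = 10, 1.0, 0.0, 200_000

U = rng.normal(0.0, sigma, size=(n, T))           # increments U_1 .. U_T
X = X0 + np.cumsum(U, axis=1)                     # X_1 .. X_T
# columns X_0, X_1, .., X_{T-1}:
paths = np.concatenate([np.full((n, 1), X0), X[:, :T - 1]], axis=1)
xbar = paths.mean(axis=1)                         # Xbar_{T-1}

theory = (T - 1) * (2 * T - 1) / (6 * T) * sigma**2
print(xbar.var(), theory)                         # the two should be close
```

With $T=10$ the formula gives $171/60 = 2.85$, and the empirical variance should land within a fraction of a percent of it.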
The vector $U = (U_1, \dots, U_T)$ is Gaussian (the $U_i$ are i.i.d. normal variables). Therefore the vector $(X_T, \bar{X}_{T-1})$ is also Gaussian, since it is an affine transformation of $U$. Thus, your question is just a matter of conditional distributions in a Gaussian vector (e.g. look at "conditional distributions" in https://en.wikipedia.org/wiki/Multivariate_normal_distribution).
Writing $X_T = X_0 + \sum_{t=1}^{T} U_t$ and $\bar{X}_{T-1} = X_0 + \frac{1}{T}\sum_{t=1}^{T-1} (T-t) U_t$, let $\Sigma$ be the covariance matrix of the vector $(X_T, \bar{X}_{T-1})$. Then, we have $$ \Sigma_{11} = Var(X_T) = T \sigma^2, \quad \Sigma_{12} = cov(X_T, \bar{X}_{T-1}) = \frac{\sigma^2}{T}\sum_{t=1}^{T-1} (T-t) = \frac{T-1}{2} \sigma^2, \quad \Sigma_{22} = Var(\bar{X}_{T-1}) = \frac{\sigma^2}{T^2}\sum_{t=1}^{T-1} (T-t)^2 = \frac{(T-1)(2T-1)}{6T} \sigma^2 $$ and $$ \mu_1 = \mathbb{E}[X_T] = X_0, \quad \mu_2 = \mathbb{E}[\bar{X}_{T-1}] = X_0 $$
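The two sum evaluations behind $\Sigma_{12}$ and $\Sigma_{22}$ can also be checked symbolically. A small sketch using sympy (the variable names `s12`, `s22` are mine; they are the covariance entries divided by $\sigma^2$):

```python
from sympy import symbols, summation, simplify

T, t = symbols('T t', positive=True, integer=True)

# cov(X_T, Xbar_{T-1}) / sigma^2 = (1/T) * sum_{t=1}^{T-1} (T - t)
s12 = summation(T - t, (t, 1, T - 1)) / T
# Var(Xbar_{T-1}) / sigma^2 = (1/T^2) * sum_{t=1}^{T-1} (T - t)^2
s22 = summation((T - t)**2, (t, 1, T - 1)) / T**2

# both differences should simplify to 0 if the closed forms are right
print(simplify(s12 - (T - 1) / 2))
print(simplify(s22 - (T - 1) * (2 * T - 1) / (6 * T)))
```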
Therefore, using the conditional distribution formula, we obtain that $X_T|\bar{X}_{T-1}$ is normally distributed with $$ \mathbb{E}[X_T|\bar{X}_{T-1}] = \mu_1 + \frac{\Sigma_{12}}{\Sigma_{22}}(\bar{X}_{T-1} - \mu_2) = X_0 + \frac{3T}{2T-1} (\bar{X}_{T-1} - X_0), \quad Var[X_T|\bar{X}_{T-1}] = \Sigma_{11} - \frac{\Sigma_{12}^2}{\Sigma_{22}} = \left(T - \frac{3T(T-1)}{2(2T -1)}\right) \sigma^2 = \frac{T(T+1)}{2(2T-1)} \sigma^2 $$
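Since the pair is jointly Gaussian, the conditional mean is the least-squares line of $X_T$ on $\bar{X}_{T-1}$ and the conditional variance is the residual variance, so both formulas can be checked by simulation. A sketch with illustrative values $T=10$, $\sigma=1$, $X_0=2$ (my choices, not from the question):

```python
import numpy as np

# Monte Carlo check of:
#   E[X_T | Xbar_{T-1}]  = X0 + 3T/(2T-1) * (Xbar_{T-1} - X0)
#   Var(X_T | Xbar_{T-1}) = T(T+1)/(2(2T-1)) * sigma^2
# T, sigma, X0 and n are illustrative choices.
rng = np.random.default_rng(1)
T, sigma, X0, n = 10, 1.0, 2.0, 200_000

U = rng.normal(0.0, sigma, size=(n, T))
X = X0 + np.cumsum(U, axis=1)                     # X_1 .. X_T
paths = np.concatenate([np.full((n, 1), X0), X[:, :T - 1]], axis=1)
xbar = paths.mean(axis=1)                         # Xbar_{T-1}
xT = X[:, -1]                                     # X_T

# Regression slope of X_T on Xbar_{T-1}, and the residual variance
slope = np.cov(xT, xbar)[0, 1] / xbar.var()
resid_var = (xT - X0 - slope * (xbar - X0)).var()

print(slope, 3 * T / (2 * T - 1))                       # pairs should agree
print(resid_var, T * (T + 1) / (2 * (2 * T - 1)) * sigma**2)
```

With $T=10$ the predicted slope is $30/19$ and the predicted conditional variance is $110/38 \, \sigma^2$; note in particular that the conditional variance is $O(T)$, not $O(T^2)$.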
NB: Please re-check the calculations as I have gone through them pretty quickly.