Let $Y_1, Y_2,\ldots, Y_n$ be independent $N(0,1)$ random variables. Define $ X_i =\sum_{j=1}^n c_{i,j}Y_j$, $i=1,2,\ldots,n$, where $c_{i,j}$ are real constants. Show that $E(X_i\mid X_k)=\dfrac{\sum_{j=1}^n c_{i,j}c_{k,j}}{\sum_{j=1}^nc_{k,j}^2}X_k$. What is $\operatorname{Var}(X_i\mid X_k)$?
I can only think of $X$ as a linear transformation of $Y$, which gives me the result that $X$ is Gaussian with distribution $N(0, CC')$. But I have no idea how to get the conditional expectation and variance.
Conditional expectations in jointly Gaussian families are always linear (which partly explains the ubiquity of the Gaussian distribution in modeling). Concretely, one can write $$X_i=\alpha_{i,k}X_k+\beta_{i,k}Z_{i,k}+\gamma_{i,k}, $$ for some deterministic coefficients $\alpha_{i,k}$, $\beta_{i,k}$ and $\gamma_{i,k}$, and a standard normal random variable $Z_{i,k}$ independent of $X_k$. Such a representation exists because $(X_i,X_k)$ is jointly Gaussian: the residual of the linear regression of $X_i$ on $X_k$ is uncorrelated with $X_k$, hence independent of it. The representation implies that $$ E(X_i\mid X_k)=\alpha_{i,k}X_k+\gamma_{i,k}, $$ and $$ \mathrm{var}(X_i\mid X_k)=\beta_{i,k}^2. $$

To identify these coefficients, note that, for every $i\ne k$, $$ E(X_i)=E(X_k)=0,\qquad\mathrm{var}(X_i)=\sum_jc_{i,j}^2, $$ and $$ \mathrm{Cov}(X_i,X_k)=\sum_jc_{i,j}c_{k,j}, $$ on the one hand, while the representation above gives $$ E(X_i)=\gamma_{i,k},\qquad\mathrm{var}(X_i)=\alpha^2_{i,k}\mathrm{var}(X_k)+\beta^2_{i,k}, $$ and $$ \mathrm{Cov}(X_i,X_k)=\alpha_{i,k}\mathrm{var}(X_k), $$ on the other hand. Hence $$ \gamma_{i,k}=0,\qquad\alpha_{i,k}=\frac{\sum\limits_jc_{i,j}c_{k,j}}{\sum\limits_jc_{k,j}^2}, $$ and $$ \beta^2_{i,k}=\sum_jc_{i,j}^2-\frac{\left(\sum\limits_jc_{i,j}c_{k,j}\right)^2}{\sum\limits_jc_{k,j}^2}. $$

In terms of the canonical scalar product $\langle\ ,\ \rangle$ and the vectors $c_i=(c_{i,j})_j$, this can be rewritten as $$ \alpha_{i,k}=\frac{\langle c_i,c_k\rangle}{\langle c_k,c_k\rangle}, \qquad \beta^2_{i,k}=\langle c_i,c_i\rangle-\frac{\langle c_i,c_k\rangle^2}{\langle c_k,c_k\rangle}. $$
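If it helps to see the formulas in action, here is a quick Monte Carlo sanity check (a sketch using NumPy; the matrix `C`, the sample size, and the indices `i`, `k` are arbitrary choices of mine, not part of the problem). It compares the predicted slope $\alpha_{i,k}$ with the empirical regression slope of $X_i$ on $X_k$, and the predicted conditional variance $\beta_{i,k}^2$ with the variance of the regression residual:

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary coefficients c_{i,j} and i.i.d. N(0,1) samples of Y_1,...,Y_n
n = 4
C = rng.normal(size=(n, n))
Y = rng.normal(size=(n, 500_000))
X = C @ Y  # row i holds samples of X_i = sum_j c_{i,j} Y_j

i, k = 0, 2

# Coefficients predicted by the derivation above
alpha = C[i] @ C[k] / (C[k] @ C[k])
beta2 = C[i] @ C[i] - (C[i] @ C[k]) ** 2 / (C[k] @ C[k])

# Empirical regression slope of X_i on X_k (both have mean zero)
alpha_hat = X[i] @ X[k] / (X[k] @ X[k])

# Residual after removing the linear part: its variance should be beta2,
# and it should be (empirically) uncorrelated with X_k
resid = X[i] - alpha * X[k]

print(alpha, alpha_hat)                # close
print(beta2, resid.var())              # close
print(np.corrcoef(resid, X[k])[0, 1])  # near 0
```

The uncorrelatedness of the residual with $X_k$ is exactly the property that, for jointly Gaussian variables, upgrades to independence.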