Independence between sums of $Z_k$


Suppose you are faced with the following random variables $$ X_i = \sum_{k}\alpha_{i,k}Z_k + \epsilon_i,$$ where $Z_k,\epsilon_i$ are i.i.d. standard normal random variables and $\alpha_{i,k} > 0$ are non-null coefficients. You can think of the $Z_k$ as common risk factors and the $\epsilon_i$ as idiosyncratic risk factors driving the $X_i$. Assume you also know $$Y = \sum_k\beta_kZ_k,$$ where $\beta_k > 0$ are non-null coefficients.

We can then re-express each $X_i$ as a function of $Y$ and a "residual part" of the $Z_k$. Precisely, for a coefficient $\gamma$: \begin{eqnarray} X_i &=& \sum_{k}\alpha_{i,k}Z_k + \epsilon_i \nonumber \\ &=& \gamma Y - \gamma Y + \sum_{k}\alpha_{i,k}Z_k + \epsilon_i \nonumber \\ &=& \gamma Y-\gamma\sum_k\beta_kZ_k + \sum_{k}\alpha_{i,k}Z_k + \epsilon_i \nonumber \\ &=& \gamma Y + \sum_{k} \left(\alpha_{i,k} - \gamma\beta_k\right) Z_k + \epsilon_i \nonumber \end{eqnarray} This decomposition is used in research papers such as Pykhtin's. However, in the final expression $$ X_i = \gamma Y + \sum_{k} \left(\alpha_{i,k} - \gamma\beta_k\right) Z_k + \epsilon_i, $$ I struggle to understand why $Y$ is now independent of the "residual part" of the $Z_k$ (the second term, the sum over $k$), when $Y$ is itself a function of the $Z_k$. Or am I missing something?
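For concreteness, here is a small Monte Carlo sketch of the decomposition. The question leaves $\gamma$ unspecified, so this sketch *assumes* the standard regression choice $\gamma = \operatorname{Cov}(X_i,Y)/\operatorname{Var}(Y) = \sum_k \alpha_{i,k}\beta_k \big/ \sum_k \beta_k^2$ (the loadings below are made-up illustrative numbers). With that choice the residual sum is uncorrelated with $Y$ by construction, and since everything is jointly Gaussian, uncorrelated means independent:

```python
# Simulation of the decomposition X_i = gamma*Y + residual + eps_i.
# ASSUMPTION: gamma is the regression coefficient Cov(X_i, Y)/Var(Y)
#   = sum_k alpha_k*beta_k / sum_k beta_k**2,
# which is what makes the residual term uncorrelated with Y.
import numpy as np

rng = np.random.default_rng(0)
alpha = np.array([0.5, 0.3, 0.2])   # hypothetical loadings alpha_{i,k} for one fixed i
beta = np.array([0.6, 0.8, 0.4])    # hypothetical loadings beta_k

n = 1_000_000
Z = rng.standard_normal((n, 3))     # i.i.d. common factors Z_k

Y = Z @ beta                                  # Y = sum_k beta_k Z_k
gamma = alpha @ beta / (beta @ beta)          # regression coefficient
residual = Z @ (alpha - gamma * beta)         # sum_k (alpha_k - gamma*beta_k) Z_k

# Sample covariance between Y and the residual: approximately zero,
# matching the analytic value sum_k (alpha_k - gamma*beta_k) * beta_k = 0.
print(round(abs(float(np.cov(Y, residual)[0, 1])), 2))
```

For any other value of $\gamma$ the same script gives a visibly non-zero covariance, which suggests the independence claim in the papers hinges on this particular choice of $\gamma$.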

Any help or suggestion would be much appreciated.