Let $(Y_t,X_{1,t},\dots,X_{K,t})$, $t=1,2,\dots$, be a jointly stationary time series on the probability space $(\Omega,\mathcal A,P)$ taking values in $\mathbb R^{K+1}$, and assume that each component is square integrable for each $t$.
Question: Does there exist $\beta \in \mathbb R^K$ and a square integrable scalar time series $\varepsilon_t$ such that
$$Y_t=X_t\beta +\varepsilon_t \quad \text{and} \quad E[X_t \varepsilon_t]=0$$
for each $t$, where $X_t:=[X_{1,t},\dots,X_{K,t}]$ is a $1 \times K$ row vector?
It seems to me that the following argument works:
Define $\beta:=E[X'_t X_t]^+ E[X'_tY_t] \in \mathbb R^K$, where $A^+$ denotes the Moore-Penrose inverse of the matrix $A$. By joint stationarity, the second moments $E[X'_tX_t]$ and $E[X'_tY_t]$, and hence $\beta$, do not depend on $t$. Moreover, $E[X'_tY_t]$ lies in the column space of the Gram matrix $E[X'_tX_t]$: if $a\in\mathbb R^K$ satisfies $E[X'_tX_t]a=0$, then $E[(X_ta)^2]=a'E[X'_tX_t]a=0$, so $X_ta=0$ almost surely and $a'E[X'_tY_t]=E[(X_ta)Y_t]=0$. Since $x=A^+b$ solves $Ax=b$ whenever $b$ lies in the column space of $A$ (the least-squares property of the Moore-Penrose inverse, equivalently the Hilbert projection theorem in $\mathcal L^2$), we obtain
$$E[X'_tY_t]=E[X'_t X_t]\beta$$
for each $t$. Therefore, defining $\varepsilon_t:=Y_t-X_t\beta$, we get that $\varepsilon_t$ is square integrable (as a linear combination of square integrable variables) and $E[X_t\varepsilon_t]=0$ for each $t$.
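As a sanity check, here is a small numerical sketch of the construction (all names are illustrative; sample moments stand in for the population moments $E[X'_tX_t]$ and $E[X'_tY_t]$, and the regressor matrix is made rank-deficient on purpose so that the Moore-Penrose inverse, not an ordinary inverse, is exercised):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a jointly stationary (Y_t, X_t): i.i.d. Gaussian draws are
# trivially stationary. K = 3 regressors, with the third column a copy of
# the first, so the Gram matrix E[X'X] is singular.
T = 200_000
Z = rng.standard_normal((T, 2))
X = np.column_stack([Z[:, 0], Z[:, 1], Z[:, 0]])          # rank 2, K = 3
Y = 1.5 * X[:, 0] - 0.7 * X[:, 1] + rng.standard_normal(T)

# Sample analogues of E[X'_t X_t] and E[X'_t Y_t]
Sxx = X.T @ X / T
Sxy = X.T @ Y / T

# beta = E[X'X]^+ E[X'Y], via the Moore-Penrose inverse
beta = np.linalg.pinv(Sxx) @ Sxy
eps = Y - X @ beta

# Orthogonality E[X_t eps_t] = 0 holds exactly in the sample analogue,
# because X'Y lies in the column space of X'X (up to floating point).
print(np.abs(X.T @ eps / T).max())
```

The printed value is numerically zero, illustrating that the orthogonality condition follows from the construction alone, with no rank condition on $E[X'_tX_t]$.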
Is this correct? Thanks a lot for your help.