Recursive approach to multiple linear regression models


Let $$ Y_i = X_i \boldsymbol{\beta}_i + \epsilon_i , \qquad i = 1 , \dots , T, $$ be a finite collection of multiple regression models indexed by $i$. Here $X_i$ is the $m \times k_i$ design matrix, $\boldsymbol{\beta}_i = (\beta_{i,1} , \dots , \beta_{i,k_i})^T$, and $\epsilon_i \sim \mathcal{N}(\mathbf{0}, \sigma^2_i I_{m})$ is Gaussian noise, with $k_i$ strictly increasing in $i$. The models are nested: the first $k_{i-1}$ coefficients of $\boldsymbol{\beta}_i$ coincide with those of $\boldsymbol{\beta}_{i-1}$ (see the example below). Assume $\epsilon_i$ and $\epsilon_j$ are independent for $i \ne j$.

Denote by $C_i$ the known covariance matrix of $\epsilon_i$.

Example. Fix $T = 2$ and take $k_1 = 2$, $k_2 = 3$: $$ Y_1 = X_1 \begin{bmatrix} \beta_{11} \\ \beta_{12} \end{bmatrix} + \epsilon_1 \qquad (1)$$ $$ Y_2 = X_2 \begin{bmatrix} \beta_{11} \\ \beta_{12} \\ \beta_{13} \end{bmatrix} + \epsilon_2 \qquad (2) $$

A few ideas:

  • Of course, I can stack $X_1$ and $X_2$ together to form a "single" regression problem $Y = X \beta + \epsilon$ and use the generalized least-squares estimator $\widehat{\beta} = (X^T C^{-1} X)^{-1} X^T C^{-1} Y$, where $C$ is the block-diagonal stacked covariance matrix. I believe this is the best estimator I can get, but it is probably not insightful.

  • Another idea is to compute the estimate $\widehat{\beta}_1 = (X_1^T X_1)^{-1} X_1^T Y_1$ from (1) and then solve (2) as a restricted least-squares problem in which the first $k_1 = 2$ coefficients are fixed at the values estimated from (1). Is there any gain or loss in doing so?

  • Some kind of Bayesian technique I am not aware of...?
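To make the first bullet concrete, here is a minimal NumPy sketch of the stacked GLS estimator for the $T = 2$ example. All data, sizes (`m`, `k1`, `k2`), and noise scales are made up for illustration; the zero-padding puts the shared coefficients $(\beta_{11}, \beta_{12})$ in the first two columns of the stacked design matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
m, k1, k2 = 50, 2, 3                       # illustrative sizes
beta_true = np.array([1.0, -2.0, 0.5])     # (beta_11, beta_12, beta_13)
s1, s2 = 1.0, 3.0                          # different noise scales per model

X1 = rng.normal(size=(m, k1))
X2 = rng.normal(size=(m, k2))
Y1 = X1 @ beta_true[:k1] + s1 * rng.normal(size=m)
Y2 = X2 @ beta_true + s2 * rng.normal(size=m)

# Stack the two models; the shared coefficients occupy the first k1 columns,
# so X1 is padded with zeros for the coefficient beta_13 it does not use.
X = np.block([[X1, np.zeros((m, k2 - k1))],
              [X2]])
Y = np.concatenate([Y1, Y2])

# C = diag(s1^2 I_m, s2^2 I_m) is diagonal here, so C^{-1} is just a
# vector of inverse variances applied row-wise.
c_inv = np.concatenate([np.full(m, 1 / s1**2), np.full(m, 1 / s2**2)])
A = X.T * c_inv                            # X^T C^{-1}
beta_gls = np.linalg.solve(A @ X, A @ Y)   # (X^T C^{-1} X)^{-1} X^T C^{-1} Y
print(beta_gls)
```

Because $C$ is diagonal, this is equivalent to weighted least squares with weights $1/\sigma_i^2$, which is how the difference in the $\sigma_i^2$'s enters the combined estimate.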
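The two-step plug-in approach of the second bullet can also be sketched (again with made-up data): estimate $\widehat{\beta}_1$ from (1) by OLS, subtract its contribution from $Y_2$, and regress the residual on the remaining column of $X_2$.

```python
import numpy as np

rng = np.random.default_rng(0)
m, k1, k2 = 50, 2, 3                       # illustrative sizes
beta_true = np.array([1.0, -2.0, 0.5])
s1, s2 = 1.0, 3.0

X1 = rng.normal(size=(m, k1))
X2 = rng.normal(size=(m, k2))
Y1 = X1 @ beta_true[:k1] + s1 * rng.normal(size=m)
Y2 = X2 @ beta_true + s2 * rng.normal(size=m)

# Step 1: OLS on model (1)
beta1_hat, *_ = np.linalg.lstsq(X1, Y1, rcond=None)

# Step 2: fix the first k1 coefficients at beta1_hat and estimate only
# the remaining coefficient(s) from model (2)
resid = Y2 - X2[:, :k1] @ beta1_hat
beta_rest, *_ = np.linalg.lstsq(X2[:, k1:], resid, rcond=None)

beta_plugin = np.concatenate([beta1_hat, beta_rest])
print(beta_plugin)
```

Note the trade-off this sketch makes visible: the shared coefficients use only the data from (1), so any information about $(\beta_{11}, \beta_{12})$ contained in $Y_2$ is discarded, and the estimation error of $\widehat{\beta}_1$ propagates into the estimate of $\beta_{13}$.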

The question is: is there a smart way to use the solution of the first regression when solving the second?

My hope rests on the fact that the $\sigma_i^2$'s differ across the $T$ groups of observations. Perhaps one can state a condition on them that guarantees some improvement of the estimate...

Any suggestion is very welcome.