In (Kay 1993, p. 326), a theorem is stated that allows one to find the weight parameters of a linear model
$$ \bf{x} = \bf{G}\bf{\eta} + \bf{w} $$
where $\bf{\eta} \sim \mathcal{N}(\mu_\eta, \bf{C}_\eta)$ and $\bf{w} \sim \mathcal{N}(\bf{0},\bf{C}_w)$.
As far as I understand, all the vectors in $\bf{G}$ are deterministic and known. Let's say $\bf{G} = [\bf{s}~\bf{H}]$, where $\bf{H}$ is known and $\bf{s}$ is a deterministic but unknown vector. Is there a way to estimate it? We can assume that $\bf{s}$ is orthogonal to the columns of $\bf{H}$. Also, consider $\eta = [\alpha~\theta^{T}]^{T}$, where $\alpha$ is a scalar.
Note 1: Origin of the problem
I have a dataset of observations $\bf{X} = [\bf{X}_c~\bf{X}_n]$ for which I know the following:
\begin{align} \forall \bf{x} \in \bf{X}_c,&\quad \bf{x}=\alpha\bf{s} + \bf{w}\\ \forall \bf{x} \in \bf{X}_n,&\quad \bf{x}=\alpha\bf{s} + \bf{H}\theta + \bf{w} \end{align}
Note that $\alpha$ and $\bf{s}$ are unknown. The first line is just a particular case of the second, with $\theta=\bf{0}$, so my thought is that I could do away with having two models and unify them. My initial idea was to consider the weights as random and use a Bayesian approach. If I take $\eta = [\alpha~\theta^{T}]^{T}$, I can identify a model similar to the one in (Kay 1993), which more or less settles the estimation of $\alpha$. I can also take $\bf{G} = [\bf{s}~\bf{H}]$, but since $\bf{s}$ is unknown, I'll need to estimate it as well.
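For concreteness, here is a minimal numpy sketch of this data model. All dimensions, names, and distributions below are placeholder assumptions of mine (in particular, I store observations as rows rather than columns for convenience):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 8, 3          # observation length, number of columns of H
Nc, Nn = 50, 50      # number of observations in X_c and X_n

# Hypothetical ground truth.
H = np.linalg.qr(rng.standard_normal((n, p)))[0]  # known matrix, orthonormal columns
s = rng.standard_normal(n)
s -= H @ (H.T @ s)            # enforce s orthogonal to the columns of H
s /= np.linalg.norm(s)
alpha = 2.0                   # unknown scalar weight
C_w = 0.1 * np.eye(n)         # noise covariance

def noise(size):
    return rng.multivariate_normal(np.zeros(n), C_w, size=size)

# First model:  x = alpha*s + w
X_c = alpha * s + noise(Nc)
# Second model: x = alpha*s + H@theta + w, with a random theta per observation
Theta = rng.standard_normal((Nn, p))
X_n = alpha * s + Theta @ H.T + noise(Nn)
X = np.vstack([X_c, X_n])     # the full dataset, one observation per row
```

The orthogonal projection `s -= H @ (H.T @ s)` is valid here only because `H` was built with orthonormal columns.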
Note 2: MVU estimator of weights
A closed-form minimum variance unbiased estimator of $\eta$ is
$$ \hat{\eta}_{\text{MVU}} = (\bf{G}^T \bf{C}_w^{-1} \bf{G} )^{-1}\bf{G}^{T}\bf{C}_w^{-1} \bf{x} $$
with $\bf{C}_w$ the noise covariance. This is a direct consequence of applying Theorem 10.3 in (Kay 1993) under a non-informative prior for $\eta$. I was thinking of estimating $\bf{s}$ iteratively, by alternation: estimate $\eta$ first, then $\bf{s}$, and repeat.
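A minimal numpy sketch of that alternating scheme, under my own assumptions (observations as rows, a crude initialisation from the sample mean, and a unit-norm constraint on $\bf{s}$, since $\alpha\bf{s}$ is only identifiable up to scale):

```python
import numpy as np

def gls(G, C_w_inv, x):
    # MVU / GLS estimate: (G^T C_w^{-1} G)^{-1} G^T C_w^{-1} x
    A = G.T @ C_w_inv @ G
    return np.linalg.solve(A, G.T @ C_w_inv @ x)

def alternate(X, H, C_w, n_iter=20):
    # X: (N, n) array, one observation per row; H: (n, p) known matrix.
    C_w_inv = np.linalg.inv(C_w)

    def project_out_H(v):
        # remove the component of v lying in the column space of H
        return v - H @ np.linalg.lstsq(H, v, rcond=None)[0]

    s = project_out_H(X.mean(axis=0))   # crude initial guess for s
    s /= np.linalg.norm(s)
    for _ in range(n_iter):
        G = np.column_stack([s, H])     # G = [s H]
        Eta = np.array([gls(G, C_w_inv, x) for x in X])  # one eta per row
        alpha, Theta = Eta[:, 0], Eta[:, 1:]
        # Given the eta_i, minimising sum_i over s of the weighted squared
        # error ||x_i - alpha_i s - H theta_i||^2 gives
        #   s = sum_i alpha_i r_i / sum_i alpha_i^2,  r_i = x_i - H theta_i
        # (C_w cancels because s enters the residual linearly).
        R = X - Theta @ H.T
        s = project_out_H(alpha @ R / (alpha @ alpha))
        s /= np.linalg.norm(s)          # fix the scale ambiguity
    return s, Eta
```

Note that this is only a heuristic alternating least-squares sketch, with no convergence guarantee; the sign of the recovered $\bf{s}$ (and correspondingly of $\alpha$) is also undetermined.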