I have a system that can be well described as a linear relation between input/output samples. The description (explicitly, and in matrix notation) is:
$$ Y[m] = \sum_{n=0}^{N} c_n x[(m-1)N + n] \\ \mathbf{y} = \mathbf{X} \mathbf{c} $$
Here, $x$ denotes the input samples, $c_n$ the coefficients of the linear system that are to be estimated, and $Y$ the output.
The problem is: an unknown (time-varying) offset is added that is different every time I send data through the system. So a more precise way to describe this system is:
$$ \mathbf{y} = \mathbf{X} \mathbf{c} + d_i \mathbf{1} $$
where $i$ denotes the $i$th measurement and $\mathbf{1}$ is the all-ones vector. Now I want to create a model of this system that "ignores" the unknown offset.
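As a concrete sketch of this setup (the sizes, seed, and coefficient values below are invented for illustration, and the row blocking is a simplified version of the indexing above):

```python
import numpy as np

rng = np.random.default_rng(0)

N = 4    # number of coefficients (illustrative)
M = 50   # number of output samples per measurement

c_true = rng.standard_normal(N)   # the unknown coefficients c_n

def measure(x, d):
    """One pass through the system: y = X c + d, where the offset d
    is different for every measurement."""
    X = x.reshape(M, N)           # each row holds the inputs for one output sample
    return X @ c_true + d

x = rng.standard_normal(M * N)
y1 = measure(x, d=0.7)            # same input, two measurements ...
y2 = measure(x, d=-2.3)           # ... differ only by a constant offset
```

Sending the same input twice yields outputs that differ by exactly the constant $d_1 - d_2$, which is the nuisance to be removed.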
I can enforce a zero-mean system output with
$$ \mathbf{\tilde{y}} = \mathbf{y}-\operatorname{mean}(\mathbf{y}) $$
With this "corrected" output I could solve for the unknown coefficients via least-squares:
$$ \mathbf{\hat{c}} = \mathbf{X}^{\dagger} \mathbf{\tilde{y}} \\ \mathbf{\hat{y}} = \mathbf{X} \mathbf{\hat{c}} $$
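A minimal numerical sketch of these two steps, mean removal followed by a least-squares solve (sizes and data are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
M, N = 50, 4
X = rng.standard_normal((M, N))
c_true = rng.standard_normal(N)
d = 1.5                                   # unknown offset of this measurement

y = X @ c_true + d
y_tilde = y - y.mean()                    # enforce a zero-mean system output

# least squares, c_hat = X^+ y_tilde (lstsq instead of forming the pseudoinverse)
c_hat, *_ = np.linalg.lstsq(X, y_tilde, rcond=None)
y_hat = X @ c_hat

# y_hat.mean() is generally nonzero, even though y_tilde.mean() is zero
# by construction: this is exactly the mismatch described next.
```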
The problem is that $\mathbf{\hat{y}}$ is not necessarily mean-free, whereas $\mathbf{\tilde{y}}$ is by definition. Hence the system output $\mathbf{\tilde{y}}$ and the model output $\mathbf{\hat{y}}$ no longer match!
I could also subtract the mean from $\mathbf{\hat{y}}$, but then the model is no longer linear.
I could instead estimate an affine model that includes a constant term. But this constant will be different every time I send data through the system, due to the unknown additive offset.
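For completeness, the affine variant can be sketched by appending a column of ones to $\mathbf{X}$ (again with invented sizes and data). In this noiseless example it recovers both $\mathbf{c}$ and the offset, but the fitted constant only applies to that one measurement:

```python
import numpy as np

rng = np.random.default_rng(2)
M, N = 50, 4
X = rng.standard_normal((M, N))
c_true = rng.standard_normal(N)
d = -0.8                                  # offset of this particular measurement

y = X @ c_true + d

# augment X with a column of ones so the constant is estimated jointly
X_aug = np.column_stack([X, np.ones(M)])
theta, *_ = np.linalg.lstsq(X_aug, y, rcond=None)
c_hat, d_hat = theta[:N], theta[N]        # d_hat is specific to this measurement
```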
Basically what I want is a linear model $M$ such that:
$$ \mathbf{\hat{y}} = M\{ x \} $$ and $\| \mathbf{\tilde{y}} - \mathbf{\hat{y}} \|_2$ or $\| \mathbf{y} - \mathbf{\hat{y}} \|_2$ is minimized, regardless of $d_i$.