Minimum variance solution to a linear system


For an under-determined linear system $\boldsymbol{Ax} = \boldsymbol{b}$ the Moore-Penrose pseudo-inverse $\boldsymbol{A^+}$ can be used to find the solution $\boldsymbol{x}_\text{L2} = \left(x_1,x_2,\dots,x_N\right)$ with the minimum Euclidean norm $||\boldsymbol{x}||_2$ among all the solutions:

$$\boldsymbol{x}_\text{L2} = \boldsymbol{A}^+\boldsymbol{b}$$

The Euclidean norm selects the solution which minimises $\sum_{i=1}^{N} x_i^2$. However, this question concerns the solution that instead minimises the variance $\sum_{i=1}^{N} (x_i - \mu)^2$, where $\mu = \frac{1}{N} \sum_{i=1}^{N} x_i$. Is there a way to calculate this 'minimum variance' solution without resorting to numerical optimisation?

Consider an example that motivates the search for a minimum variance solution:

$$\boldsymbol{Ax} = \boldsymbol{b} = \begin{bmatrix} 1/2 &1/2 &0\\ 0 &1/2 &1/2 \end{bmatrix} \begin{bmatrix} 1\\ 1\\ 1 \end{bmatrix} = \begin{bmatrix} 1\\ 1 \end{bmatrix} $$
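As a quick numerical sanity check of this example (a sketch assuming NumPy):

```python
import numpy as np

# The under-determined system from the example above.
A = np.array([[0.5, 0.5, 0.0],
              [0.0, 0.5, 0.5]])
x = np.ones(3)  # the vector of ones
b = A @ x

print(b)  # [1. 1.]
```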

Say that $\boldsymbol{x}$ is unknown and is to be calculated. Then the solutions are of the form

$$ \boldsymbol{x} = \begin{bmatrix} 0\\ 2\\ 0 \end{bmatrix} + \begin{bmatrix} 1\\ -1\\ 1 \end{bmatrix} \omega $$

for $\omega \in \mathbb{R}$. The minimum L2-norm solution is:

$$ \boldsymbol{x}_\text{L2} = \boldsymbol{A}^+\boldsymbol{b} = \begin{bmatrix} 4/3 &-2/3\\ 2/3 &2/3\\ -2/3 &4/3 \end{bmatrix} \begin{bmatrix} 1\\ 1 \end{bmatrix} = \begin{bmatrix} 2/3\\ 4/3\\ 2/3 \end{bmatrix} $$
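This minimum-norm solution can be reproduced numerically (a sketch assuming NumPy, where `np.linalg.pinv` computes the Moore-Penrose pseudo-inverse):

```python
import numpy as np

A = np.array([[0.5, 0.5, 0.0],
              [0.0, 0.5, 0.5]])
b = np.array([1.0, 1.0])

# Minimum L2-norm solution via the Moore-Penrose pseudo-inverse.
x_l2 = np.linalg.pinv(A) @ b
print(x_l2)  # approximately [2/3, 4/3, 2/3]
```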

However, in this case the minimum variance solution would recover the original vector of ones. How can this be calculated?
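For this example the claim is easy to check: along the solution family $\boldsymbol{x} = \boldsymbol{x}_0 + \omega\boldsymbol{d}$ the variance is a quadratic in $\omega$, so its minimiser is a one-dimensional least-squares problem over the mean-centred vectors. A sketch assuming NumPy (the centering matrix $C = I - \frac{1}{N}\boldsymbol{1}\boldsymbol{1}^\top$ is an assumption introduced here for illustration):

```python
import numpy as np

# Particular solution and null-space direction from the example above.
x0 = np.array([0.0, 2.0, 0.0])
d = np.array([1.0, -1.0, 1.0])

# Centering matrix: C @ x subtracts the mean of x from each component.
n = 3
C = np.eye(n) - np.ones((n, n)) / n

# Variance of x0 + omega*d is ||C(x0 + omega*d)||^2, quadratic in omega;
# setting its derivative to zero gives the minimiser in closed form.
omega = -(C @ d) @ (C @ x0) / ((C @ d) @ (C @ d))
x_var = x0 + omega * d
print(omega, x_var)  # 1.0 [1. 1. 1.]
```

Here $\omega = 1$ recovers the vector of ones, consistent with the claim above; the question is whether such a solution can be obtained in general without numerical optimisation.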