Least Squares solution for a symmetric singular matrix


I want to solve this system by the least squares method: $$\begin{pmatrix}1 & 2 & 3\\ 2 & 3 & 4 \\ 3 & 4 & 5 \end{pmatrix}\begin{pmatrix}x\\y\\z\end{pmatrix} =\begin{pmatrix}1\\5\\-2\end{pmatrix} $$ This symmetric matrix is singular with one eigenvalue $\lambda_1 = 0$, so $A^t A$ is also singular, and for this reason I cannot use the normal equation $\hat x = (A^t A)^{-1} A^t b$. So I performed Gauss-Jordan elimination on the augmented matrix to arrive at $$\begin{pmatrix}1 & 2 & 3\\ 0 & 1 & 2 \\ 0 & 0 & 0 \end{pmatrix}\begin{pmatrix}x\\y\\z\end{pmatrix} =\begin{pmatrix}1\\3\\-1\end{pmatrix} $$ Finally I solved the $2\times 2$ system $$\begin{pmatrix}1 & 2\\ 0 & 1\end{pmatrix}\begin{pmatrix}x\\y\end{pmatrix} =\begin{pmatrix}1\\3\end{pmatrix} $$ taking into account that the best $\hat b$ is $\begin{pmatrix}1\\3\\0\end{pmatrix}$.

The solution is then $\ \hat x = \begin{pmatrix}-5\\3\\0\end{pmatrix}$

Is this approach correct?

EDIT

Based on the book 'Linear Algebra and Its Applications' by David Lay, I also include the least squares method he proposes: $(A^tA)\hat x=A^t b$.

$$A^t b =\begin{pmatrix}5\\9\\13\end{pmatrix}, A^tA = \begin{pmatrix}14 & 20 & 26 \\ 20 & 29 & 38 \\ 26 & 38 & 50\end{pmatrix}$$ The reduced echelon form of the augmented matrix is: $$ \begin{pmatrix}14 & 20 & 26 & 5 \\ 20 & 29 & 38 & 9 \\ 26 & 38 & 50 & 13 \end{pmatrix} \sim \begin{pmatrix}1 & 0 & -1 & -\frac{35}{6} \\ 0 & 1 & 2 & \frac{13}{3} \\ 0 & 0 & 0 & 0 \end{pmatrix} \Rightarrow \hat x = \begin{pmatrix}-\frac{35}{6} \\ \frac{13}{3} \\ 0 \end{pmatrix}$$ taking the free variable $z=\alpha$ with $\alpha=0$.
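A quick numerical sanity check of these normal equations (a sketch using NumPy, which the question does not mention, so treat the tooling as an assumption):

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 3., 4.],
              [3., 4., 5.]])
b = np.array([1., 5., -2.])

AtA = A.T @ A   # singular, so it cannot simply be inverted
Atb = A.T @ b

print(AtA)      # [[14 20 26], [20 29 38], [26 38 50]]
print(Atb)      # [ 5  9 13]

# np.linalg.lstsq handles rank-deficient A and returns
# the minimum-norm least squares solution
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x_hat)    # ≈ [-3.4167 -0.5     2.4167] = (-41/12, -1/2, 29/12)
```

Note that `lstsq` returns the *minimum-norm* least squares solution, which corresponds to a different choice of the free variable than $\alpha = 0$; both choices give the same residual $A\hat x - b$.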

There are 5 answers below.

BEST ANSWER

Your solution is incorrect, for the following reason. When you perform the Gauss-Jordan elimination, you transform the original system $$\tag{1} Ax=b $$ into another, $$\tag{2} SAx=Sb. $$ But the least squares solutions of (1) and (2) do not coincide in general. Indeed, the least squares solution of (1) is $A^{+}b$, while the least squares solution of (2) is $(SA)^{+}Sb$. If $A$ is invertible, then $$(SA)^{+}S=(SA)^{-1}S=A^{-1}=A^{+}$$ and everything is OK, but in the general case $(SA)^{+}S\ne A^{+}$.

In your case, in particular, $$ S=\left(\begin{array}{rrr} 1 & 0 & 0 \\ 2 & -1 & 0 \\ 1 & -2 & 1 \\ \end{array}\right),\quad (SA)^{+}S=\left(\begin{array}{rrr} -11/6 & 4/3 & 0 \\ -1/3 & 1/3 & 0 \\ 7/6 & -2/3 & 0 \\ \end{array}\right), $$ $$ A^{+}=\left(\begin{array}{rrr} -13/12 & -1/6 & 3/4 \\ -1/6 & 0 & 1/6 \\ 3/4 & 1/6 & -5/12\\ \end{array}\right).$$
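The claim that $(SA)^{+}S \ne A^{+}$ is easy to check numerically for this example; a minimal sketch, assuming NumPy is available:

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 3., 4.],
              [3., 4., 5.]])
# S encodes the row operations of the elimination: S @ A is upper triangular
S = np.array([[1.,  0., 0.],
              [2., -1., 0.],
              [1., -2., 1.]])

lhs = np.linalg.pinv(S @ A) @ S   # "solver" for the eliminated system
rhs = np.linalg.pinv(A)           # solver for the original system

print(np.allclose(lhs, rhs))      # False: the two LS problems differ
```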

You can calculate the pseudoinverse matrix by using the rank factorization: $$ A=BC,\quad B=\left(\begin{array}{rr} 1 & 3\\ 2 & 4\\ 3 & 5\\ \end{array}\right),\quad C=\left(\begin{array}{rrr} 1&1/2&0\\ 0&1/2&1 \end{array}\right) $$ (this decomposition comes from the fact that the second column of $A$ is the arithmetic mean of the remaining columns). Since $B$ has full column rank and $C$ has full row rank, it remains only to calculate the pseudoinverse matrix $$ A^{+}=C^{+}B^{+}=C^T(CC^T)^{-1}(B^TB)^{-1}B^T $$ and the least squares solution is $A^{+}b$.
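A sketch of this rank-factorization computation, assuming NumPy (any linear-algebra package would do):

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 3., 4.],
              [3., 4., 5.]])
b = np.array([1., 5., -2.])

# rank factorization A = B C: column 2 of A is the mean of columns 1 and 3
B = np.array([[1., 3.],
              [2., 4.],
              [3., 5.]])
C = np.array([[1., 0.5, 0.],
              [0., 0.5, 1.]])
assert np.allclose(B @ C, A)

# A^+ = C^T (C C^T)^{-1} (B^T B)^{-1} B^T, valid because B has full
# column rank and C has full row rank
A_pinv = C.T @ np.linalg.inv(C @ C.T) @ np.linalg.inv(B.T @ B) @ B.T

print(A_pinv @ b)   # minimum-norm LS solution ≈ (-41/12, -1/2, 29/12)
```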

ANSWER

I think you did the Gaussian elimination wrong.

$$\begin{pmatrix}1 & 2 & 3\\\ 2 & 3 & 4 \\\ 3 & 4 & 5 \end{pmatrix}\begin{pmatrix}x\\y\\z\end{pmatrix} =\begin{pmatrix}1\\5\\-2\end{pmatrix}$$

Becomes $$\begin{pmatrix}1&2&3\\0&-1&-2\\0&-2&-4\end{pmatrix}\begin{pmatrix}x\\y\\z\end{pmatrix}=\begin{pmatrix}1\\3\\-5\end{pmatrix}$$ which becomes $$\begin{pmatrix}1&2&3\\0&-1&-2\\0&0&0\end{pmatrix}\begin{pmatrix}x\\y\\z\end{pmatrix}=\begin{pmatrix}1\\3\\-11\end{pmatrix}$$ Now notice that the final row says $0=-11$. That is a contradiction, hence there is no exact solution to this equation (which is exactly why a least squares solution is sought in the first place).
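The same conclusion follows from a rank comparison (Rouché–Capelli: an exact solution exists iff the coefficient and augmented matrices have equal rank); a small sketch assuming NumPy:

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 3., 4.],
              [3., 4., 5.]])
b = np.array([1., 5., -2.])

# exact solvability test: rank(A) vs rank([A | b])
print(np.linalg.matrix_rank(A))                        # 2
print(np.linalg.matrix_rank(np.column_stack([A, b])))  # 3 -> inconsistent
```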

ANSWER

The least squares solution (of minimum norm) is given by $$ \hat x = A^+ \, b$$ where $A^+$ is the pseudoinverse of $A$.

As $A$ is not of full rank, the simple formula you mentioned cannot be used to calculate it.
However, it can still be calculated numerically, for example via the SVD.

By doing so, we get:

$$ \hat x = \begin{pmatrix}-41/12\\-1/2\\29/12\end{pmatrix} $$

which gives a fitted value $A \, \hat x$ of:

$$ A \, \hat x = \begin{pmatrix}17/6\\4/3\\-1/6\end{pmatrix} $$
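A minimal sketch of the SVD route (assuming NumPy; the answer does not specify a tool): the pseudoinverse inverts only the singular values that are numerically nonzero.

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 3., 4.],
              [3., 4., 5.]])
b = np.array([1., 5., -2.])

U, s, Vt = np.linalg.svd(A)

# invert only singular values above a numerical tolerance
tol = max(A.shape) * np.finfo(float).eps * s[0]
s_inv = np.array([1.0 / si if si > tol else 0.0 for si in s])

# x_hat = V diag(s_inv) U^T b  (minimum-norm least squares solution)
x_hat = Vt.T @ (s_inv * (U.T @ b))

print(x_hat)      # ≈ [-3.4167 -0.5     2.4167] = (-41/12, -1/2, 29/12)
print(A @ x_hat)  # ≈ [ 2.8333  1.3333 -0.1667] = (17/6, 4/3, -1/6)
```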

Your method does not seem to work. I can only give my interpretation of what happens:

The issue is your choice of the "best" $\hat b$ value. The vector that you considered is not directly related to the real $b$ vector, but to something obtained after some manipulations on the rows of the system matrix.

It is difficult, in this situation, to relate it to the selection of a "best" $\hat b$ vector.

ANSWER

Note that $$\begin{pmatrix}3&4&5\end{pmatrix}\begin{pmatrix}x\\y\\z\end{pmatrix}=-2$$and $$\Big[2\cdot\begin{pmatrix}2&3&4\end{pmatrix}-\begin{pmatrix}1&2&3\end{pmatrix}\Big]\begin{pmatrix}x\\y\\z\end{pmatrix}=\begin{pmatrix}3&4&5\end{pmatrix}\begin{pmatrix}x\\y\\z\end{pmatrix}=2\times 5-1=9$$ Since this leads to $-2=9$, the system is inconsistent: it has no exact solution.

ANSWER

Since the matrix has the eigenvector $\pmatrix{1&-2&1\cr}^t$ with eigenvalue $0$, one has

$$\begin{pmatrix}1 & 2 & 3\\\ 2 & 3 & 4 \\\ 3 & 4 & 5 \end{pmatrix}\begin{pmatrix}x\\y\\z\end{pmatrix} =\begin{pmatrix}1 & 2 & 3\\\ 2 & 3 & 4 \\\ 3 & 4 & 5 \end{pmatrix}\begin{pmatrix}x+t\\y-2t\\z+t\end{pmatrix}$$

for all $t$, so there is a least squares solution with $z=0$, which makes it a least squares solution for

$$\begin{pmatrix}1 & 2 & 3\\\ 2 & 3 & 4 \\\ 3 & 4 & 5 \end{pmatrix}\begin{pmatrix}x\\y\\0\end{pmatrix} \approx\begin{pmatrix}1\\5\\-2\end{pmatrix} \text{ or, equivalently, } \begin{pmatrix}1 & 2 \\\ 2 & 3 \\\ 3 & 4 \end{pmatrix}\begin{pmatrix}x\\y\end{pmatrix} \approx\begin{pmatrix}1\\5\\-2\end{pmatrix}$$

That makes it a regular least squares problem with solution $\pmatrix{x&y\cr}^t=\pmatrix{-35/6&13/3\cr}^t$, so the solutions for the original problem are $$\begin{pmatrix}-\frac{35}6 +t\\\frac{13}3-2t\\t\end{pmatrix}$$
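One can verify numerically that every member of this one-parameter family attains the same residual, and that a particular choice of $t$ recovers the minimum-norm pseudoinverse solution; a sketch assuming NumPy:

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 3., 4.],
              [3., 4., 5.]])
b = np.array([1., 5., -2.])

def x(t):
    # one-parameter family of least squares solutions
    return np.array([-35/6 + t, 13/3 - 2*t, t])

# shifting along the null direction (1, -2, 1) does not change A @ x,
# so all members give the same residual norm
residuals = [np.linalg.norm(A @ x(t) - b) for t in (-1.0, 0.0, 1.0, 29/12)]
print(residuals)   # all four values equal (≈ 4.49)

# t = 29/12 gives the minimum-norm solution returned by pinv/lstsq
print(x(29/12))    # ≈ (-41/12, -1/2, 29/12)
```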