minimizing vectors


Could you please check whether my solution is right?
Q. $$A=\begin{pmatrix} 1 & 1 \\ 1 & 1 \\ 0 & 0 \\ \end{pmatrix}$$Find the set of vectors $x$ that minimize the value $$|| A x- \begin{pmatrix} 1 \\ 2 \\ 3 \\ \end{pmatrix}||$$

My solution.
$Ax=(1,2,3)^T$ is inconsistent, so use the least squares method.
$A^TAx=A^Tb$, $b=(1,2,3)^T$
$$ \begin{pmatrix} 2 & 2 \\ 2 & 2 \\ \end{pmatrix}\begin{pmatrix} x_1 \\ x_2 \\ \end{pmatrix}=\begin{pmatrix} 3 \\ 3 \\ \end{pmatrix}$$

$2x_1+2x_2=3$ → $x_2=-x_1+1.5$
Therefore the set of solutions is $\left\{ \begin{pmatrix} k \\ -k+1.5 \\ \end{pmatrix} \,\middle|\, k\in\mathbb{R} \right\}$.
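The normal-equation answer above can be checked numerically. Below is a small NumPy sketch (not part of the original solution): `lstsq` returns one minimizer, and every point on the line $x_1+x_2=1.5$ attains the same residual norm $\sqrt{9.5}$.

```python
# Sketch: verify the normal-equation solution with NumPy.
import numpy as np

A = np.array([[1.0, 1.0], [1.0, 1.0], [0.0, 0.0]])
b = np.array([1.0, 2.0, 3.0])

# A^T A is singular here, so np.linalg.lstsq is used; it returns
# the minimum-norm least squares solution.
x, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x.sum())  # 1.5, i.e. the solution lies on x1 + x2 = 1.5

# Any point on the line x2 = 1.5 - x1 gives the same residual norm:
for k in (0.0, 1.0, -2.5):
    xk = np.array([k, 1.5 - k])
    print(np.linalg.norm(A @ xk - b))  # sqrt(9.5) each time
```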



BEST ANSWER

Problem statement

Given the matrix $$ \mathbf{A} = \left[ \begin{array}{cc} 1 & 1 \\ 1 & 1 \\ 0 & 0 \\ \end{array} \right] \in \mathbb{C}^{m\times n}_{\rho} $$ and the data vector $$ b = \left[ \begin{array}{c} 1 \\ 2 \\ 3 \\ \end{array} \right] $$ find the least squares solution $$ x_{LS} = \left\{ x\in\mathbb{C}^{n} \colon \lVert \mathbf{A} x - b \rVert_{2}^{2} \text{ is minimized} \right\} $$

Solution

The general solution to the least squares problem is $$ x_{LS} = \color{blue}{\mathbf{A}^{+} b} + \color{red}{ \left( \mathbf{I}_{n} - \mathbf{A}^{+} \mathbf{A} \right) y}, \quad y \in \mathbb{C}^{n} \tag{1} $$ where colors distinguish $\color{blue}{range}$ and $\color{red}{null}$ spaces.

The simple structure of the matrix $\mathbf{A}$ encourages a solution using the pseudoinverse built from the singular value decomposition (SVD).

Singular value decomposition

The SVD is straightforward: $$ \mathbf{A} = \mathbf{U} \, \Sigma \, \mathbf{V}^{*} \tag{2} $$

Resolve the eigensystem for $$ \mathbf{W} = \mathbf{A}^{*} \mathbf{A} = \left[ \begin{array}{cc} 2 & 2 \\ 2 & 2 \\ \end{array} \right] $$

Singular values

The eigenvalues are $$ \lambda \left( \mathbf{W} \right) = \left\{ 4, 0 \right\} $$ The matrix $\mathbf{A}$ has rank $\rho=1$, and the lone singular value is $$ \sigma_{1} = \sqrt{\lambda_{1}} = 2 $$ The matrix of singular values is $$ \mathbf{S} = \left[ \begin{array}{c} 2 \end{array} \right] $$ and the sabot matrix is $$ \Sigma = \left[ \begin{array}{cc} \mathbf{S} & \mathbf{0} \\ \mathbf{0} & \mathbf{0} \end{array} \right] = \left[ \begin{array}{cc} 2 & 0 \\ 0 & 0 \\ 0 & 0 \\ \end{array} \right] $$

Domain matrix $\mathbf{V}$

The normalized eigenvectors of $\mathbf{W}$ are the column vectors of $\mathbf{V}$:

$$ \mathbf{V} = \frac{1}{\sqrt{2}} \left[ \begin{array}{cr} \color{blue}{1} & \color{red}{-1} \\ \color{blue}{1} & \color{red}{1} \\ \end{array} \right] $$

Domain matrix $\mathbf{U}$

Equation $(2)$ can be rewritten to provide the $\color{blue}{range}$ space vector for $\mathbf{U}$:

$$ \mathbf{U}_{1} = \frac{1}{\sqrt{2}} \color{blue}{\left[ \begin{array}{c} 1 \\ 1 \\ 0 \end{array} \right]} $$ For this simple problem, we can eyeball the $\color{red}{null}$ space vectors. $$ \mathbf{U} = \left[ \begin{array}{ccc} % \frac{1}{\sqrt{2}} \color{blue}{\left[ \begin{array}{r} 1 \\ 1 \\ 0 \end{array} \right]} & % \frac{1}{\sqrt{2}} \color{red}{\left[ \begin{array}{r} 1 \\ -1 \\ 0 \end{array} \right]} & % \color{red}{\left[ \begin{array}{c} 0 \\ 0 \\ 1 \end{array} \right]} % \end{array} \right] $$
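The SVD factors above can be confirmed numerically. A quick NumPy sketch (assuming NumPy's descending singular-value ordering; signs of the singular vectors may differ, but the rank-1 product is unambiguous):

```python
# Sketch: confirm the singular values and the rank-1 structure of A.
import numpy as np

A = np.array([[1.0, 1.0], [1.0, 1.0], [0.0, 0.0]])
U, s, Vt = np.linalg.svd(A)
print(s)  # [2., 0.] -- rank 1, lone singular value sigma_1 = 2

# Reconstruct A from the rank-1 part: sigma_1 * u_1 v_1^*
A1 = s[0] * np.outer(U[:, 0], Vt[0, :])
print(np.allclose(A1, A))  # True
```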

Pseudoinverse

The pseudoinverse matrix is $$ \mathbf{A}^{+} = \mathbf{V} \, \Sigma^{+} \mathbf{U}^{*} = \frac{1}{4} \left[ \begin{array}{ccc} 1 & 1 & 0 \\ 1 & 1 & 0 \\ \end{array} \right] $$
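The stated pseudoinverse agrees with what NumPy's `pinv` computes (a sanity-check sketch, not part of the original derivation):

```python
# Sketch: check A^+ against NumPy's Moore-Penrose pseudoinverse.
import numpy as np

A = np.array([[1.0, 1.0], [1.0, 1.0], [0.0, 0.0]])
A_pinv = np.linalg.pinv(A)
print(np.allclose(A_pinv, np.array([[1, 1, 0], [1, 1, 0]]) / 4))  # True
```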

Least squares solution

Equation $(1)$ provides $$ x_{LS} = \color{blue}{\mathbf{A}^{+} b} + \color{red}{ \left( \mathbf{I}_{n} - \mathbf{A}^{+} \mathbf{A} \right) y} = \color{blue}{ \frac{3}{4} \left[ \begin{array}{c} 1 \\ 1 \\ \end{array} \right] } + \color{red}{ \frac{1}{2} \left[ \begin{array}{rr} 1 & -1 \\ -1 & 1 \\ \end{array} \right] y } , \quad y \in \mathbb{C}^{2} $$
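Every member of this solution family attains the same minimal residual, since the red term lies in the null space of $\mathbf{A}$. A NumPy sketch of that check (the random draws are illustrative):

```python
# Sketch: the residual norm is constant over the whole solution family.
import numpy as np

A = np.array([[1.0, 1.0], [1.0, 1.0], [0.0, 0.0]])
b = np.array([1.0, 2.0, 3.0])
Ap = np.linalg.pinv(A)
P_null = np.eye(2) - Ap @ A   # projector onto the null space of A

rng = np.random.default_rng(0)
for _ in range(3):
    y = rng.standard_normal(2)
    x = Ap @ b + P_null @ y
    print(np.linalg.norm(A @ x - b))  # sqrt(9.5) every time
```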

Plot

The following plot shows the least squares merit function $\lVert \mathbf{A} x - b \rVert_{2}^{2} $ as a function of $x$. The white dot marks the minimum-norm solution $\color{blue}{\mathbf{A}^{+} b}$; the dashed yellow line is the full solution set $\color{red}{x_{LS}}$.



I would use a more "basic" method.

Writing the vector x as $\begin{pmatrix}x \\ y \end{pmatrix}$, $Ax= \begin{pmatrix}1 & 1 \\ 1 & 1 \\ 0 & 0\end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix}= \begin{pmatrix} x+ y \\ x+ y \\ 0 \end{pmatrix}$ so that $Ax- \begin{pmatrix}1 \\ 2 \\ 3 \end{pmatrix}= \begin{pmatrix}x+ y- 1 \\ x+ y- 2 \\ -3\end{pmatrix}$.

The norm of that is $\sqrt{(x+ y- 1)^2+ (x+ y- 2)^2+ 9}$. Writing that as $f(x,y)$, we have $f_x= \frac{1}{2}\left((x+ y- 1)^2+ (x+ y- 2)^2+ 9\right)^{-1/2}\left[2(x+y-1)+ 2(x+ y- 2)\right]= 0$ and $f_y= \frac{1}{2}\left((x+ y- 1)^2+ (x+ y- 2)^2+ 9\right)^{-1/2}\left[2(x+y-1)+ 2(x+ y- 2)\right]= 0$. Both reduce to $2\left[(x+ y- 1)+ (x+ y- 2)\right]= 4x+ 4y- 6= 0$. The function is minimized at any point along the line $x+ y= \frac{3}{2}$. That is, $y= \frac{3}{2}- x$, which is exactly what you have.
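This calculus argument can also be checked numerically: points on the line $x+y=\frac{3}{2}$ all give the same value of $f$, and moving off the line strictly increases it. A small sketch (the test points are arbitrary):

```python
# Sketch: f is minimized exactly on the line x + y = 3/2.
import numpy as np

def f(x, y):
    return np.sqrt((x + y - 1)**2 + (x + y - 2)**2 + 9)

on_line = f(0.75, 0.75)            # x + y = 1.5, value sqrt(9.5)
print(np.isclose(f(2.0, -0.5), on_line))  # True: also on the line
print(f(0.75, 0.85) > on_line)            # True: off the line, f is larger
```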