What is the name of this problem? Linear matrix equation optimization?


I have almost no knowledge of linear algebra, but I need to understand the process of solving a problem. In fact, I'm looking for some keywords or hints so I know what exactly I should be Googling, so any help is appreciated.

The problem is quite simple, I suppose: I have a scalar $D$ computed by the following $\ell_2$-norm matrix equation:

$D=||A_{m \times m}X_{m \times 1}+B_{m \times 1}||_2$

I want to optimize $D$ by finding the best $X$ that minimizes it.

i.e.,

$X^*=\operatorname{argmin}_X D$

Unfortunately, I don't remember the name of this problem, so I am unable to find a proper lesson on YouTube. Can anyone help me?

Accepted answer:

If $A$ is invertible, then $x=-A^{-1}b$ gives $Ax+b=0$, hence $D=0$ (see @Jonh's comment).
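The invertible case can be checked numerically; a minimal sketch with numpy (the matrix $A$ and vector $b$ below are made-up examples):

```python
import numpy as np

# Invertible case: x = -A^{-1} b makes Ax + b = 0, so D = ||Ax + b|| = 0.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])  # invertible (det = 5)
b = np.array([1.0, -1.0])

# Solve A x = -b directly rather than forming the inverse explicitly.
x = np.linalg.solve(A, -b)
print(np.isclose(np.linalg.norm(A @ x + b), 0.0))  # True
```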

If $A$ is not invertible, note that $(1)$: $\operatorname{im}(A^TA)=\operatorname{im}(A^T)$. Let $f:x\mapsto ||Ax+b||^2$. If $f$ attains an extremum at $x$, then its gradient at $x$ is $0$, that is, $A^T(Ax+b)=0$, or $(2)$: $(A^TA)x=-A^Tb$. By $(1)$, such an $x$ exists; if $x_0$ is a particular solution of $(2)$, then the set of all solutions is $x_0+y$ with $y\in\ker(A)=\ker(A^TA)$. Note that $f(x_0+y)=f(x_0)$ is a fixed real number, which is the required minimum of $f$.
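The normal equations $(2)$ can be solved numerically even when $A$ is singular; a sketch with numpy (the rank-deficient $A$ and the vector $b$ are made-up examples):

```python
import numpy as np

# A singular (rank-1) matrix, so A is not invertible.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
b = np.array([1.0, 0.0])

# Solve the normal equations (A^T A) x = -A^T b for a particular solution x0.
# lstsq handles the rank-deficient system and returns a minimum-norm solution.
x0, *_ = np.linalg.lstsq(A.T @ A, -A.T @ b, rcond=None)

# The gradient condition A^T (A x0 + b) = 0 holds at the minimizer.
grad = A.T @ (A @ x0 + b)
print(np.allclose(grad, 0))  # True
```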

Conclusion. A value of $x_0$ is $x_0=-(A^TA)^+A^Tb$, where $Z^+$ denotes the Moore-Penrose pseudoinverse of $Z$; cf. https://en.wikipedia.org/wiki/Moore%E2%80%93Penrose_pseudoinverse. (The problem you describe is known as linear least squares.)
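The pseudoinverse formula can be sketched in numpy as well (same made-up singular $A$ and $b$ as above); any minimizer yields the same minimal value of $||Ax+b||$:

```python
import numpy as np

# x0 = -(A^T A)^+ A^T b, using the Moore-Penrose pseudoinverse.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])  # singular, so a plain inverse does not exist
b = np.array([1.0, 0.0])

x0 = -np.linalg.pinv(A.T @ A) @ (A.T @ b)

# Compare against least squares on (A, -b): both give the same minimal norm.
x_ls, *_ = np.linalg.lstsq(A, -b, rcond=None)
print(np.allclose(np.linalg.norm(A @ x0 + b),
                  np.linalg.norm(A @ x_ls + b)))  # True
```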