I have a question about the following variant of the least-squares problem.
Let $A$ be an $m \times n$ matrix (which need not have full rank, though we may assume $m \geq n$), let $b \in \mathbb{R}^m$, and let $\|\cdot\|$ be an arbitrary norm on $\mathbb{R}^m$. Show that there is a vector $x \in \mathbb{R}^n$ such that $\|Ax - b\|$ is minimized.
If $\|\cdot\|$ is the standard 2-norm, this is the usual least-squares problem. Can we guarantee the existence of such a minimizer for an arbitrary norm?
I tried a compactness argument using the continuity of the quantity we want to minimize, but the unboundedness of the domain bothers me. Can you help me?
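For concreteness, here is a small numerical check of the 2-norm case (a hypothetical rank-deficient example, not taken from any textbook): `np.linalg.lstsq` returns a minimizer of $\|Ax - b\|_2$ even when $A$ is rank deficient.

```python
import numpy as np

# Hypothetical example: A has rank 1 (second column = 2 * first column),
# yet a 2-norm minimizer still exists.
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [0.0, 0.0]])
b = np.array([1.0, 0.0, 1.0])

# lstsq returns a (minimum-norm) least-squares solution for any rank
x, *_ = np.linalg.lstsq(A, b, rcond=None)
residual = np.linalg.norm(A @ x - b)
print(residual)  # sqrt(1.8), about 1.3416
```

The minimizer is not unique here (any $x$ with $x_1 + 2x_2 = 1/5$ works), but the minimum value is attained.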
You can replace $A$ by a matrix $Q$ whose columns form an orthonormal basis of the column space of $A$; this handles the possible rank deficiency, since $\{Ax : x \in \mathbb{R}^n\} = \{Qy : y \in \mathbb{R}^k\}$ (where $k = \operatorname{rank} A$), so the two problems have the same infimum. Then $\|Qy\|_2 = \|y\|_2$, and because all norms on $\mathbb{R}^m$ are equivalent, there is a fixed constant $C > 0$ with $\|Qy\| \geq C\|Qy\|_2 = C\|y\|_2$. By the triangle inequality, $\|Qy - b\| \geq C\|y\|_2 - \|b\|$, so whenever $\|y\|_2 > 2\|b\|/C$ we have $\|Qy - b\| > \|b\| = \|Q \cdot 0 - b\|$, and such $y$ cannot beat $y = 0$. Hence you may restrict the minimization of the continuous function $y \mapsto \|Qy - b\|$ to the compact ball $\{y : \|y\|_2 \leq 2\|b\|/C\}$, on which a minimum exists.
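The argument can be mirrored numerically (an illustrative sketch with a made-up rank-deficient $A$ and the max-norm; any norm would do): the search is restricted to a compact ball, outside of which the objective provably exceeds its value at $0$, and the minimum over the ball is found by brute force.

```python
import numpy as np

# Illustrative rank-deficient example (second column = 2 * first column)
# with the infinity norm on R^3.
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [0.0, 0.0]])
b = np.array([1.0, 0.0, 1.0])

def f(x):
    # the objective ||Ax - b|| in the infinity norm
    return np.linalg.norm(A @ x - b, np.inf)

# Per the compactness argument, outside a large enough ball we have
# ||Ax - b|| >= C||x||_2 - ||b|| > ||b|| = f(0), so the minimum lies
# inside the ball.  Here we simply grid-search a box containing it.
R = 2.0
grid = np.linspace(-R, R, 201)
best = min(f(np.array([u, v])) for u in grid for v in grid)
print(best)  # -> 1.0 (the third residual component is always -1)
```

In this example the minimum value $1$ is attained on a whole segment of minimizers, which is consistent with the existence claim: the minimum exists, though it need not be unique for a general norm.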