A question on vector norm error analysis


$x^* \in \mathbb{R}^n$ is the optimal solution of an optimization problem and minimizes the objective function $\left\| A x^* - b \right\|^2$, where $A \in \mathbb{R}^{m \times n}$ and $b \in \mathbb{R}^m$. However, I can only obtain an approximately optimal solution, denoted $x^* + \Delta x$, whose objective value is $\left\| A\left( x^* + \Delta x \right) - b \right\|^2$. I can already bound the difference of the two objective values in terms of $\Delta \lambda$:

${\left\| {A\left( {{x^*} + \Delta x} \right) - b} \right\|^2} - {\left\| {A{x^*} - b} \right\|^2} \le \theta \left| {\Delta \lambda } \right|$, where $\theta $ is a constant.

My question is: can I bound $\left\| \Delta x \right\|$ or $\frac{\left\| \Delta x \right\|^2}{\left\| x^* \right\|^2}$ by some function of $\Delta \lambda$, $A$, and $b$? That is, is there an $f$ such that $\left\| \Delta x \right\|^2 \le f\left( \Delta \lambda, A, b \right)$ or $\frac{\left\| \Delta x \right\|^2}{\left\| x^* \right\|^2} \le f\left( \Delta \lambda, A, b \right)$?

Or can anyone recommend some relevant reading material?

Thanks!


BEST ANSWER

I'll assume that $\|\cdot\|$ is the standard Euclidean norm, that is, $\|t\|=\sqrt{(t,t)}$ for each $t\in\Bbb R^k$.

Put $\Delta y=A\cdot \Delta x$. If the matrix $A$ has full column rank, then $A^+A=I$, where $A^+$ is the Moore–Penrose pseudoinverse of $A$. Then $\Delta x=A^+\Delta y$, which provides the bound $\|\Delta x\|\le \|A^+\|_2\cdot\|\Delta y\|$, where $\|A^+\|_2$ is the operator norm of $A^+$, i.e. its largest singular value.
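A quick numerical sanity check of this step, assuming NumPy; the matrix sizes and the perturbation $\Delta x$ are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# A random tall matrix (m > n), which has full column rank almost surely.
m, n = 8, 3
A = rng.standard_normal((m, n))

A_pinv = np.linalg.pinv(A)                  # Moore-Penrose pseudoinverse A^+
assert np.allclose(A_pinv @ A, np.eye(n))   # A^+ A = I for full column rank

dx = rng.standard_normal(n)                 # an arbitrary perturbation Δx
dy = A @ dx                                 # Δy = A·Δx

# Δx is recovered from Δy, and the operator-norm bound holds.
assert np.allclose(A_pinv @ dy, dx)
op_norm = np.linalg.norm(A_pinv, 2)         # ||A^+||_2 = largest singular value of A^+
assert np.linalg.norm(dx) <= op_norm * np.linalg.norm(dy) + 1e-12
```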

We can bound $\|\Delta y\|$ as follows. Put $c=Ax^*-b$. The value $\|c\|$ is bounded, for instance by $\|b\|$, since the minimizer $x^*$ does at least as well as the candidate $x=0$. By the triangle inequality, $\|\Delta y\|\le \|c\|+\|\Delta y+c\|$, and the assumption gives $\|c+\Delta y\|^2-\|c\|^2\le \theta \left| {\Delta \lambda } \right|$. It follows that $$\|\Delta y\|\le \|c\|+\sqrt{ \|c\|^2+\theta \left| {\Delta \lambda } \right|}.$$
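The whole chain of bounds can be checked numerically on a least-squares instance. This is a sketch assuming NumPy; the problem data are random, and the achieved objective gap is treated as $\theta\left|\Delta\lambda\right|$ (i.e. $\theta = 1$), an assumption made only for this check:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical least-squares instance; m > n so A has full column rank.
m, n = 10, 4
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

A_pinv = np.linalg.pinv(A)
x_star = A_pinv @ b                  # least-squares minimizer x*
c = A @ x_star - b                   # residual c = A x* - b

dx = 0.1 * rng.standard_normal(n)    # a small perturbation Δx
dy = A @ dx                          # Δy = A·Δx

# Objective gap; nonnegative because x* is the minimizer.
gap = np.linalg.norm(A @ (x_star + dx) - b) ** 2 - np.linalg.norm(c) ** 2

# ||c|| <= ||b||: x* does at least as well as x = 0.
assert np.linalg.norm(c) <= np.linalg.norm(b)

# The answer's bounds on ||Δy|| and then on ||Δx||.
dy_bound = np.linalg.norm(c) + np.sqrt(np.linalg.norm(c) ** 2 + gap)
assert np.linalg.norm(dy) <= dy_bound + 1e-9
assert np.linalg.norm(dx) <= np.linalg.norm(A_pinv, 2) * np.linalg.norm(dy) + 1e-12
```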