Prove relative error with condition number of matrix inequality


While working through some questions and solutions, I encountered the following problem. I can prove the inequality using the given information and some algebraic manipulation, but I don't understand the "within $O(y^2)$ accuracy" part.$\newcommand\norm[1]{\|#1\|}\newcommand\cond{\operatorname{cond}}$

Question: Say that we solve $Ux=b$, with $U$ represented on the computer as $U_2$ and $b$ having no representation error. Additionally, no errors arise during the computation; in other words, $x_2$ satisfies $U_2 x_2=b$. Show that, if $y=\norm{U}\norm{U-U_2}$, then within $O(y^2)$ accuracy:

$$\frac{\norm{x-x_2}}{\norm x} \le \cond(U) \frac{\norm{U-U_2}}{\norm U}$$

As I said, I can prove the inequality, but I don't understand the "within $O(y^2)$ accuracy" part or how I am expected to incorporate it into the proof.

Any guidance or help would be greatly appreciated. Thanks in advance.


Best answer:

The standard perturbation inequality when perturbing only the matrix of the linear system is (in your notation) $$ \frac{\|x-x_2\|}{\|x\|}\leq \frac{\epsilon}{1-\epsilon}, \qquad \epsilon=\mathrm{cond}(U)\frac{\|U-U_2\|}{\|U\|}, $$ assuming that $\epsilon<1$. (I suspect there is a typo in your definition of $y$; it should be $y=\|U^{-1}\|\,\|U-U_2\|<1$ instead. With that correction, $y=\epsilon$ exactly, since $\mathrm{cond}(U)=\|U\|\,\|U^{-1}\|$.)
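For reference, here is a sketch of how the inequality above is usually derived, in the notation of the question (assuming $y=\|U^{-1}\|\,\|U-U_2\|<1$):

```latex
% From Ux = b and U_2 x_2 = b, subtracting gives
%   U_2 (x_2 - x) = b - U_2 x = (U - U_2) x,
% hence x_2 - x = U_2^{-1} (U - U_2) x, and taking norms:
\[
  \frac{\|x - x_2\|}{\|x\|} \le \|U_2^{-1}\|\,\|U - U_2\|.
\]
% A Neumann-series argument bounds \|U_2^{-1}\| via \|U^{-1}\|:
\[
  \|U_2^{-1}\| \le \frac{\|U^{-1}\|}{1 - \|U^{-1}\|\,\|U - U_2\|}
             = \frac{\|U^{-1}\|}{1 - y},
\]
% and since \epsilon = \mathrm{cond}(U)\|U - U_2\|/\|U\|
%                    = \|U^{-1}\|\,\|U - U_2\| = y, this yields
\[
  \frac{\|x - x_2\|}{\|x\|}
  \le \frac{\|U^{-1}\|\,\|U - U_2\|}{1 - y}
  = \frac{\epsilon}{1 - \epsilon}.
\]
```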

Considering the function $f(\epsilon)=\epsilon/(1-\epsilon)$, you can make its Taylor expansion at $0$: $f(\epsilon)=\epsilon+O(\epsilon^2)$. Hence $$ \frac{\|x-x_2\|}{\|x\|}\leq \epsilon+O(\epsilon^2) = \mathrm{cond}(U)\frac{\|U-U_2\|}{\|U\|} + O(\epsilon^2). $$
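To see both claims concretely, here is a minimal numerical sketch (the matrix sizes and perturbation scale are arbitrary choices for illustration): it checks that the relative error is bounded by $\epsilon/(1-\epsilon)$, and that this exact bound differs from its first-order part $\epsilon$ by only $O(\epsilon^2)$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8

# A well-conditioned upper-triangular system (illustrative choice).
U = np.triu(rng.standard_normal((n, n))) + 5.0 * np.eye(n)
b = rng.standard_normal(n)
x = np.linalg.solve(U, b)

# Small upper-triangular perturbation: U2 is what the machine "stores".
E = 1e-6 * np.triu(rng.standard_normal((n, n)))
U2 = U + E
x2 = np.linalg.solve(U2, b)

# Relative error and the quantity epsilon from the answer (2-norms).
rel_err = np.linalg.norm(x - x2) / np.linalg.norm(x)
eps = np.linalg.cond(U) * np.linalg.norm(E, 2) / np.linalg.norm(U, 2)

# Exact bound and its gap to the first-order bound eps:
exact_bound = eps / (1 - eps)          # rel_err <= exact_bound
gap = exact_bound - eps                # equals eps^2/(1-eps), i.e. O(eps^2)
```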