The outer product is not backward stable. This is because the exact result $xy^*$ has rank one, while the computed matrix generally does not, and so it cannot be written as $(x + \delta x)(y + \delta y)^*$. We also know that a problem whose output dimension exceeds its input dimension is rarely backward stable; here the output has $mn$ entries while the input has only $m + n$, and $mn > m + n$ for $m, n > 1$ (except the borderline case $m = n = 2$).
The question is: why can't the computed matrix be represented in the form above? I can't find a detailed proof. Thanks.
EDIT. So, let's start from the definition. An algorithm $\tilde{f}$ is backward stable if for every $x$ there exists $\tilde{x}$ such that $$\tilde{f}(x) = f(\tilde{x}), \qquad \|\tilde{x} - x\| = \|x\| \cdot O(\epsilon_{\text{machine}}).$$ Consider a single element of the resulting matrix:
The LHS is $\tilde{f}(x_i, y_j) = \mbox{fl}(x_i)\otimes\mbox{fl}(y_j) = [x_i(1 + \epsilon_1) \times y_j(1 + \epsilon_2)](1 + \epsilon_3)$, where each $|\epsilon_k| \le \epsilon_{\text{machine}}$ is a separate rounding: one per input and one for the multiplication.

The RHS is $f(\tilde{x}_i, \tilde{y}_j) = \tilde{x}_i \times \tilde{y}_j$, so elementwise we could take, say, $\tilde{x}_i = x_i(1 + \epsilon_1)(1 + \epsilon_3)$ and $\tilde{y}_j = y_j(1 + \epsilon_2)$.
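The rounding model used above can be checked numerically. Below is a minimal sketch in Python, using exact rational arithmetic (`fractions.Fraction` converts a float without error); the values `a` and `b` are arbitrary illustrations standing in for $\mbox{fl}(x_i)$ and $\mbox{fl}(y_j)$:

```python
from fractions import Fraction

# Unit roundoff u for IEEE-754 double precision with round-to-nearest
u = Fraction(1, 2**53)

a, b = 0.1, 0.7                      # play the roles of fl(x_i) and fl(y_j)
computed = a * b                     # one more rounding: (a*b)(1 + eps_3)
exact = Fraction(a) * Fraction(b)    # exact product of the stored doubles

# The relative error eps_3 of the multiplication is bounded by u
eps3 = (Fraction(computed) - exact) / exact
print(abs(eps3) <= u)                # -> True
```

This confirms the standard model $\mbox{fl}(a) \otimes \mbox{fl}(b) = a b (1 + \epsilon_3)$ with $|\epsilon_3| \le \epsilon_{\text{machine}}$ for these inputs.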
So elementwise this looks backward stable; what, then, is the problem with rank one?
OK, I've figured it out.
The thing is that it is extremely unlikely for $\tilde{A}$ to be of rank one, because each $\otimes$ introduces its own small rounding error. In general $\frac{ \mbox{fl}(x_i)\otimes \mbox{fl}(y_j) }{ \mbox{fl}(x_i)\otimes \mbox{fl}(y_{j+1}) } \ne \frac{y_j}{y_{j+1}}$, and the ratio depends on $i$, so the $(j+1)$-th column is no longer an exact multiple of the $j$-th column. But every matrix of the form $(x + \delta x)(y + \delta y)^*$ has rank at most one, so no backward perturbation of the inputs, small or otherwise, can reproduce $\tilde{A}$.
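This is easy to observe experimentally. The sketch below (my own illustration, not from any reference) forms $2\times 2$ computed outer products of random vectors and tests rank one exactly: a $2\times 2$ matrix has rank at most one iff its determinant is zero, and the determinant is evaluated in exact rational arithmetic so no further rounding can mask the result:

```python
import random
from fractions import Fraction

def exact_det2(A):
    """Exact 2x2 determinant: floats convert to Fraction without error."""
    return (Fraction(A[0][0]) * Fraction(A[1][1])
            - Fraction(A[0][1]) * Fraction(A[1][0]))

random.seed(0)
trials, rank_one = 1000, 0
for _ in range(trials):
    x = [random.random(), random.random()]
    y = [random.random(), random.random()]
    # Computed outer product: every * rounds independently
    A = [[xi * yj for yj in y] for xi in x]
    if exact_det2(A) == 0:   # det == 0  <=>  rank(A) <= 1
        rank_one += 1
print(f"{rank_one} of {trials} computed 2x2 outer products are exactly rank one")
```

Running this shows that only a small fraction of the computed matrices are exactly rank one, consistent with the argument above.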