Can I infer $AB=C$ from $AB \vec r = C \vec r$?


Can I infer $AB=C$ from $AB \vec r = C \vec r$, where $\vec r$ is an $n \times 1$ vector ($n$ rows and $1$ column) whose entries are drawn from $\{0,1\}$?

Checking $AB = C$ directly takes $O(n^3)$ time, since it requires computing the product $AB$. A randomized algorithm can reduce this cost, so in the book the author instead checks $AB \vec r = C \vec r$ for some random vector $\vec r$. My doubt is: is it possible that $AB \vec r = C \vec r$ holds even though $AB \neq C$? When does that happen? Is there an example? Kindly elaborate.


Best answer:

Strictly speaking, all that the statement $(AB)\vec{r} = C \vec{r}$ allows you to say is that $\vec{r}$ is in the null space of $AB - C$ (since the above equation implies that $(AB - C)\vec{r} = 0$.) This does not imply that $AB - C = 0$; there are certainly matrices that are not identically zero with a non-trivial nullspace. For example, let $$ A = B = \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}, \quad C = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}, \quad \vec{r} = \begin{bmatrix} 1 \\ 0 \end{bmatrix}.$$ In this case, we have $AB \vec{r} = C \vec{r}$, but $AB \neq C$.

This said, it is also true that the probability that a randomly selected real-valued vector from our vector space will be in the nullspace of a given matrix is either 0 or 1. This is because the nullspace of a given matrix is a linear subspace of the full vector space. If the nullspace is the entire vector space, then a randomly chosen vector is guaranteed to be in the nullspace; but in this case, the matrix is identically zero, and so (in this case) $AB = C$. On the other hand, if $AB \neq C$, then the nullspace of $AB - C$ is a proper subspace of the full vector space. Since the measure of this subspace within the full vector space is 0, almost all non-zero real vectors will have $AB \vec{r} \neq C \vec{r}$.

Thus, calculating $AB \vec{r}$ and $C \vec{r}$ for some random real vector can give us very strong evidence as to whether the two matrices are equal. If $AB \vec{r} \neq C \vec{r}$, then we know that $AB \neq C$; if $AB \vec{r} = C \vec{r}$, it is either the case that $AB = C$, or that $AB \neq C$ and we were exceptionally unlucky in our choice of $\vec{r}$.
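This one-sided test is known as Freivalds' algorithm. A minimal sketch in plain Python, assuming square matrices given as lists of rows (the function name `freivalds` and the `trials` parameter are my own choices, not from the book):

```python
import random

def freivalds(A, B, C, trials=10):
    """Probabilistically test whether A*B == C for n-by-n matrices.

    Each trial draws a random 0/1 vector r and compares A(Br) with Cr,
    costing O(n^2) arithmetic instead of the O(n^3) of computing A*B
    outright. A False answer is always correct; a True answer can be
    wrong only if every single trial picked an unlucky r.
    """
    n = len(A)

    def matvec(M, v):
        return [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]

    for _ in range(trials):
        r = [random.randint(0, 1) for _ in range(n)]
        # Compute A(Br) rather than (AB)r: two matrix-vector products.
        if matvec(A, matvec(B, r)) != matvec(C, r):
            return False  # witness found: AB != C for certain
    return True  # no witness found: AB == C with high probability
```

With the example above ($A = B = \operatorname{diag}(1,0)$, $C = I$), any single trial can be fooled, but $k$ independent trials are all fooled with probability at most $2^{-k}$, which is why the test is repeated.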

This said, the constraint that the random vector consists of 0’s and 1’s actually may raise the probability of “being unlucky” in this way. For example, in my example above, the chance of being unlucky (assuming that we know enough not to pick the zero vector) is 1/3, since picking $(1,0)$ would lead us to the wrong conclusion while any vector with a second component of 1 would let us conclude that $AB \neq C$. So I’m not sure what the author’s real argument is.
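The $1/3$ figure can be checked by brute force. The sketch below (a standalone check, not from the answer) enumerates the nonzero $0/1$ vectors for the example matrices above. Note that if the zero vector is allowed, as in Freivalds' standard analysis, then $2$ of the $4$ vectors fool the test, matching the usual bound of at most $1/2$ per trial:

```python
from itertools import product

# AB - C for the example above: A = B = [[1,0],[0,0]], C = I, so
D = [[0, 0], [0, -1]]

def in_nullspace(M, r):
    """True if M r = 0, i.e. r is an 'unlucky' vector for the test."""
    return all(sum(M[i][j] * r[j] for j in range(2)) == 0 for i in range(2))

nonzero = [r for r in product([0, 1], repeat=2) if any(r)]
unlucky = [r for r in nonzero if in_nullspace(D, r)]
# Only r = (1, 0) fools the test: 1 unlucky vector out of 3 nonzero ones.
```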

Another answer:

Since $\vec r$ may contain $0$'s, the corresponding columns of the matrix are ignored in the multiplication. E.g., here are two different matrices that produce the same result:

$$ \begin{bmatrix} 1 & 2 \\ 3 & 4 \\ \end{bmatrix} \begin{bmatrix} 1 \\ 0 \\ \end{bmatrix} = \begin{bmatrix} 1 \\ 3 \\ \end{bmatrix} $$

$$ \begin{bmatrix} 1 & 3 \\ 3 & 5 \\ \end{bmatrix} \begin{bmatrix} 1 \\ 0 \\ \end{bmatrix} = \begin{bmatrix} 1 \\ 3 \\ \end{bmatrix} $$

And if the columns of the matrices are linearly dependent, then even a vector $\vec r$ containing no $0$'s may produce the same result for different matrices:

$$ \begin{bmatrix} 1 & 2 \\ 3 & 4 \\ \end{bmatrix} \begin{bmatrix} 1 \\ 2 \\ \end{bmatrix} = \begin{bmatrix} 5 \\ 11 \\ \end{bmatrix} $$

$$ \begin{bmatrix} 5 & 0 \\ 11 & 0 \\ \end{bmatrix} \begin{bmatrix} 1 \\ 2 \\ \end{bmatrix} = \begin{bmatrix} 5 \\ 11 \\ \end{bmatrix} $$
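Both pairs above can be verified in a few lines of Python (a standalone check, not part of the answer). In the second pair, the products coincide because $(1, 2)$ lies in the null space of the difference of the two matrices:

```python
def matvec(M, v):
    # Multiply a matrix (given as a list of rows) by a column vector.
    return [sum(m * x for m, x in zip(row, v)) for row in M]

# r = (1, 0): the zero entry hides the differing second columns.
assert matvec([[1, 2], [3, 4]], [1, 0]) == [1, 3]
assert matvec([[1, 3], [3, 5]], [1, 0]) == [1, 3]

# r = (1, 2): no zero entries, yet the products still agree, since
# the difference [[-4, 2], [-8, 4]] maps (1, 2) to the zero vector.
assert matvec([[1, 2], [3, 4]], [1, 2]) == [5, 11]
assert matvec([[5, 0], [11, 0]], [1, 2]) == [5, 11]
```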