Solving for a matrix

$$ \begin{bmatrix}1&2&3\\4&5&6\\7&8&9\end{bmatrix}X = \begin{bmatrix}3&2&1\\6&5&4\\9&8&7\end{bmatrix}. $$

I know how to solve this kind of equation normally, but this case confuses me. The coefficient matrix does not appear to be invertible, and the second matrix is the first with its columns in reverse order.
Asked by Bumbble Comm. There are 2 best solutions below.
We have a linear matrix equation in $\mathrm X \in \mathbb R^{3 \times 3}$
$$\mathrm A \mathrm X = \mathrm B$$
where
$$\mathrm A = \begin{bmatrix} 1&2&3\\ 4&5&6\\ 7&8&9\end{bmatrix} \qquad \qquad \qquad \mathrm B = \begin{bmatrix} 3&2&1\\ 6&5&4\\ 9&8&7\end{bmatrix}$$
Visual inspection tells us that a particular solution, $\mathrm X_p$, is a permutation matrix. What is the null space of $\mathrm A$? Using SymPy:
>>> from sympy import *
>>> A = Matrix([[1, 2, 3],
...             [4, 5, 6],
...             [7, 8, 9]])
>>> A.nullspace()
[Matrix([
[ 1],
[-2],
[ 1]])]
Hence, the $1$-dimensional null space of $\mathrm A$ is spanned by
$$\mathrm v := \begin{bmatrix} 1\\ -2\\ 1\end{bmatrix}$$
and the solution set is a $3$-dimensional affine matrix space parametrized as follows
$$\left\{ \mathrm X_p + \mathrm v \eta^{\top} : \eta \in \mathbb R^3 \right\} = \Bigg\{ \mathrm X_p + \begin{bmatrix} \eta_1 & \eta_2 & \eta_3\\ -2 \eta_1 & -2 \eta_2 & -2 \eta_3\\ \eta_1 & \eta_2 & \eta_3\end{bmatrix} : \eta \in \mathbb R^3 \Bigg\}$$
Note that
$$\mathrm A (\mathrm X_p + \mathrm v \eta^{\top}) = \underbrace{ \mathrm A \mathrm X_p }_{= \mathrm B} + \underbrace{\mathrm A \mathrm v}_{= 0_3} \eta^{\top} = \mathrm B + \mathrm O_3 = \mathrm B$$
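This can be checked directly. Below is a SymPy sketch that takes $\mathrm X_p$ to be the anti-diagonal permutation matrix (which reverses the column order of $\mathrm A$) and verifies the identity above for symbolic $\eta$:

```python
from sympy import Matrix, symbols

A = Matrix([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
B = Matrix([[3, 2, 1], [6, 5, 4], [9, 8, 7]])

# particular solution: the anti-diagonal permutation matrix,
# which reverses the order of A's columns
X_p = Matrix([[0, 0, 1], [0, 1, 0], [1, 0, 0]])
assert A * X_p == B

# null-space direction and symbolic free parameters
v = Matrix([1, -2, 1])
e1, e2, e3 = symbols('eta1 eta2 eta3')
eta = Matrix([e1, e2, e3])

# A*(X_p + v*eta^T) simplifies to B for every eta
X = X_p + v * eta.T
assert (A * X).expand() == B
```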
Recall that when we multiply matrices $A$ and $B$, each column of $AB$ is a linear combination of the columns of $A$, with weights given by the entries in the corresponding column of $B$.
In this case, you want the first column of $X$ to say, "ignore the first two columns of $A$, and take $1$ times the third." That column should be $\left[\begin{matrix} 0 \\ 0 \\ 1 \end{matrix}\right]$. Can you take it from there?
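To confirm that choice, here is a quick SymPy sketch (reusing the matrices defined in the first answer) checking that $1$ times the third column of $A$ reproduces the first column of $B$:

```python
from sympy import Matrix

A = Matrix([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
B = Matrix([[3, 2, 1], [6, 5, 4], [9, 8, 7]])

# weights (0, 0, 1) pick out A's third column,
# which equals B's first column
x1 = Matrix([0, 0, 1])
assert A * x1 == B.col(0)
```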
[EDIT]
This solution won't be unique, since, as we've noted, the columns of $A$ are linearly dependent, so there's more than one way to get column 3.
[EDIT]
To obtain a general solution, we can row-reduce the augmented matrix $[A \mid B]$ and see what happens:
$$\begin{aligned}\left[\begin{array}{ccc|ccc}1 & 2 & 3 & 3 & 2 & 1 \\ 4 & 5 & 6 & 6 & 5 & 4 \\ 7 & 8 & 9 & 9 & 8 & 7\end{array}\right] &\sim \left[\begin{array}{ccc|ccc}1 & 2 & 3 & 3 & 2 & 1 \\ 4 & 5 & 6 & 6 & 5 & 4 \\ 0 & 0 & 0 & 0 & 0 & 0\end{array}\right] \\ &\sim \left[\begin{array}{ccc|ccc}1 & 2 & 3 & 3 & 2 & 1 \\ 0 & 1 & 2 & 2 & 1 & 0 \\ 0 & 0 & 0 & 0 & 0 & 0\end{array}\right] \\ &\sim \left[\begin{array}{ccc|ccc}1 & 0 & -1 & -1 & 0 & 1 \\ 0 & 1 & 2 & 2 & 1 & 0 \\ 0 & 0 & 0 & 0 & 0 & 0\end{array}\right]\end{aligned}$$
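The same reduction can be reproduced with SymPy's `rref` (a sketch; `row_join` builds the augmented matrix):

```python
from sympy import Matrix

A = Matrix([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
B = Matrix([[3, 2, 1], [6, 5, 4], [9, 8, 7]])

# reduced row echelon form of the augmented matrix [A | B]
R, pivots = A.row_join(B).rref()
print(R)       # matches the final array in the reduction above
print(pivots)  # (0, 1): only the first two columns of A are pivot columns
```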
Now we just have to interpret this result. Each column of $X$ contributes one free variable. Call the first one $t_1$. Then the first column of $X$ is $\left[\begin{matrix}-1+t_1 \\ 2- 2t_1 \\ t_1 \end{matrix}\right]$. Similarly, with free variable $t_2$, the second column is $\left[\begin{matrix}t_2 \\ 1- 2t_2 \\ t_2 \end{matrix}\right]$.
Do you see how this is working?
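The two columns derived above can be checked symbolically; here is a SymPy sketch with free parameters `t1` and `t2`:

```python
from sympy import Matrix, symbols

A = Matrix([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
B = Matrix([[3, 2, 1], [6, 5, 4], [9, 8, 7]])

t1, t2 = symbols('t1 t2')

# parametrized first and second columns of X from the row reduction
col1 = Matrix([-1 + t1, 2 - 2*t1, t1])
col2 = Matrix([t2, 1 - 2*t2, t2])

# both satisfy the system for every value of the free parameter
assert (A * col1).expand() == B.col(0)
assert (A * col2).expand() == B.col(1)
```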