I understand that if the columns of an $n \times n$ square matrix $A$ are linearly independent, then for each $i$ we can construct a vector $x$ such that $Ax = e_i$, where $e_i$ is the $i$-th standard basis vector. Thus we can construct a matrix $X$ such that $A X = [e_1, e_2, \cdots, e_n] = I$. However, to prove that the matrix $A$ is invertible, we also need $X A = I$, and I don't know how to prove $X A = I$. Thanks.
===============
To be more specific, I want to prove the following claim. Suppose $A = (a_{ji})$ is an $n \times n$ matrix, $X = (x_{ik})$ is a matrix of the same shape, and we know that
$ \sum_{i = 1}^n a_{ji} x_{ik} = \begin{cases} 1 & \text{if } j = k \\ 0 & \text{if } j \neq k \end{cases} $
And we are going to show that
$ \sum_{i = 1}^n x_{ji} a_{ik} = \begin{cases} 1 & \text{if } j = k \\ 0 & \text{if } j \neq k \end{cases} $
Any idea about how to prove this?
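As a numeric sanity check (not a proof), we can carry out exactly the construction from the question with NumPy: build $X$ column by column by solving $Ax_i = e_i$, and observe that $XA = I$ also holds. The seed and dimension below are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
# A generic random matrix has linearly independent columns (almost surely).
A = rng.standard_normal((n, n))

# Solve A x_i = e_i for each standard basis vector e_i, stack into X.
X = np.column_stack([np.linalg.solve(A, e) for e in np.eye(n)])

assert np.allclose(A @ X, np.eye(n))  # AX = I holds by construction
assert np.allclose(X @ A, np.eye(n))  # XA = I holds as well
```

Of course this only illustrates the claim for one matrix; the question is why it must hold in general.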
We know that a linear map $L : \mathbb{R}^n \to \mathbb{R}^n$ is invertible if and only if $\text{Ker}(L) = \{0\}$. Let $L(v) = Av$, where $v = \begin{pmatrix} x_1 \\ \vdots \\ x_n \end{pmatrix}$.
Let $E_k$ be the $k$-th column of $A$. Then
$Av = x_1 E_1 + \cdots + x_n E_n$,
so $Av = 0$ forces $x_1 = \cdots = x_n = 0$ (that is, $v = 0$) precisely when $E_1, \ldots, E_n$ are linearly independent. Hence linear independence of the columns gives $\text{Ker}(L) = \{0\}$, and $L$ is invertible as a linear map.
You can prove the other direction in a similar way.
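For completeness, here is one standard way (a sketch, not the only route) to get $XA = I$ directly from $AX = I$, reusing the construction from the question:

```latex
% From AX = I it follows that X is injective:
% Xv = 0 implies v = (AX)v = A(Xv) = 0.
% So the columns of X are linearly independent, and the construction
% from the question applied to X gives some Y with XY = I. Then
\[
  A \;=\; A(XY) \;=\; (AX)Y \;=\; IY \;=\; Y,
\]
% and therefore XA = XY = I.
```

The key point is that a right inverse of $X$ must equal $A$ itself, which turns $X$'s right inverse into $A$'s left inverse.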