What happens to the dependence of the columns of a matrix once it is manipulated?


Let's say we have a square ($n \times n$) matrix $A$, and $Ax = B$ has exactly one solution (where $B \in \mathbb{R}^n$).

Since $Ax = B$ has only one solution, we can tell that the columns of $A$ are linearly independent (there is a pivot in every column). But what happens to this independence when we transpose $A$ (noting that $A$ remains a square matrix)?

Further, looking at another kind of manipulation: what would happen to the dependence of the columns if we squared $A$?

I'm trying to really grasp how the dependence of the columns of a matrix is affected, so if anyone can think of other common examples, feel free to include them.

Best Answer

The columns of a square matrix $A$ are linearly independent if and only if $A$ is invertible. So any transformation that preserves invertibility won't affect the independence of the columns.
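As a quick numerical sanity check of this equivalence (a sketch using NumPy; the $3 \times 3$ matrix is an arbitrarily chosen example), column independence can be tested via the rank, and invertibility via the determinant:

```python
import numpy as np

# An arbitrary 3x3 example matrix (any invertible matrix would do here)
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 3.0],
              [1.0, 0.0, 1.0]])

n = A.shape[0]

# Columns are linearly independent iff rank(A) = n ...
full_rank = np.linalg.matrix_rank(A) == n

# ... which happens iff det(A) != 0, i.e. A is invertible
invertible = not np.isclose(np.linalg.det(A), 0.0)

print(full_rank, invertible)  # the two tests always agree
```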

For example:

  • obviously, if you replace $A$ by $PAQ$, where $P,Q$ are two invertible matrices, you will still get an invertible matrix

  • if $A$ is invertible, so are $A^t$ and $A^n$ for $n\geq 1$ (this covers both transposing and squaring).

  • a bit less obvious: for any polynomial $P$ coprime to the characteristic polynomial of $A$, $P(A)$ will be invertible.
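The first two bullet points can be checked numerically (a sketch with NumPy; the matrices $A$, $P$, $Q$ below are arbitrary invertible examples). Since $\det(PAQ) = \det P \det A \det Q$, $\det(A^t) = \det A$, and $\det(A^n) = (\det A)^n$, all three determinants stay nonzero:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 5.0]])            # arbitrary example, det(A) = -1

def is_invertible(M):
    # Nonzero determinant, up to floating-point tolerance
    return not np.isclose(np.linalg.det(M), 0.0)

P = np.array([[0.0, 1.0], [1.0, 1.0]])  # invertible, det = -1
Q = np.array([[2.0, 0.0], [1.0, 1.0]])  # invertible, det = 2

print(is_invertible(P @ A @ Q))                     # PAQ stays invertible
print(is_invertible(A.T))                           # det(A^t) = det(A)
print(is_invertible(np.linalg.matrix_power(A, 3)))  # det(A^n) = det(A)^n
```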

Proof. Since $P$ is coprime to $\chi_A$ (the characteristic polynomial of $A$), you have a Bézout relation $UP + V\chi_A = 1$. Evaluating at $A$ and applying the Cayley-Hamilton theorem yields $U(A)P(A) = I_n$, so $P(A)$ is invertible.
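The proof can be traced concretely with SymPy (a sketch; the $2 \times 2$ matrix and the polynomial $x + 3$ are arbitrary choices, and the gcd computation confirms they are coprime):

```python
import sympy as sp

x = sp.symbols('x')
A = sp.Matrix([[0, 1], [2, 1]])   # arbitrary example; chi_A = x**2 - x - 2
chi = A.charpoly(x).as_expr()

p = x + 3                         # coprime to chi_A (chi_A has roots 2 and -1)
U, V, g = sp.gcdex(p, chi, x)     # Bezout relation: U*p + V*chi = g
assert g == 1                     # gcd is 1, so p and chi_A really are coprime

def poly_at(expr, M):
    """Evaluate a polynomial in x at the square matrix M (Horner scheme)."""
    coeffs = sp.Poly(expr, x).all_coeffs()   # highest degree first
    out = sp.zeros(*M.shape)
    for c in coeffs:
        out = out * M + c * sp.eye(M.shape[0])
    return out

# Evaluating the Bezout relation at A: Cayley-Hamilton kills the V*chi_A
# term, leaving U(A)*p(A) = I, so p(A) is invertible with inverse U(A)
assert poly_at(U, A) * poly_at(p, A) == sp.eye(2)
```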