A preliminary contradiction regarding the matrix method of solving linear systems of equations


I am an amateur and rather new to matrices. Partly due to my failure to understand the concepts thoroughly, I have run into a problem that I cannot resolve despite sincere effort. The problem is likely elementary. I will lay it out as follows:

If my understanding of the matrix method of solving linear systems of equations is correct (I have only had practice with square matrices), then, as per my prescribed textbook:

If A is a singular matrix, then |A| = 0,

In this case, we calculate (adjA)B.

If (adjA)B ≠ O (the null matrix), then no solution exists and the system of equations is called inconsistent.

If (adjA)B = O (the null matrix), then the system may be either consistent or inconsistent, according to whether it has infinitely many solutions or no solution.

This, I presume to be true.
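As a sanity check on these rules, here is a small sketch in plain Python with a hand-rolled cofactor adjugate. The 2×2 matrix and the two right-hand sides are my own illustrative examples, not from the textbook:

```python
def minor(M, i, j):
    # Submatrix of M with row i and column j removed.
    return [row[:j] + row[j+1:] for k, row in enumerate(M) if k != i]

def det(M):
    # Determinant by cofactor expansion along the first row.
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det(minor(M, 0, j)) for j in range(len(M)))

def adjugate(M):
    # adj(M)[i][j] is the cofactor of entry (j, i): the transposed cofactor matrix.
    n = len(M)
    return [[(-1) ** (i + j) * det(minor(M, j, i)) for j in range(n)] for i in range(n)]

def matvec(M, v):
    # Matrix-vector product M*v.
    return [sum(M[i][k] * v[k] for k in range(len(v))) for i in range(len(M))]

A = [[1, 2], [2, 4]]                 # singular: |A| = 1*4 - 2*2 = 0
assert det(A) == 0

# x + 2y = 3, 2x + 4y = 7: the two equations contradict each other.
print(matvec(adjugate(A), [3, 7]))   # -> [-2, 1], nonzero: inconsistent, no solution

# x + 2y = 3, 2x + 4y = 6: the second equation is twice the first.
print(matvec(adjugate(A), [3, 6]))   # -> [0, 0], zero: infinitely many solutions here
```

Both singular cases come out exactly as the textbook rules predict.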

However, I have seemingly managed to reach a spurious result that I elaborate below:

For any square matrix A,

A(adjA) = (adjA)A = |A|I (Theorem)
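This theorem is easy to verify numerically; a minimal sketch in plain Python (the 3×3 matrix is an arbitrary example of my own choosing):

```python
def minor(M, i, j):
    # Submatrix of M with row i and column j removed.
    return [row[:j] + row[j+1:] for k, row in enumerate(M) if k != i]

def det(M):
    # Determinant by cofactor expansion along the first row.
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det(minor(M, 0, j)) for j in range(len(M)))

def adjugate(M):
    # adj(M)[i][j] is the cofactor of entry (j, i): the transposed cofactor matrix.
    n = len(M)
    return [[(-1) ** (i + j) * det(minor(M, j, i)) for j in range(n)] for i in range(n)]

def matmul(A, B):
    # Matrix product A*B for square matrices.
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

A = [[2, 1, 0], [1, 3, 1], [0, 1, 2]]
d = det(A)                           # 8 for this matrix
n = len(A)
dI = [[d if i == j else 0 for j in range(n)] for i in range(n)]   # |A|*I

assert matmul(A, adjugate(A)) == dI  # A(adjA) = |A|I
assert matmul(adjugate(A), A) == dI  # (adjA)A = |A|I
```

The integer arithmetic here is exact, so the equalities hold without any floating-point tolerance.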

Now, let AX = B and |A| = 0.

Now, here let (adjA)B ≠ O.

Premultiplying by A on both sides,

A(adjA)B ≠ AO

A(adjA)B ≠ O

But A(adjA) = |A|I (from the Theorem). Therefore,

|A|IB ≠ O, but |A| = 0, so

O ≠ O

This is impossible. Hence by contradiction, (adjA)B must always be equal to O.

This conclusion is wrong; there are many counterexamples. Can anyone point out the mistake (however elementary) in my pseudo-proof? Thanks.
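For concreteness, one such counterexample, sketched in plain Python with a hand-rolled cofactor adjugate (the 2×2 matrix is my own): a singular A and a B with (adjA)B ≠ O. Note, in passing, that multiplying this nonzero vector by the singular A nevertheless gives the zero vector:

```python
def minor(M, i, j):
    # Submatrix of M with row i and column j removed.
    return [row[:j] + row[j+1:] for k, row in enumerate(M) if k != i]

def det(M):
    # Determinant by cofactor expansion along the first row.
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det(minor(M, 0, j)) for j in range(len(M)))

def adjugate(M):
    # adj(M)[i][j] is the cofactor of entry (j, i): the transposed cofactor matrix.
    n = len(M)
    return [[(-1) ** (i + j) * det(minor(M, j, i)) for j in range(n)] for i in range(n)]

def matvec(M, v):
    # Matrix-vector product M*v.
    return [sum(M[i][k] * v[k] for k in range(len(v))) for i in range(len(M))]

A = [[1, 2], [2, 4]]        # |A| = 0, so A is singular
B = [3, 7]

v = matvec(adjugate(A), B)
print(v)                    # -> [-2, 1]: (adjA)B != O, contradicting the pseudo-proof
print(matvec(A, v))         # -> [0, 0]: yet A times this nonzero vector is O
```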