How is the determinant related to the inverse of matrix?

Whenever I needed to find the inverse of a matrix, I was told to check that its determinant is not zero. However, I once applied the Gauss-Jordan method directly to a matrix whose determinant was zero. The inverse matrix I got looked as normal as any other (assuming I made no mistake).

I want to know how the determinant of a matrix is related to its inverse, and why the inverse doesn't exist if the determinant is zero. What exactly is an inverse?

There are 3 best solutions below

The inverse of a matrix exists if and only if the determinant is non-zero. You probably made a mistake somewhere when you applied the Gauss-Jordan method.

One of the defining properties of the determinant function is that if the rows of an $n\times n$ matrix are not linearly independent, then its determinant must equal zero. In fact, the determinant function is constructed from several desired properties (one of them being that if the rows are dependent, the determinant is zero).

To answer your question of what exactly an inverse is: if $A$ is an $n\times n$ matrix, then its inverse is a matrix $B$ such that $AB = BA = I$. Such a $B$ doesn't necessarily exist, though.
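To see concretely why Gauss-Jordan must break down when the determinant is zero, here is a sketch in Python (exact arithmetic via `fractions`, with two made-up $2\times 2$ examples): elimination gets stuck when some column has no nonzero pivot left, which happens exactly when the rows are dependent.

```python
from fractions import Fraction

def gauss_jordan_inverse(a):
    """Try to invert a square matrix by Gauss-Jordan elimination on [A | I].
    Returns the inverse, or None if some column has no nonzero pivot
    (i.e. the matrix is singular)."""
    n = len(a)
    # build the augmented matrix [A | I] over the rationals
    m = [[Fraction(x) for x in row] + [Fraction(i == j) for j in range(n)]
         for i, row in enumerate(a)]
    for col in range(n):
        # find a row at or below the diagonal with a nonzero entry in this column
        pivot = next((r for r in range(col, n) if m[r][col] != 0), None)
        if pivot is None:
            return None  # singular: elimination cannot continue
        m[col], m[pivot] = m[pivot], m[col]
        p = m[col][col]
        m[col] = [x / p for x in m[col]]          # scale pivot row to 1
        for r in range(n):
            if r != col and m[r][col] != 0:        # clear the rest of the column
                f = m[r][col]
                m[r] = [x - f * y for x, y in zip(m[r], m[col])]
    return [row[n:] for row in m]                  # right half is now A^{-1}

print(gauss_jordan_inverse([[2, 1], [1, 1]]))  # det = 1: [[1, -1], [-1, 2]]
print(gauss_jordan_inverse([[1, 2], [2, 4]]))  # det = 0, rows dependent: None
```

The "pretty normal-looking" inverse you got is almost certainly the result of a slip in one of the row operations; done exactly, the method has to hit a zero pivot column.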

It holds that $\det(AB)=\det(A)\det(B)$, so that $\det(A)\det(A^{-1})=1$. In other words, an invertible matrix has (multiplicatively) invertible determinant. (If you work over a field, this means just that the determinant is non-zero.)
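The multiplicativity $\det(AB)=\det(A)\det(B)$ is easy to check numerically; here is a tiny Python sketch with a made-up pair of $2\times 2$ matrices:

```python
# 2x2 determinant and matrix product, checking det(AB) = det(A)*det(B)
def det2(m):
    return m[0][0]*m[1][1] - m[0][1]*m[1][0]

def matmul2(a, b):
    return [[sum(a[i][k]*b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[2, 3], [1, 4]]   # det = 5
B = [[0, 1], [5, 2]]   # det = -5
print(det2(matmul2(A, B)), det2(A) * det2(B))  # both are -25
```

Applying this with $B=A^{-1}$ gives $\det(A)\det(A^{-1})=\det(I)=1$, which is the identity used above.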

On the other hand, if the determinant is invertible, then so is the matrix itself because of the relation to its adjugate.

Question: "I want to know how the determinant of a matrix is related to its inverse, and why the inverse doesn't exist if the determinant is zero. What exactly is an inverse?"

Answer: You may use the Cayley-Hamilton theorem to prove that an $n\times n$ matrix $A:=(a_{i,j})\in M_n(k)$, with $k$ any commutative unital ring, is invertible iff $det(A)\in k^*$ is a unit.

The Cayley-Hamilton theorem says the following: Let $I(n)$ be the $n\times n$-identity matrix and let

$$p_A(t):=det(tI(n)-A)\in k[t].$$

It follows $p_A(t)$ is a monic polynomial

$$p_A(t)=t^n-tr(A)t^{n-1} + \cdots +(-1)^n det(A)$$

with coefficients in $k$ and $(-1)^ndet(A)$ as "constant term". The Cayley-Hamilton theorem says that

$$p_A(A):=A^n-tr(A)A^{n-1}+\cdots + (-1)^ndet(A)I(n)=$$

$$A^n+a_{n-1}A^{n-1}+\cdots + a_1A +(-1)^n det(A)I(n)=(0)$$

is the zero matrix $(0)$. From this we get an explicit formula for the adjugate matrix, $adj(A)$ in terms of the coefficients $a_i$ of $p_A(t)$ and $A$: We get

$$A(A^{n-1}+a_{n-1}A^{n-2}+\cdots +a_1I(n))=Aadj^*(A)=adj^*(A)A=(-1)^{n+1}det(A)I(n),$$

since $A$ commutes with each of its powers $A^j$.

Here we define

$$adj^*(A):=A^{n-1}+a_{n-1}A^{n-2}+\cdots +a_1I(n),$$

and we may define

$$adj(A):=(-1)^{n+1}adj^*(A).$$

From this it follows

$$Aadj(A)=adj(A)A=det(A)I(n).$$
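For $n=2$ the characteristic polynomial is $p_A(t)=t^2-tr(A)t+det(A)$, so $adj^*(A)=A-tr(A)I(2)$ and $adj(A)=(-1)^3 adj^*(A)=tr(A)I(2)-A$. A Python sketch with a made-up integer matrix, verifying $A\,adj(A)=det(A)I(2)$:

```python
# made-up 2x2 example; for n = 2 the Cayley-Hamilton construction gives
# adj(A) = tr(A)*I - A, and A * adj(A) should equal det(A)*I.
A = [[3, 1],
     [4, 2]]
tr = A[0][0] + A[1][1]                       # trace = 5
det = A[0][0]*A[1][1] - A[0][1]*A[1][0]      # determinant = 2

adj = [[tr - A[0][0], -A[0][1]],
       [-A[1][0], tr - A[1][1]]]             # tr(A)*I - A = [[2, -1], [-4, 3]]

# multiply A * adj and compare with det(A)*I
prod = [[sum(A[i][k]*adj[k][j] for k in range(2)) for j in range(2)]
        for i in range(2)]
print(prod)  # [[2, 0], [0, 2]], i.e. det(A)*I
```

Note that no division happens anywhere here, which is why the identity holds over any commutative unital ring, not just over a field.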

The determinant $det(-)$ has the property that $det(AB)=det(A)det(B)$ and we get the following result:

Lemma: A matrix $A\in M_n(k)$ is invertible iff $det(A)\in k^*$ is a unit.

Proof: If $A$ is invertible there is a matrix $B$ with $AB=BA=I(n)$ hence

$$det(A)det(B)=det(AB)=det(I(n))=1\in k^*$$

and it follows $det(A)$ is a unit. Conversely if $det(A)\in k^*$ define

$$A^{-1}:=\frac{1}{det(A)}adj(A)\in M_n(k).$$

It follows $AA^{-1}=I(n)=A^{-1}A,$

hence $A$ is invertible. This gives an explicit formula for the adjugate matrix $adj(A)$ in terms of the coefficients $a_i$ of $p_A(t)$.
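Over the field $\mathbb{Q}$ the lemma's formula $A^{-1}=\frac{1}{det(A)}adj(A)$ can be checked directly; here is a sketch with the same kind of made-up $2\times 2$ example, using exact rational arithmetic:

```python
from fractions import Fraction

# made-up 2x2 matrix over Q; det(A) = 2 is a unit, so A is invertible
A = [[3, 1], [4, 2]]
det = A[0][0]*A[1][1] - A[0][1]*A[1][0]      # 2
tr = A[0][0] + A[1][1]                       # 5
adj = [[tr - A[0][0], -A[0][1]],
       [-A[1][0], tr - A[1][1]]]             # adjugate = tr(A)*I - A for n = 2

# A^{-1} = adj(A) / det(A)
Ainv = [[Fraction(x, det) for x in row] for row in adj]

# verify A * A^{-1} = I
check = [[sum(A[i][k]*Ainv[k][j] for k in range(2)) for j in range(2)]
         for i in range(2)]
print(check == [[1, 0], [0, 1]])  # True
```

Over a general commutative ring the same construction works verbatim once $det(A)^{-1}$ exists in the ring, which is exactly the condition in the lemma.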