Checking Invertibility of Square Matrix


The question is:

Which statement is equivalent to asserting that a square matrix $Z\in\mathbb{R}^{n\times n}$ is invertible?

  • That the columns of $Z$ span $\mathbb{R}^n$.
  • That $Z$ has no eigenvectors.
  • That $Z$ has no eigenvectors of eigenvalue $0$.
  • $\mathrm{ker}Z = \{\mathbf0\}$

D is true by the rank-nullity theorem and the properties of linear maps. C is true, since $0$ being an eigenvalue is equivalent to non-invertibility. B is false by extension of C.

I am unsure of A.


There are 4 best solutions below

On BEST ANSWER

You're right about your assertions. Let me just give some notes on them as well:


For (b), you might think about the identity matrix: it is certainly invertible, yet every nonzero vector is an eigenvector with eigenvalue $1$ (the zero vector is never an eigenvector).

For (c), as you rightly remarked, you have that for a matrix $Z$, $0$ is an eigenvalue of $Z$ iff $p_Z(0)=0$ iff $\det(Z-0E_n)=0$ iff $\det(Z)=0$ iff $Z$ is singular, i.e. not invertible.

For (d), the rank-nullity theorem seems to be the best way to go.
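The chain of equivalences in (c) can be sanity-checked numerically; here is a minimal NumPy sketch with an arbitrary example matrix (not taken from the question):

```python
import numpy as np

# A singular matrix: the second column is twice the first.
Z = np.array([[1.0, 2.0],
              [3.0, 6.0]])

eigenvalues = np.linalg.eigvals(Z)

# 0 is an eigenvalue  <=>  det(Z) = 0  <=>  Z is singular.
assert np.isclose(np.linalg.det(Z), 0.0)
assert np.any(np.isclose(eigenvalues, 0.0))
```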


Now for (a), note that for $Z=(z_1,\dots,z_n)$ being the column representation of $Z$, we have that if $z_1,\dots,z_n$ span $\mathbb{R}^n$, then for all $v\in\mathbb{R}^n$, there exist $x_1,\dots,x_n\in\mathbb{R}$ such that $\sum_{i=1}^nx_iz_i=v$. Thus, using matrix multiplication, for every $v\in\mathbb{R}^n$, there exists an $x\in\mathbb{R}^n$ such that $Zx=v$, since $Zx=\sum_{i=1}^nx_iz_i$ for $x=(x_1,\dots,x_n)$.

Thus, the linear map $\phi_Z(x)=Zx$ for $x\in\mathbb{R}^n$ is surjective. Now, by the rank-nullity theorem, a linear map between two spaces with the same dimension (as here $\mathbb{R}^n$) is surjective iff it is injective iff it is bijective. Note that this only holds in finite dimensions. Thus $\phi_Z$ is bijective and thus $Z$ is invertible.


To round this off, in your case of the finite-dimensional $\mathbb{R}^n$ you have: $Z$ is invertible iff the columns of $Z$ span $\mathbb{R}^n$ iff the columns of $Z$ are linearly independent. The last of these corresponds, by a similar argument to the one above, to injectivity (check this yourself), and then the same argument shows bijectivity outright.
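The equivalences above (columns span $\mathbb{R}^n$, the map $\phi_Z$ is surjective, $Z$ is invertible) can also be illustrated numerically; a small NumPy sketch with a generic random matrix (my own example, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
Z = rng.standard_normal((4, 4))  # a generic matrix is almost surely invertible

# Columns span R^4  <=>  rank(Z) = 4  <=>  Z is invertible.
assert np.linalg.matrix_rank(Z) == 4

# Surjectivity in practice: every v is hit by some x with Zx = v.
v = rng.standard_normal(4)
x = np.linalg.solve(Z, v)
assert np.allclose(Z @ x, v)
```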

On

The statements follow the Invertible Matrix Theorem. The 1st, 3rd and 4th all work, but the 2nd doesn't make much sense.

A is true because if the $n$ columns span $\mathbb{R}^{n}$, they are linearly independent, and hence $Z$ is invertible.

On

A is true. $\mathbb R^n$ is an $n$-dimensional vector space, so if $n$ vectors span $\mathbb R^n$ then these $n$ vectors are a basis of $\mathbb R^n$. Hence the columns of the matrix $Z$ span $\mathbb R^n$ iff the columns are linearly independent, iff the system of equations $Zx=0$ has only the trivial solution, iff $Z$ is invertible.
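The step "$Zx=0$ has only the trivial solution" can be checked numerically; a minimal sketch with an invertible example matrix of my own choosing:

```python
import numpy as np

Z = np.array([[2.0, 1.0],
              [1.0, 1.0]])  # an invertible example: det(Z) = 1

# ker Z = {0}  <=>  no singular value of Z is zero  <=>  Z is invertible.
singular_values = np.linalg.svd(Z, compute_uv=False)
assert np.all(singular_values > 1e-12)

# Consequently Zx = 0 has only the trivial solution x = 0.
x = np.linalg.solve(Z, np.zeros(2))
assert np.allclose(x, 0.0)
```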

On

We have

  • That the columns of Z span R^n

Yes, indeed: in that case the matrix is full rank and hence invertible.

  • That Z has no eigenvectors

This point does not make sense.

  • That Z has no eigenvectors of eigenvalue = 0

If $Z$ did have an eigenvector of eigenvalue $0$, it would not be invertible, since $\det(Z)=\prod_i\lambda_i=0$; so ruling this out is equivalent to invertibility.

  • Kernel of Z = {0}

Yes, indeed: that implies the columns are independent ($Z$ is full rank).
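The determinant-as-product-of-eigenvalues fact used above can be verified numerically; a small sketch with an arbitrary example matrix (not from the question):

```python
import numpy as np

Z = np.array([[4.0, 1.0],
              [2.0, 3.0]])  # eigenvalues 2 and 5

eigs = np.linalg.eigvals(Z)

# det(Z) = product of eigenvalues, so Z is invertible iff no eigenvalue is 0.
assert np.isclose(np.linalg.det(Z), np.prod(eigs).real)
assert not np.any(np.isclose(eigs, 0.0))
```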