Sum of the columns of a matrix and eigenvalues


This is not homework; it is self-study. I saw a similar question here. However, my question concerns the case where the columns sum to one, which means the matrix has an eigenvalue of one. Obviously, the question in the link applies here too. I was just wondering whether my solution is correct. If it is not, please suggest a hint (if possible) without revealing the answer.

Assume we do not know that the eigenvalues of $A$ and $A^t$ are the same.

Assume we know the following facts based on previous solved problems:

1) If each row of $A$ sums to $s$, then $A$ has $s$ as an eigenvalue.

2) If $T$ is the operator with matrix $A$, then the dual operator $T'(\phi) := \phi \circ T$ has matrix $A^t$.

3) Given a basis of $V'$, $\exists$ a basis of $V$ such that the given basis of $V'$ is its dual basis.
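Fact 1) is easy to sanity-check numerically. The sketch below uses a made-up $3 \times 3$ matrix whose rows each sum to $s = 5$ (the matrix and $s$ are chosen only for illustration); the all-ones vector is then an eigenvector with eigenvalue $s$:

```python
import numpy as np

# Hypothetical matrix whose rows each sum to s = 5.
s = 5.0
A = np.array([[1.0, 3.0, 1.0],
              [2.0, 2.0, 1.0],
              [0.0, 4.0, 1.0]])
assert np.allclose(A.sum(axis=1), s)

# The all-ones vector is an eigenvector: A @ ones = s * ones.
ones = np.ones(3)
print(np.allclose(A @ ones, s * ones))                # True

# And s indeed appears among the eigenvalues of A.
print(np.any(np.isclose(np.linalg.eigvals(A), s)))    # True
```

This is exactly the mechanism behind fact 1): multiplying by the all-ones vector computes the row sums.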

Consider $A^t$: since its rows sum to 1, $A^t$ has eigenvalue 1, so $\exists \psi$, a corresponding eigenvector of $A^t$. Extend $\psi$ to a basis $\psi, \phi_2, \dots, \phi_n$ of $V'$. Thus there exists a basis $v_{\psi}, v_2, \dots, v_n$ of $V$ such that $\psi, \phi_2, \dots, \phi_n$ is its (ordered) dual basis.

Thus, using the definition of $T'$, the eigenvector property $T'(\psi) = \psi$, and the definition of the dual basis: $(\psi \circ T)(v_{\psi}) = T'(\psi)(v_{\psi}) = \psi(v_{\psi}) = 1$.

We seek to show that the definition of the dual basis, together with $\psi(T(v_{\psi})) = 1$, implies $T(v_{\psi}) = v_{\psi}$.

Assume $T(v_{\psi}) = w \neq v_{\psi}$, that $w$ is linearly independent of $v_{\psi}$, and that $\psi(w) = 1$. Then $T$ doesn't annihilate $span(v_{\psi}, w)$, but does annihilate $v_2, \dots, v_n$. However, $w, v_{\psi}, v_2, \dots, v_n$ can't be linearly independent, since the dimension of $V$ is $n$. Since $w$ is linearly independent of $v_{\psi}$, we can conclude that $w \in span(v_2, \dots, v_n)$, and thus $\psi(w) = 0$, a contradiction.

$\implies T(v_{\psi}) = v_{\psi}$

■.

Something feels slightly off with this proof, especially the conclusion that $T(v_{\psi}) = v_{\psi}$.

Updated Attempt (with help from the 'simple example' in the comments)

Assume just bullet points 1) and 2) above:

Since the columns of $A$ sum to 1, the rows of $A^t$ sum to 1, so by fact 1) $A^t$, and hence $T'$, has eigenvalue 1. Thus $\exists \psi \in V'$ that is an eigenvector of $T'$ with eigenvalue 1.

Thus, for all $v \in V$: $T'(\psi)(v) = \psi(v) \implies (\psi \circ T)(v) = \psi(v)$ (definition of $T'$)

4) $\psi(T(v)) - \psi(v) = 0$

5) $\psi(T(v) - v) = 0 $ (linearity)

6) $\psi((T - I_V)v) = 0$ (distributive property of linear maps)
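Steps 4)–6) can be sanity-checked numerically (a sketch, not part of the proof; the $2 \times 2$ matrix below is made up, with columns summing to 1 so that the all-ones functional serves as $\psi$):

```python
import numpy as np

# A hypothetical matrix whose columns each sum to 1; then the
# all-ones functional psi satisfies A^t psi = psi.
A = np.array([[0.2, 0.7],
              [0.8, 0.3]])
psi = np.ones(2)
assert np.allclose(A.T @ psi, psi)   # psi is an eigenvector of A^t

# Check psi((T - I_V)v) = 0, i.e. psi(T(v)) = psi(v), on random v.
rng = np.random.default_rng(42)
for _ in range(5):
    v = rng.standard_normal(2)
    assert np.isclose(psi @ (A @ v) - psi @ v, 0.0)
print("psi((A - I)v) = 0 for all sampled v")
```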

In the non-trivial case $\dim V > 1$, $\psi$ is not injective, since it is a map from $V$ onto $F$ with $\dim_F V \gt \dim_F F = 1$, where $F$ is the base field. Thus $null(\psi) \neq \{0\}$.

For "6)" to hold for all $v$, we need $range(T-I_V) \subseteq null(\psi)$. Thus $dim(range(T-I_V)) \le dim(null(\psi)) \lt dim(V)$ (since $\psi \neq 0$).

$dim(range(T-I_V)) \lt dim(V) \implies T-I_V$ is not surjective, hence (in finite dimensions) not injective, and thus 1 is an eigenvalue of $T$. $\Box$

To cover the trivial case: if $dim(V) = dim(F) = 1$, then $\psi$ is injective, thus $null(\psi) = \{0\} \implies T - I_V = 0 \implies T - I_V$ is not surjective (it is the zero map on a nonzero space), so again 1 is an eigenvalue of $T$.
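The overall claim can also be checked numerically on a random matrix whose columns are normalized to sum to 1 (again a sketch, not part of the proof):

```python
import numpy as np

# Random matrix A with each column normalized to sum to 1.
rng = np.random.default_rng(0)
A = rng.random((4, 4))
A = A / A.sum(axis=0)
assert np.allclose(A.sum(axis=0), 1.0)

# The all-ones functional psi is an eigenvector of A^t with eigenvalue 1.
psi = np.ones(4)
print(np.allclose(A.T @ psi, psi))                      # True

# Hence A - I is singular, i.e. not surjective ...
print(np.isclose(np.linalg.det(A - np.eye(4)), 0.0))    # True

# ... so 1 is an eigenvalue of A.
print(np.any(np.isclose(np.linalg.eigvals(A), 1.0)))    # True
```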

Verification on Simple Example

$A = [[0, -1], [1, 2]]$ (listed by rows, so each column sums to 1), $\psi = [1, 1]$, where $A$ is the matrix for $T$ and $A^t$ is the matrix for $T'$.

Let $\phi = [1, 0]$, so that $\psi, \phi$ form a basis for $V'$. Thus we can find a basis of $V$ to which this is dual: $v_{\psi} = [0, 1]^t$, $v_{\phi} = [1, -1]^t$.

$A^t\psi^t = \psi^t \implies \psi$ is an eigenvector of $A^t$ as desired

Following the conclusion of the argument above:

$S = A - I = [[-1, -1], [1, 1]]$; consider $S$'s action on the basis $v_{\psi}, v_{\phi}$ of $V$.

$Sv_{\phi} = [0,0]^t \implies \psi([0, 0]^t) = 0$

$Sv_{\psi} = [-1, 1]^t \implies \psi([-1, 1]^t) = 0$

Since $v_{\psi}, v_{\phi}$ constitute a basis for $V$, $range(S) = range(A-I) \subseteq null(\psi)$

So $S$ is not surjective, and thus $A$ has an eigenvalue of 1.
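For completeness, the worked example can be reproduced numerically with the same $A$, $\psi$, and dual basis as above:

```python
import numpy as np

# The simple example: columns of A sum to 1.
A = np.array([[0.0, -1.0],
              [1.0,  2.0]])
psi = np.array([1.0, 1.0])

# psi is an eigenvector of A^t with eigenvalue 1.
print(np.allclose(A.T @ psi, psi))            # True

# S = A - I sends both dual-basis vectors into null(psi).
S = A - np.eye(2)
v_psi = np.array([0.0, 1.0])
v_phi = np.array([1.0, -1.0])
print(np.allclose(S @ v_phi, 0.0))            # True: S v_phi = 0
print(np.isclose(psi @ (S @ v_psi), 0.0))     # True: S v_psi in null(psi)

# Hence S is singular and 1 is an eigenvalue of A
# (in fact a double eigenvalue here, since tr(A) = 2 and det(A) = 1).
print(np.allclose(np.linalg.eigvals(A), 1.0))  # True
```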

Is the updated attempt above a correct proof? If not, please continue to only give hints, if possible.