Spectral radius of a linear operator whose iterates are linear combinations


Let $(X,||\cdot||)$ be a real Banach space, let $A_0,A_1:X\to\mathbb R$ be bounded linear functionals and let $v,w\in X$ be fixed. Then for the spectral radius $\mathfrak{R}$ of the operator $A:X\to X$, defined by \begin{align} Ax=vA_0x+wA_1x\quad\text{for } x\in X, \end{align} I was (hopefully) able to prove the relationship \begin{align} \mathfrak{R}(A)\leq\mathfrak{R} \begin{pmatrix} A_0v & A_1v \\ A_0w & A_1w \end{pmatrix}. \end{align} My question is: Do we have equality?

In order to prove the above inequality, I showed that $$ A^{n+1}x=va_{n+1}x+w b_{n+1}x=v(a_{n}vA_0x+a_{n}wA_1x)+w(b_{n}vA_0x+b_{n}wA_1x), $$ where $a_n$ and $b_n$ are linear functionals on $X$ satisfying the recurrences \begin{align} a_{n+1}x=a_nvA_0x+a_nwA_1x &\quad\text{and}\quad a_1x:=A_0x, \\ b_{n+1}x=b_nvA_0x+b_nwA_1x &\quad\text{and}\quad b_1x:=A_1x. \end{align} Then I estimated the norm of the iterates $A^n$ by the row-sum norm of the matrix with entries $A_0v$, $A_1v$, $A_0w$, $A_1w$ and applied Gelfand's formula.
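As a finite-dimensional sanity check (taking $X=\mathbb R^3$ as an assumed toy model, with $A_0,A_1$ given by row vectors `a0`, `a1`), one can compare the spectral radius of $A$ with that of the $2\times 2$ matrix from the question numerically; the inequality even appears to be an equality:

```python
import numpy as np

# Toy model of the setup: A x = v (A_0 x) + w (A_1 x) on X = R^3,
# with the functionals A_0, A_1 represented by row vectors a0, a1.
rng = np.random.default_rng(0)
v, w = rng.standard_normal(3), rng.standard_normal(3)
a0, a1 = rng.standard_normal(3), rng.standard_normal(3)

A = np.outer(v, a0) + np.outer(w, a1)      # rank-(at most)-two operator
M = np.array([[a0 @ v, a1 @ v],
              [a0 @ w, a1 @ w]])           # the 2x2 matrix from the question

rho_A = max(abs(np.linalg.eigvals(A)))
rho_M = max(abs(np.linalg.eigvals(M)))

# Gelfand: ||A^n||^{1/n} -> rho(A); numerically the bound rho(A) <= rho(M)
# is attained with equality.
assert np.isclose(rho_A, rho_M)
```

This only probes random instances in $\mathbb R^3$, of course, but it is consistent with the equality proved in the answer below.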

Any help is highly appreciated. Thanks in advance!

1 Answer (accepted)

Let me call your matrix $B$, i.e. $$ B=\begin{bmatrix} A_0v&A_0w\\ A_1v&A_1w\end{bmatrix} $$ (this is the transpose of the matrix in the question, so it has the same spectrum). Then $B$ has precisely the same nonzero eigenvalues as $A$ (that is, $\sigma(A)=\{0\}\cup\sigma(B)$); in particular, the two have the same spectral radius.

Assume first that $v,w$ are linearly independent. Since $A$ has rank at most two, every nonzero element of its spectrum is an eigenvalue. Let $\lambda\in\sigma(A)$ be nonzero. Then there exists a nonzero $x\in X$ with $$\lambda x=Ax=(A_0x)v+(A_1x)w.$$ As $\lambda\ne0$, we have $x=\lambda^{-1}Ax$, which lies in the range of $A$; hence $x=\alpha v+\beta w$ for some $\alpha,\beta\in\mathbb C$. Then \begin{align} \lambda\alpha v+ \lambda \beta w &=A(\alpha v+\beta w)=\alpha Av+\beta Aw\\[0.3cm] &=\alpha((A_0v)v+(A_1v)w)+\beta((A_0w)v+(A_1w)w)\\[0.3cm] &=(\alpha A_0v+ \beta A_0w)v+ (\alpha A_1v+\beta A_1w)w. \end{align} As $v,w$ are linearly independent, we get the two equalities $$ \lambda \alpha=\alpha A_0v+ \beta A_0w,\ \ \lambda\beta=\alpha A_1v+\beta A_1w. $$ We can rewrite the two equalities as $$ \lambda\begin{bmatrix} \alpha\\ \beta\end{bmatrix}=\begin{bmatrix} A_0v&A_0w\\ A_1v&A_1w\end{bmatrix}\begin{bmatrix} \alpha\\ \beta\end{bmatrix}. $$ So $\lambda$ is an eigenvalue of $B$. By reversing the procedure, we can also show that if $\lambda$ is a nonzero eigenvalue of $B$, then it is an eigenvalue of $A$.
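This correspondence can be mirrored numerically in a small assumed model ($X=\mathbb R^4$, random `v`, `w`, `a0`, `a1`, which are generically linearly independent): for every nonzero eigenpair $(\lambda,x)$ of $A$, the coordinates of $x$ in the basis $\{v,w\}$ form an eigenvector of $B$ for the same $\lambda$.

```python
import numpy as np

# Mirror the proof for linearly independent v, w (X = R^4 as a toy model):
# every nonzero eigenpair (lam, x) of A satisfies x = alpha*v + beta*w,
# and (alpha, beta) is an eigenvector of B for the same lam.
rng = np.random.default_rng(1)
v, w = rng.standard_normal(4), rng.standard_normal(4)   # generically independent
a0, a1 = rng.standard_normal(4), rng.standard_normal(4)

A = np.outer(v, a0) + np.outer(w, a1)
B = np.array([[a0 @ v, a0 @ w],
              [a1 @ v, a1 @ w]])                        # the matrix B above

V = np.column_stack([v, w]).astype(complex)             # basis of range(A)
lams, xs = np.linalg.eig(A)
for lam, x in zip(lams, xs.T):
    if abs(lam) < 1e-8:                                 # skip the zero part of sigma(A)
        continue
    coeffs, *_ = np.linalg.lstsq(V, x, rcond=None)      # x = alpha*v + beta*w
    assert np.allclose(B @ coeffs, lam * coeffs)        # B [alpha, beta]^T = lam [alpha, beta]^T
```

Since $A$ has rank two here, two of its four eigenvalues are (numerically) zero and are skipped; the remaining two match the spectrum of $B$.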

When $v,w$ are not linearly independent, we have $w=\gamma v$ for some $\gamma\in \mathbb C$. Then $$ Ax=(A_0x) v+ \gamma(A_1x)v=(A_0x+\gamma A_1x) v. $$ So $A$ is rank-one. If $\lambda\ne0$ and $Ax=\lambda x$ for some nonzero $x$, then $x=\alpha v$ for some nonzero $\alpha\in\mathbb C$. Then $$ \lambda\alpha v=\lambda x=Ax=\alpha Av=\alpha(A_0v+\gamma A_1v)v. $$ Thus $$ \lambda\alpha = \alpha(A_0v+\gamma A_1v). $$ So $$ B^T\begin{bmatrix} \alpha\\ \gamma\alpha\end{bmatrix} =\begin{bmatrix}A_0 v& A_1v\\ \gamma A_0v&\gamma A_1v \end{bmatrix} \begin{bmatrix} \alpha\\ \gamma\alpha\end{bmatrix} =\begin{bmatrix} \lambda\alpha\\ \lambda\gamma\alpha \end{bmatrix} =\lambda \begin{bmatrix} \alpha\\ \gamma\alpha \end{bmatrix} , $$ showing that $\lambda$ is an eigenvalue of $B^T$ (and thus of $B$). Again we can reverse the steps, and we again get $\sigma(A)=\{0\}\cup\sigma(B)$.
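The degenerate case can be checked numerically as well (again in an assumed $\mathbb R^3$ model, with an arbitrary choice $\gamma=1.7$): the single nonzero eigenvalue of the rank-one operator $A$ is $A_0v+\gamma A_1v$, and it also appears in the spectrum of $B$.

```python
import numpy as np

# Degenerate case w = gamma*v (rank-one A), toy model on R^3.
rng = np.random.default_rng(2)
v = rng.standard_normal(3)
gamma = 1.7                                             # arbitrary choice
w = gamma * v
a0, a1 = rng.standard_normal(3), rng.standard_normal(3)

A = np.outer(v, a0) + np.outer(w, a1)                   # = v (a0 + gamma*a1)^T, rank one
B = np.array([[a0 @ v, a0 @ w],
              [a1 @ v, a1 @ w]])

# The single nonzero eigenvalue of A is A_0 v + gamma * A_1 v,
# and it is also an eigenvalue of B (via B^T, as in the argument above).
lam = a0 @ v + gamma * (a1 @ v)
eig_A = np.linalg.eigvals(A)
eig_B = np.linalg.eigvals(B)
assert np.isclose(max(abs(eig_A)), abs(lam))
assert any(np.isclose(mu, lam) for mu in eig_B)
```

Here $B$ is also rank-one, with eigenvalues $0$ and $\operatorname{tr}B=A_0v+\gamma A_1v$, matching the computation above.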