I'm trying to find the leading eigenvalue and corresponding left and right eigenvectors of the following infinite matrix, for $\lambda>0$:
$$ \mathrm{A}=\left( \begin{array}{cccccc} 1 &e^{-\lambda} & 0 &0 &0 & \dots\\ 1 &e^{-\lambda} & e^{-2\lambda} &0 &0 & \dots\\ 1 &e^{-\lambda} & e^{-2\lambda} &e^{-3\lambda} &0 & \dots\\ \vdots & \vdots & \vdots & & \ddots \end{array} \right) $$
Note that there are terms above the main diagonal.
I know that infinite matrices aren't a fully rigorous concept in general, but from computing the problem numerically with $n\times n$ truncations using power iteration, it looks like it converges in the limit of infinite $n$. The convergence is slower for smaller values of $\lambda$, but it appears to converge for all $\lambda>0$.
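For reference, here is a minimal sketch of the kind of numerical experiment I mean: plain power iteration on the $n\times n$ truncation of $\mathrm{A}$ (the truncation sizes and $\lambda=1$ below are just example values):

```python
import numpy as np

def truncated_A(n, lam):
    """n x n truncation of A: row i (0-indexed) contains
    e^{-j*lam} in columns j = 0..i+1 and zeros beyond."""
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(min(i + 2, n)):
            A[i, j] = np.exp(-j * lam)
    return A

def leading_eigenpair(A, iters=5000, tol=1e-13):
    """Plain power iteration; adequate here since the leading
    eigenvalue of this nonnegative matrix is real and positive."""
    x = np.ones(A.shape[0])
    eta = 0.0
    for _ in range(iters):
        y = A @ x
        eta_new = np.linalg.norm(y)
        x = y / eta_new
        if abs(eta_new - eta) < tol:
            break
        eta = eta_new
    return eta, x

lam = 1.0
for n in (10, 20, 40, 80):
    eta, _ = leading_eigenpair(truncated_A(n, lam))
    print(n, eta)  # eta appears to settle as n grows
```

(For $\lambda=1$ the leading eigenvalue of the truncation lies between the smallest and largest row sums, i.e. between $1+e^{-1}$ and $1/(1-e^{-1})$, and it stabilizes quickly in $n$.)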
Note that I only care about the leading eigenvalue, i.e. the one with the largest magnitude, which should be real and positive. Its corresponding eigenvectors should have only positive entries, due to the Perron-Frobenius theorem.
Alternatively, if it's easier, a solution for the following matrix will be just as useful to me: $$ \mathrm{B}=\left( \begin{array}{cccccc} 1 & 1& 0 &0 &0 & \dots\\ e^{-\lambda} &e^{-\lambda} & e^{-\lambda} &0 &0 & \dots\\ e^{-2\lambda} & e^{-2\lambda} &e^{-2\lambda} &e^{-2\lambda} &0 & \dots\\ \vdots & \vdots & \vdots & & \ddots \end{array} \right) $$
Again, note the terms above the diagonal. (The two problems are not equivalent; it's just that either one of them would help me solve a larger problem.)
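One observation from playing with the numerics: the $n\times n$ sections of $\mathrm{A}$ and $\mathrm{B}$ can both be written using the same lower-Hessenberg matrix of ones $M_n$ and the diagonal $D_n = \operatorname{diag}(1, e^{-\lambda}, e^{-2\lambda}, \dots)$, as $A_n = M_n D_n$ and $B_n = D_n M_n$, so the finite sections are similar and share their spectra (the eigenvectors differ, related by $D_n$). A quick check, with $n$ and $\lambda$ chosen arbitrarily:

```python
import numpy as np

def trunc(n, lam, which):
    # n x n sections: A[i, j] = e^{-j*lam}, B[i, j] = e^{-i*lam},
    # in both cases only for columns j <= i + 1 (0-indexed)
    if which == "A":
        return np.array([[np.exp(-j * lam) if j <= i + 1 else 0.0
                          for j in range(n)] for i in range(n)])
    return np.array([[np.exp(-i * lam) if j <= i + 1 else 0.0
                      for j in range(n)] for i in range(n)])

n, lam = 40, 0.5
# leading eigenvalue = largest real part (the Perron root is real and
# positive, and dominates every other eigenvalue in modulus)
eta_A = max(np.linalg.eigvals(trunc(n, lam, "A")).real)
eta_B = max(np.linalg.eigvals(trunc(n, lam, "B")).real)
print(eta_A, eta_B)  # agree to machine precision
```

Whether this similarity survives the limit $n\to\infty$ is a separate question, since $D_n^{-1}$ has unboundedly large entries.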
The problem is, I just don't have much of an idea how to do this. I've tried a variety of naive methods, along the lines of writing the eigenvalue equation $\mathrm{A}\mathbf{x} = \eta \mathbf{x}$ as a system of equations involving infinite sums and then trying to find $\{x_i >0\}$ and $\eta>0$ to satisfy them, but this doesn't seem to lead anywhere nice.
It could be that there is no analytical solution. Or, even worse, it could be that these matrices have unbounded spectra after all (in which case I'd really like to know!). But if anyone has any insight into how to solve either of these two problems, I'd really appreciate it.
(No promises here, because I haven't worked the algebra all the way through, but...)
Try writing the matrix equation $A\mathbf{x}= \mu \mathbf{x}$ and looking at the individual equations it gives:
$$ x_1 + e^{-\lambda}x_2 = \mu x_1 \\ x_1 + e^{-\lambda}x_2 + e^{-2 \lambda}x_3 = \mu x_2 \\x_1 + e^{-\lambda}x_2 + e^{-2 \lambda}x_3 + e^{-3 \lambda}x_4 = \mu x_3 $$ etcetera.
Notice that the left-hand side of each equation contains the entire left-hand side of the previous one, which equals $\mu x_i$. Substituting this in gives, for each $i \geq 1$, $$\mu x_i + e^{-(i+1)\lambda}x_{i+2} = \mu x_{i+1} \\ \Rightarrow \mu (x_{i+1} - x_i) = e^{-(i+1)\lambda}x_{i+2} $$
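As a sanity check on the algebra, the recurrence does hold (to rounding error) for the leading eigenpair of a finite truncation; a quick sketch, with the truncation size and $\lambda$ chosen arbitrarily:

```python
import numpy as np

# check: mu*(x_{i+1} - x_i) = e^{-(i+1)*lam} * x_{i+2}
# on the leading eigenpair of an n x n truncation of A
n, lam = 60, 1.0
A = np.array([[np.exp(-j * lam) if j <= i + 1 else 0.0
               for j in range(n)] for i in range(n)])
w, V = np.linalg.eig(A)
k = np.argmax(w.real)            # Perron root: real, largest modulus
mu, x = w[k].real, V[:, k].real
x = x / x[0]                     # Perron vector: entries of one sign
errs = []
for i in range(1, 6):            # recurrence index i, 1-based as in the text
    lhs = mu * (x[i] - x[i - 1])              # mu * (x_{i+1} - x_i)
    rhs = np.exp(-(i + 1) * lam) * x[i + 1]   # e^{-(i+1)lam} * x_{i+2}
    errs.append(abs(lhs - rhs))
print(max(errs))  # tiny: the recurrence is satisfied numerically
```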
You can then solve the recurrence much as you would a linear differential equation: find two linearly independent solutions (neither a constant multiple of the other); the general solution is then an arbitrary linear combination of them.
Then substitute the general solution into the first equation and see whether you can make it consistent.
Finding the general solution to the recurrence could still be difficult, though: the variable coefficient $e^{-(i+1)\lambda}$ means the usual $x_i = k^i$ ansatz fails (substituting it and dividing by $k^i$ gives $\mu(k-1) = e^{-(i+1)\lambda}k^2$, which no constant $k$ can satisfy for every $i$). I'm not actually sure whether there is a solution of a different form, or whether you can show that there is none; in that case, it would follow that $A$ has no eigenvalues.
Sorry if this is along the lines of what you've already tried, but you mentioned infinite sums, and this approach at least avoids them.