I have a matrix $M = \begin{pmatrix}\frac{1}{2}&\frac{2}{9}&0\\\frac{4}{9}&\frac{5}{9}&0\\\frac{1}{18}&\frac{2}{9}&1\end{pmatrix}$. I want to compute the sum $$\mathrm{E} = \sum_{i=1}^\infty{i\left({\hat{e_3}^\intercal M^i\hat{e_1} - \hat{e_3}^\intercal M^{i-1}\hat{e_1}}\right)}.$$
I tried rearranging it as follows, similarly to how I would solve such a sum for scalar $M$: First, I factored the unit vectors and $M-I$ out of the sum $$\mathrm{E} = \hat{e_3}^\intercal\left({M-I}\right)\left({\sum_{i=1}^\infty{iM^{i-1}}}\right)\hat{e_1}.$$ Then, I rewrote $iM^{i-1}$ as $\frac{\mathrm{d}}{\mathrm{d}M}\left({M^i}\right)$ (which I'm not totally sure is justified for matrices like it is for scalars) and pulled the derivative out of the sum to get $$\mathrm{E} = \hat{e_3}^\intercal\left({M-I}\right)\left({\frac{\mathrm{d}}{\mathrm{d}M}\sum_{i=1}^\infty{M^i}}\right)\hat{e_1}.$$ This sum is a geometric series, so we can rewrite it like this (this part should be valid since $M$ has norm less than 1): $$\mathrm{E} = \hat{e_3}^\intercal\left({M-I}\right)\frac{\mathrm{d}}{\mathrm{d}M}\left({\frac{I}{I-M}}\right)\hat{e_1} = \hat{e_3}^\intercal\frac{\left({M-I}\right)}{\left({I-M}\right)^2}\hat{e_1}.$$ So $$\mathrm{E} = \hat{e_3}^\intercal\left({M-I}\right)^{-1}\hat{e_1}.$$ However, this doesn't seem to be right because $M-I$ is not invertible. I know the series converges because $M$ is a stochastic matrix whose only eigenvector with eigenvalue $1$ is $\begin{pmatrix}0\\0\\1\end{pmatrix}$, which is a pure state, and so $M^i\to\begin{pmatrix}0&0&0\\0&0&0\\1&1&1\end{pmatrix}$. Since $M$ is a stochastic matrix, $\mathrm{E}$ can be thought of as the expected number of applications of the transition associated with $M$ needed to take $\hat{e_1}$ to $\hat{e_3}$. Furthermore, I think the convergence of $M^i$ is geometric in some sense, so $iM^i$ should also converge. If $iM^i$ did not converge, some element of it would have to diverge, but this would mean the norm of the matrix was at least $1$, which it is not.
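The claimed limit of $M^i$ is easy to check numerically (a NumPy sketch using the entries of $M$ above):

```python
import numpy as np
from numpy.linalg import matrix_power

# the column-stochastic matrix from the question
M = np.array([[1/2,  2/9,  0],
              [4/9,  5/9,  0],
              [1/18, 2/9,  1]])

# M^i should approach the rank-1 limit whose every column is e3
limit = np.array([[0, 0, 0],
                  [0, 0, 0],
                  [1, 1, 1]])

# deviation after 300 steps is tiny, consistent with geometric convergence
print(np.abs(matrix_power(M, 300) - limit).max())
```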
I have tried using the pseudoinverse of the matrix, but $\hat{e_3}^\intercal\left({M-I}\right)^\mathrm{+}\hat{e_1} = 0$, so that did not work. Based on numerically computing the first 300 terms, it appears the sum is about $7.2$. Is there a way to compute an exact closed-form solution? Where does my manipulation break down?
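For reference, the numerical estimate can be reproduced with a short NumPy script that simply accumulates the first 300 terms of the series as written at the top:

```python
import numpy as np

# the matrix from the question
M = np.array([[1/2,  2/9,  0],
              [4/9,  5/9,  0],
              [1/18, 2/9,  1]])

e1 = np.array([1.0, 0.0, 0.0])
e3 = np.array([0.0, 0.0, 1.0])

# accumulate sum_{i=1}^{300} i * (e3^T M^i e1 - e3^T M^{i-1} e1)
E = 0.0
Mi_prev = np.eye(3)        # M^0
for i in range(1, 301):
    Mi = Mi_prev @ M       # M^i
    E += i * (e3 @ Mi @ e1 - e3 @ Mi_prev @ e1)
    Mi_prev = Mi

print(E)  # ≈ 7.2
```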
I'm still not sure why the original analysis doesn't work (I expect it's because $M$'s induced norm is $1$ and in fact $M-I$ is not invertible), but I was able to compute the sum by diagonalizing the matrix. We see that $$M = SJS^{-1} = \begin{pmatrix} 0 & \frac{3}{2} + \frac{1}{6}\sqrt{129} & -\frac{1}{6}\sqrt{129} + \frac{3}{2} \\ 0 & -\frac{5}{2} - \frac{1}{6}\sqrt{129} & -\frac{5}{2} + \frac{1}{6}\sqrt{129} \\ 1 & 1 & 1 \end{pmatrix} \begin{pmatrix} 1 & 0 & 0 \\ 0 & \frac{19}{36}-\frac{1}{36}\sqrt{129} & 0 \\ 0 & 0 & \frac{19}{36}+\frac{1}{36}\sqrt{129} \end{pmatrix} \begin{pmatrix} 1 & 1 & 1 \\ -\frac{1}{2} + \frac{5}{86}\sqrt{129} & -\frac{1}{2} + \frac{3}{86}\sqrt{129} & 0 \\ -\frac{1}{2} - \frac{5}{86}\sqrt{129} & -\frac{1}{2} - \frac{3}{86}\sqrt{129} & 0 \end{pmatrix}.$$ And now the sum can be rewritten as $$ \begin{array}{rcl} \mathrm{E} & = & \hat{e_3}^\intercal\left({M-I}\right)\left({\sum\limits_{i=1}^\infty{iM^{i-1}}}\right)\hat{e_1} \\ & = & \hat{e_3}^\intercal\left({M-I}\right)\left({\sum\limits_{i=1}^\infty{i\left({SJS^{-1}}\right)^{i-1}}}\right)\hat{e_1} \\ & = & \hat{e_3}^\intercal\left({M-I}\right)S\left({\sum\limits_{i=1}^\infty{iJ^{i-1}}}\right)S^{-1}\hat{e_1} \end{array}.$$ But $J$ is a diagonal matrix, so $\left({\sum\limits_{k}{f_k(J)}}\right)_{i,i} = \sum\limits_{k}{f_k\left({J_{i,i}}\right)}$; in particular, $\sum\limits_{i=1}^{\infty}{i\lambda^{i-1}} = \left({1-\lambda}\right)^{-2}$ for $\left|{\lambda}\right| < 1$, so $$\sum_{i=1}^\infty{iJ^{i-1}} = \begin{pmatrix} \infty & 0 & 0 \\ 0 & \frac{16929}{800} - \frac{1377}{800}\sqrt{129} & 0 \\ 0 & 0 & \frac{16929}{800} + \frac{1377}{800}\sqrt{129} \end{pmatrix}.$$ Call this sum $\tilde{J}$. The divergent first diagonal entry of $\tilde{J}$ looks like it might be problematic at first, but if we multiply out $\hat{e_3}^\intercal\left({M-I}\right)S$ we see that it is $\left({0, -\frac{17}{36} - \frac{1}{36}\sqrt{129}, -\frac{17}{36} + \frac{1}{36}\sqrt{129}}\right)$, which means the first row of $\tilde{J}$ does not matter.
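As a sanity check on the diagonalization, here is a SymPy sketch that verifies the eigenvalues and the two nontrivial diagonal entries of $\tilde{J}$ (it does not check the full $S$):

```python
import sympy as sp

M = sp.Matrix([[sp.Rational(1, 2),  sp.Rational(2, 9), 0],
               [sp.Rational(4, 9),  sp.Rational(5, 9), 0],
               [sp.Rational(1, 18), sp.Rational(2, 9), 1]])

# eigenvalues should be 1 and (19 -+ sqrt(129))/36
print(M.eigenvals())

# for each eigenvalue lam != 1, sum_{i>=1} i*lam^(i-1) = (1-lam)^(-2),
# which should reproduce the finite diagonal entries of J-tilde
for lam in M.eigenvals():
    if lam != 1:
        print(sp.radsimp(1 / sp.expand((1 - lam)**2)))
```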
Now $\mathrm{E} = \hat{e_3}^\intercal\left({M-I}\right)S\tilde{J}S^{-1}\hat{e_1}$, which with a little help from my friend SymPy we find to be $\frac{36}{5}=7.2$, in agreement with the empirical result.
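For completeness, a sketch of that SymPy computation. It uses `M.diagonalize()` rather than the hand-written $S$, so the eigenvalue ordering and eigenvector scaling may differ from the matrices above; the $\lambda = 1$ slot of $\tilde{J}$ is set to $0$, which is safe because the corresponding entry of $\hat{e_3}^\intercal\left({M-I}\right)S$ vanishes:

```python
import sympy as sp

M = sp.Matrix([[sp.Rational(1, 2),  sp.Rational(2, 9), 0],
               [sp.Rational(4, 9),  sp.Rational(5, 9), 0],
               [sp.Rational(1, 18), sp.Rational(2, 9), 1]])
e1 = sp.Matrix([1, 0, 0])
e3 = sp.Matrix([0, 0, 1])

S, J = M.diagonalize()  # M = S*J*S^(-1); ordering may differ from the text

# replace each eigenvalue lam by sum_{i>=1} i*lam^(i-1) = (1-lam)^(-2);
# the lam = 1 entry is set to 0 (its coefficient in e3^T (M-I) S is 0 anyway)
Jt = sp.diag(*[0 if J[i, i] == 1 else 1 / (1 - J[i, i])**2 for i in range(3)])

E = (e3.T * (M - sp.eye(3)) * S * Jt * S.inv() * e1)[0]
print(sp.simplify(E))
```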