I have a square matrix (coming from a Markov chain) that looks like this:
$$Q = \begin{bmatrix} 0 & 1 & 0 & 0 & \cdots & 0 & 0\\ 0 & a & 1-a & 0 & \cdots & 0 & 0\\ 0 & 0 & b & 1-b & \cdots & 0 & 0\\ \vdots & \vdots & \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & 0 & 0 & \cdots & 0 & 1 \end{bmatrix}$$
with $a, b, c,$ etc. real numbers between $0$ and $1$ inclusive.
I am interested in the values of the first row of the matrix $Q^N$. Currently, I use a library (numpy) to compute $Q^N$ and then read off the first row of the result. But with large matrices (larger than $500 \times 500$) and large values of $N$ (around $10000$), this is too slow for my usage.
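For reference, here is roughly what I do today (a small illustrative $Q$; the values of $a$ and $b$ are made up):

```python
import numpy as np

# Small illustrative instance of Q (4 x 4); a and b are made-up values.
a, b = 0.3, 0.6
Q = np.array([
    [0.0, 1.0, 0.0,   0.0],
    [0.0, a,   1 - a, 0.0],
    [0.0, 0.0, b,     1 - b],
    [0.0, 0.0, 0.0,   1.0],
])

N = 100
# numpy uses exponentiation by squaring: O(log N) dense matrix products.
first_row = np.linalg.matrix_power(Q, N)[0]
```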
Out of curiosity, I plotted $Q^N_{0,j}$ for $N$ between 1 and 1000 and found that the values follow something that looks like a Poisson distribution or similar (but I don't know whether that is due only to my specific input matrix $Q$ or not).
My question is: given such a matrix $Q$, is there a way to get the values of $Q^N_{0,j}$ without having to compute all of $Q^N$?
Edit: the terms on the diagonal are such that $0 \le a \le b \le c \le \cdots < 1$.
Edit2:
It appears that if I can diagonalize $Q$, I can use $Q^n = P D^n P^{-1}$, which for large values of $n$ is faster than exponentiation by squaring (as used by numpy). The problem is that I am not sure this is possible for every such matrix $Q$. And if it is not possible, I would accept a slight modification of $Q$, $P$ or $D$ as long as the result is close enough.
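A sketch of this idea with numpy, assuming $Q$ is diagonalizable (the $4 \times 4$ instance and its entries are illustrative; its diagonal entries are distinct, so the eigendecomposition exists):

```python
import numpy as np

# Illustrative instance with distinct diagonal entries, so Q is diagonalizable.
a, b = 0.3, 0.6
Q = np.array([[0.0, 1.0, 0.0,   0.0],
              [0.0, a,   1 - a, 0.0],
              [0.0, 0.0, b,     1 - b],
              [0.0, 0.0, 0.0,   1.0]])

eigvals, P = np.linalg.eig(Q)      # Q = P D P^{-1}, D = diag(eigvals)
P_inv = np.linalg.inv(P)           # one-off cost, reused for every n

def first_row_of_power(n):
    # Row 0 of Q^n is e_0^T P D^n P^{-1}; D^n is an elementwise power,
    # so each query costs only one vector-matrix product.
    return (P[0] * eigvals ** n) @ P_inv

row = first_row_of_power(10000)
```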
Edit3:
The values of the first row of $Q^N$ for $N \in [1,100]$: (plot not reproduced here)

Edit4:
it's $a \le b$ and not $a < b$, sorry!
A general bidiagonal matrix is not always diagonalizable; for instance, if $a = b = \ldots = \frac{1}{2}$ you get a Jordan block for the eigenvalue $\frac{1}{2}$.
But, since all of your diagonal entries are distinct, you're in luck. I'm going to change your notation a bit:
$$M = \left[\begin{array}{ccccc} a_1 & 1-a_1 & & &\\ & a_2 & 1-a_2 & &\\ & & \ddots & &\\ & & & a_{n-1} & 1-a_{n-1}\\ & & & & 1\end{array}\right].$$ (In your case, $a_1 = 0$.)
Then the eigenvalues of $M$ are simply $a_1, \ldots, a_{n-1}, 1$. The eigenvector corresponding to $1$ is obviously the all-ones vector. The eigenvector for $a_i$ is the vector $v^i$, where $$v^i_j = \begin{cases}0, &j > i\\ \frac{\prod_{k=j}^{i-1} (a_k-1)}{\prod_{k=j}^{i-1} (a_k-a_i)}, & j \leq i,\end{cases}$$ so that $v^i_i = 1$ by the empty-product convention. You can check this formula by looking at the $j$th row of $Mv^i$:
$$a_j \frac{\prod_{k=j}^{i-1} (a_k-1)}{\prod_{k=j}^{i-1} (a_k-a_i)} + (1-a_j)\frac{\prod_{k=j+1}^{i-1} (a_k-1)}{\prod_{k=j+1}^{i-1} (a_k-a_i)}=\left(a_j+\frac{(1-a_j)(a_j-a_i)}{a_j-1}\right)v^i_j = a_i v^i_j.$$
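A quick numerical check of the eigenvector formula (illustrative $a_i$; the formula's 1-based indices are shifted to Python's 0-based ones):

```python
import numpy as np

a = [0.0, 0.3, 0.6]            # a_1, a_2, a_3 (distinct), so n = 4
n = len(a) + 1

# Build M as above: bidiagonal, last diagonal entry 1.
M = np.zeros((n, n))
for r, ar in enumerate(a):
    M[r, r], M[r, r + 1] = ar, 1 - ar
M[n - 1, n - 1] = 1.0

def v(i):
    """Eigenvector v^i for eigenvalue a_i (i is 1-based, 1 <= i <= n-1)."""
    vec = np.zeros(n)
    for j in range(1, i + 1):
        num = np.prod([a[k - 1] - 1        for k in range(j, i)])
        den = np.prod([a[k - 1] - a[i - 1] for k in range(j, i)])
        vec[j - 1] = num / den             # empty products give v^i_i = 1
    return vec

# M v^i = a_i v^i for every i < n, and M 1 = 1 for the eigenvalue 1.
checks = [np.allclose(M @ v(i), a[i - 1] * v(i)) for i in range(1, n)]
checks.append(np.allclose(M @ np.ones(n), np.ones(n)))
```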
So now that you have the eigenvectors and eigenvalues, you can decompose $M = BDB^{-1}$ and compute $M^n = BD^nB^{-1}$.
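Putting this together (illustrative values again; $B$ is assembled from the closed-form eigenvectors, and $B^{-1}$ is computed numerically here as a one-off cost):

```python
import numpy as np

a = [0.0, 0.3, 0.6]                  # illustrative a_1..a_{n-1}, distinct
n = len(a) + 1

M = np.zeros((n, n))
for r, ar in enumerate(a):
    M[r, r], M[r, r + 1] = ar, 1 - ar
M[n - 1, n - 1] = 1.0

# Columns of B: closed-form eigenvectors v^1..v^{n-1}, then the ones vector.
B = np.zeros((n, n))
for i in range(1, n):
    for j in range(1, i + 1):
        num = np.prod([a[k - 1] - 1        for k in range(j, i)])
        den = np.prod([a[k - 1] - a[i - 1] for k in range(j, i)])
        B[j - 1, i - 1] = num / den
B[:, n - 1] = 1.0

d = np.array(a + [1.0])              # eigenvalues: a_1, ..., a_{n-1}, 1
B_inv = np.linalg.inv(B)

def power(N):
    # M^N = B D^N B^{-1}; B * d**N scales column i of B by d_i^N.
    return (B * d ** N) @ B_inv

first_row = power(100)[0]
```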
EDIT: Here is the formula for $B^{-1}$. Write $$B = \left[\begin{array}{cccc} v^1 & v^2 & \ldots & v^n\end{array}\right]$$ where $v^i$ is as above and $v^n$ is the ones vector. Then $$B^{-1} = \left[\begin{array}{cccc} w^1 & w^2 & \ldots & w^n\end{array}\right]$$ with $$w^i_j = \begin{cases}0, &j > i\\ \frac{\prod_{k=j}^{i-1} (a_k-1)}{\prod_{k=j+1}^{i} (a_k-a_j)}, & j \leq i\end{cases}$$ for $i<n$ and $$w^n_j = \begin{cases}1, &j = n\\ -\frac{\prod_{k=j+1}^{n-1} (a_k-1)}{\prod_{k=j}^{n-2} (a_{n-1}-a_k)}, & j < n.\end{cases}$$
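A numerical sanity check of these product formulas (illustrative $a_i$; the last column $w^n$ is implemented with denominator factors $a_{n-1} - a_k$):

```python
import numpy as np

a = [0.0, 0.3, 0.6]                  # illustrative distinct a_1..a_{n-1}, n = 4
n = len(a) + 1

# B from the closed-form eigenvectors; last column is the ones vector.
B = np.zeros((n, n))
for i in range(1, n):
    for j in range(1, i + 1):
        num = np.prod([a[k - 1] - 1        for k in range(j, i)])
        den = np.prod([a[k - 1] - a[i - 1] for k in range(j, i)])
        B[j - 1, i - 1] = num / den
B[:, n - 1] = 1.0

# W = B^{-1} from the product formulas.
W = np.zeros((n, n))
for i in range(1, n):                          # columns w^1 .. w^{n-1}
    for j in range(1, i + 1):
        num = np.prod([a[k - 1] - 1        for k in range(j, i)])
        den = np.prod([a[k - 1] - a[j - 1] for k in range(j + 1, i + 1)])
        W[j - 1, i - 1] = num / den
W[n - 1, n - 1] = 1.0                          # w^n_n = 1
for j in range(1, n):                          # w^n_j for j < n
    num = np.prod([a[k - 1] - 1        for k in range(j + 1, n)])
    den = np.prod([a[n - 2] - a[k - 1] for k in range(j, n - 1)])
    W[j - 1, n - 1] = -num / den
```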