I am trying to show that there exists a stationary distribution when $q>p$ for the Markov chain with one-step transition matrix $$ P = \begin{bmatrix} q & p & 0 & 0 & \cdots \\ q & 0 & p & 0 & \cdots \\ 0 & q & 0 & p & \cdots \\ 0 & 0 & q & 0 & \ddots \\ \vdots & \vdots & \vdots & \ddots & \ddots \end{bmatrix} $$
But when I write out the balance equations from $\pi P = \pi$, \begin{align*} q\pi_{0} + q \pi_{1} &= \pi_{0} \\ p \pi_{0} + q \pi_{2} &= \pi_{1} \\ &\;\;\vdots \end{align*}
I cannot establish a general pattern for the $\pi_{i}$ and then prove that they sum to $1$. Where am I going wrong? Or am I just missing the pattern?
The algebra is straightforward. The stationary distribution must satisfy $\pi P=\pi$. The first balance equation, $q\pi_0+q\pi_1=\pi_0$, gives $\pi_1 q=\pi_0 p$, i.e. $\pi_1=(p/q)\pi_0$. The second equation, $p\pi_0+q\pi_2=\pi_1$, then gives $$\pi_2=\frac{\pi_1-p\pi_0}{q}=\frac{p}{q}\cdot\frac{1-q}{q}\,\pi_0=(p/q)^2\pi_0,$$ using $p+q=1$. Continuing by induction, you get $\pi_k=(p/q)^k \pi_0$ for all $k\ge 0$.
Since $q>p$, we have $p/q<1$, so the geometric series converges: $$\sum_{k\ge 0}\pi_k=\pi_0\sum_{k\ge 0}(p/q)^k=\frac{\pi_0}{1-p/q}=1,$$ which gives $\pi_0=1-p/q$. The other probabilities follow: $\pi_k=(1-p/q)(p/q)^k$.
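As a sanity check (my own sketch, not part of the derivation above), you can verify numerically that $\pi_k=(1-p/q)(p/q)^k$ satisfies $\pi P=\pi$ on a large truncation of the chain; the values $p=0.3$, $q=0.7$ and the truncation size are arbitrary choices:

```python
import numpy as np

# Assumed example values with q > p.
p, q = 0.3, 0.7
N = 200  # truncation size; (p/q)**N is negligible, so tail mass is tiny

# Build the truncated tridiagonal transition matrix.
P = np.zeros((N, N))
P[0, 0], P[0, 1] = q, p
for i in range(1, N - 1):
    P[i, i - 1], P[i, i + 1] = q, p
P[N - 1, N - 2] = q  # boundary row: an artifact of truncating,
P[N - 1, N - 1] = p  # chosen so every row still sums to 1

# Candidate stationary distribution from the answer.
pi = (1 - p / q) * (p / q) ** np.arange(N)

print(np.allclose(pi @ P, pi, atol=1e-12))  # prints True
print(abs(pi.sum() - 1) < 1e-9)             # prints True
```

The check passes because $\pi P=\pi$ holds column by column: $\pi_{k-1}p+\pi_{k+1}q=q\pi_k+p\pi_k=\pi_k$.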