Problem: We start with some $n > 0$ and want to reduce it until $n \leq 0$ (success). At each step, exactly one of two things happens:
With probability $p_1$, deduct $1$ from $n$.
With probability $p_2$, deduct $\frac{n}{4}$ from $n$.
If $p_1 = p_2 = \frac{1}{2}$, what is the expected number of steps until $n \leq 0$?
My unresolved approach: Since the two events are equally likely, the expected number of steps until a given one of them occurs is 2. My intuition tells me that over any two steps, we should expect to see each event once. To reach success, I would guess the expected number of steps is something like $\log_{1.5} n$ minus 1, but I'm unsure how to show this and would like some help.
It's instructive to analyze a few base cases near 0.
For $n \in (0,1]$, the "deduct 1" step succeeds immediately, while the "deduct $\frac{n}{4}$" step keeps you inside $(0,1]$. So you are waiting for the first heads of an unbiased coin, and the expected number of flips is simply 2.
For $n \in (1, \frac{4}{3}]$, your first step will always map you to $(0,1]$, and so the expected number of steps is exactly 3.
For $n \in (\frac{4}{3}, \frac{16}{9}]$, the first step will either map you to $(\frac{1}{3},\frac{7}{9}]$, a subinterval of $(0,1]$, or to $(1,\frac{4}{3}]$, with equal probability. So the expected number of steps is $1 + \frac{1}{2}\cdot 2 + \frac{1}{2}\cdot 3 = 3.5$.
This suggests that we can identify a sequence of interval endpoints, $0=s_0 < s_1 < s_2 < \ldots$, such that the expected number of flips for $n\in (s_k, s_{k+1}]$ is a constant $x_k$. We simply need to figure out the appropriate sequences $\{s_k\}$ and $\{x_k\}$, all the way up until the particular value of $n$ we are interested in. This can be done by computer via dynamic programming.
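Rather than tracking the intervals explicitly, one concrete way to compute the exact expected value for a given starting $n$ is memoized recursion on the reachable values: $E(v) = 1 + \frac{1}{2}E(v-1) + \frac{1}{2}E(\frac{3v}{4})$, with base cases $E(v)=0$ for $v \leq 0$ and $E(v)=2$ for $v \in (0,1]$. The recursion terminates because both branches strictly decrease $v$ toward the base cases. A sketch (the function name is mine):

```python
def expected_steps(n: float) -> float:
    """Expected number of steps to drive n below 0, by memoized recursion.

    Base cases: E(v) = 0 for v <= 0, and E(v) = 2 for v in (0, 1],
    since from (0, 1] the "deduct 1" step succeeds immediately and the
    "deduct v/4" step stays in (0, 1] (a geometric waiting time).
    """
    memo: dict[float, float] = {}

    def E(v: float) -> float:
        if v <= 0:
            return 0.0
        if v <= 1:
            return 2.0
        if v not in memo:
            # One step taken, then each branch occurs with probability 1/2:
            #   v -> v - 1     (deduct 1)
            #   v -> 3*v/4     (deduct v/4)
            memo[v] = 1.0 + 0.5 * E(v - 1.0) + 0.5 * E(0.75 * v)
        return memo[v]

    return E(n)
```

This reproduces the base cases above: `expected_steps(1.2)` gives 3 and `expected_steps(1.5)` gives 3.5, matching the intervals $(1, \frac{4}{3}]$ and $(\frac{4}{3}, \frac{16}{9}]$.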
If you are interested in an asymptotic solution, note that the expected number of steps to get from $n$ to a number less than $\frac{3n}{4}$ approaches a limit of 2 as $n\rightarrow \infty$, since it is equivalent to coin-flipping until you get heads. To get from large $n$ to $c$ would thus require $2\cdot \log_{4/3}(n/c) = \Theta(\log(n))$ expected flips for large constants $c$. Since going from a constant to $0$ takes $\Theta(1)$ flips, the total expected flips is $\Theta(\log(n))+\Theta(1) = \Theta(\log(n))$. A reasonable approximation might be $2\cdot \log_{4/3}(n) + A$ for some constant $A$.
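As a sanity check on the asymptotic claim, a quick Monte Carlo simulation lets you compare the empirical mean against $2\cdot \log_{4/3}(n)$ (the constant $A$ is unknown, so only the offset between the two columns should stabilize as $n$ grows). The function names here are my own:

```python
import math
import random

def simulate(n: float, rng: random.Random) -> int:
    """Run the process once; return the number of steps until n <= 0."""
    steps = 0
    while n > 0:
        if rng.random() < 0.5:
            n -= 1.0       # deduct 1
        else:
            n -= n / 4.0   # deduct n/4, i.e. n -> 3n/4
        steps += 1
    return steps

def estimate(n: float, trials: int = 100_000, seed: int = 0) -> float:
    """Monte Carlo estimate of the expected number of steps."""
    rng = random.Random(seed)
    return sum(simulate(n, rng) for _ in range(trials)) / trials

# The gap between the two columns should approach a constant A.
for n in (10.0, 100.0, 1000.0):
    approx = 2 * math.log(n, 4 / 3)
    print(f"n={n:>6}: simulated {estimate(n):.2f}, 2*log_(4/3)(n) = {approx:.2f}")
```

For the small cases worked out above, the simulated means come out near 3 for $n = 1.2$ and near 3.5 for $n = 1.5$, as expected.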