Starting with $w = 1$, we repeatedly multiply $w$ by a number $x$ sampled independently and uniformly from $[1/2, 3/2]$, stopping as soon as $w$ becomes smaller than a given value $c$. What is the expected number of rounds for this process?
I have tried to model it with a function $f$, where $f(c)$ denotes the answer for a given $c$. Conditioning on the first factor $x$ (whose density on $[1/2, 3/2]$ is $1$), I can derive the following recursion:
$$ f(c) = \begin{cases} 1, & c > \frac{3}{2}, \\[4pt] 1 + \displaystyle\int_{c}^{3/2} f\!\left(\frac{c}{x}\right) dx, & c \in \left[\frac{1}{2}, \frac{3}{2}\right], \\[4pt] 1 + \displaystyle\int_{1/2}^{3/2} f\!\left(\frac{c}{x}\right) dx, & c < \frac{1}{2}. \end{cases} $$
However, I am not able to proceed: this integral equation seems complicated.
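One observation, in case it helps (my own computation, so please double-check): the per-step drift of $\log w$ is

$$ \mathbb{E}[\log x] = \int_{1/2}^{3/2} \log x \, dx = \Big[ x \log x - x \Big]_{1/2}^{3/2} = \frac{3}{2}\log\frac{3}{2} - \frac{1}{2}\log\frac{1}{2} - 1 \approx -0.0452. $$

Since this is negative, $\log w$ drifts slowly downward and the process terminates almost surely for every fixed $c > 0$; heuristically, by Wald's identity one would expect $f(c) \approx \log(1/c)/0.0452$ for small $c$, although this says little about $c$ near $1$, where the overshoot below $\log c$ dominates.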
Addition: I care most about the case where $c$ is around $1$.
Thanks for any answers and ideas!
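For reference, here is a quick Monte Carlo estimate of $f(c)$ (my own sketch; `simulate` and its parameters are not from any library):

```python
import random

def simulate(c, trials=100_000, seed=0):
    """Monte Carlo estimate of f(c): the expected number of multiplications
    by independent Uniform[1/2, 3/2] factors, starting from w = 1, until
    w first drops below c. At least one multiplication always happens,
    which matches f(c) = 1 for c > 3/2."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        w, steps = 1.0, 0
        while True:
            w *= rng.uniform(0.5, 1.5)
            steps += 1
            if w < c:
                break
        total += steps
    return total / trials

print(simulate(1.6))  # → 1.0 exactly: the first factor is always below 3/2 < 1.6
print(simulate(1.0))  # estimate for the case c = 1
```

Note the do-while shape of the inner loop: checking `w < c` only after multiplying is what makes `simulate(c)` return exactly $1$ for $c > 3/2$, in agreement with the first case of the recursion.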

The expected value is given by the infinite sum $$ E(\text{steps})=\sum_{n=1}^\infty n\cdot P(w \text{ first drops below } c \text{ at step } n). $$
If you discretize the problem so that each factor equals $\frac{1}{2}$ or $\frac{3}{2}$ with probability $\frac{1}{2}$ each (the original problem has $x$ uniform on $[\frac{1}{2},\frac{3}{2}]$, so this is only an approximation), the probability of being below $c$ after $n$ steps is binomial: $$ P(w_n < c) = \sum_{k}\binom{n}{k}\left(\frac{1}{2}\right)^{n}, $$ summing over all $k$ for which $$ \left(\frac{1}{2}\right)^k\left(\frac{3}{2}\right)^{n-k} < c. $$ Here each sequence of $n$ factors has probability $(\frac{1}{2})^n$, and $\binom{n}{k}$ of them contain exactly $k$ factors of $\frac{1}{2}$, giving the value $w_n = (\frac{1}{2})^k(\frac{3}{2})^{n-k}$. To turn this into the first-passage probability needed in the sum above, you must also exclude the paths that dropped below $c$ at an earlier step.
I'll leave it to you to fill in the gaps
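A sketch of the two-valued discretization (each factor $1/2$ or $3/2$ with probability $1/2$; an assumption of this approach, not the original continuous model), comparing the binomial count for $P(w_n < c)$ against a direct simulation of that walk (both functions are my own, purely illustrative):

```python
import random
from math import comb

def p_below_binomial(n, c):
    """P(w_n < c) after exactly n multiplications, when each factor is
    1/2 or 3/2 with probability 1/2. Each of the 2^n factor sequences
    has probability (1/2)^n, and comb(n, k) of them contain exactly
    k factors of 1/2, giving w_n = (1/2)^k * (3/2)^(n-k)."""
    return sum(comb(n, k) for k in range(n + 1)
               if 0.5**k * 1.5**(n - k) < c) / 2**n

def p_below_simulated(n, c, trials=200_000, seed=1):
    """Monte Carlo check of the same probability for the two-valued walk."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        w = 1.0
        for _ in range(n):
            w *= rng.choice((0.5, 1.5))
        hits += (w < c)
    return hits / trials

# Binomial gives 26/32 = 0.8125 for n = 5, c = 1; the simulation agrees closely.
print(p_below_binomial(5, 1.0), p_below_simulated(5, 1.0))
```

Keep in mind this is the probability of being below $c$ at step $n$, not of first passage at step $n$, so it overcounts paths that dipped below $c$ earlier and recovered before the walk is stopped.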