I have this question from a textbook by Ovidiu Calin, *Introduction to Stochastic Calculus*. From the textbook, I know that for the diffusion $X_t = \mu t + \sigma W_t$ with $\mu > 0$, the mean time to hit a level $x > 0$ is $E[\tau] = \frac{x}{\mu}$ (this follows from the optional stopping theorem, since $E[X_\tau] = \mu\, E[\tau]$ and $X_\tau = x$).
This seems to be at odds with the question. I want to say that the mean hitting times are different for the two processes and that there is a typo in the question. Can anyone verify my claim?
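As a sanity check on the formula $E[\tau] = x/\mu$, here is a quick Monte Carlo sketch (my own, not from the textbook; the parameter values $\mu = 1$, $\sigma = 0.5$, $x = 2$ are arbitrary choices for illustration):

```python
import numpy as np

# Monte Carlo check of E[tau] = x/mu for the drifted Brownian motion
# X_t = mu*t + sigma*W_t hitting level x > 0, assuming mu > 0.
# Parameter values below are arbitrary illustrative choices.
rng = np.random.default_rng(0)
mu, sigma, x = 1.0, 0.5, 2.0
dt, n_paths, max_steps = 1e-3, 5_000, 50_000

X = np.zeros(n_paths)                   # current value of each path
hit_times = np.full(n_paths, np.nan)    # recorded first hitting times
alive = np.ones(n_paths, dtype=bool)    # paths that have not hit x yet

for step in range(1, max_steps + 1):
    n_alive = int(alive.sum())
    if n_alive == 0:
        break
    dW = rng.normal(0.0, np.sqrt(dt), size=n_alive)  # Brownian increments
    X[alive] += mu * dt + sigma * dW                 # Euler step of the SDE
    just_hit = alive & (X >= x)
    hit_times[just_hit] = step * dt
    alive &= ~just_hit

print("simulated E[tau]:", np.nanmean(hit_times))  # should be close to 2.0
print("x / mu          :", x / mu)
```

With positive drift the paths cross the level quickly, so the sample mean should land close to $x/\mu = 2$ (up to a small Euler discretization bias of order $\sqrt{dt}$ and Monte Carlo error). Note the formula breaks down for $\mu \le 0$: with zero drift, $E[\tau] = \infty$ even though the level is hit almost surely.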