I believe this problem is missing an assumption. We have two differentiable functions, $f$ and $g$, which both vanish at $0$. Let $t > 0$ with $f(t), g(t) > 0$. Define $h(x) = f(t) g(x) - g(t) f(x)$. The assertion is: there exists $a \in (0,t)$ such that $$ \frac{f(t)}{g(t)} = \frac{f'(a)}{g'(a)}. $$
I don't see how to conclude the quotient form, specifically because I have no way of knowing that $g'(a) \neq 0$. Here is what I did.
Since $f(0) = 0 = g(0)$, we have $h(0) = 0$, and also $h(t) = f(t)g(t) - g(t)f(t) = 0$. As $f$ and $g$ are differentiable (hence continuous), $h$ is differentiable and continuous. By Rolle's theorem, there exists $a \in (0,t)$ such that $$ h'(a) = f(t) g'(a) - g(t) f'(a) = 0, $$ that is, $$ f(t) g'(a) = g(t) f'(a). $$
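As a numeric sanity check of the Rolle step, here is a small sketch. The concrete choices $f(x) = x^2$, $g(x) = x^3$, $t = 1$ are my own illustrative assumptions, not part of the problem:

```python
# Illustrative (assumed) choices: f(x) = x^2, g(x) = x^3, t = 1.
def f(x):  return x**2
def g(x):  return x**3
def fp(x): return 2*x      # f'
def gp(x): return 3*x**2   # g'

t = 1.0
h  = lambda x: f(t)*g(x) - g(t)*f(x)    # h(0) = h(t) = 0
hp = lambda x: f(t)*gp(x) - g(t)*fp(x)  # here h'(x) = 3x^2 - 2x

# Bisection for a root of h' in (0, t): h'(0.1) < 0 < h'(0.9).
lo, hi = 0.1, 0.9
for _ in range(60):
    mid = (lo + hi) / 2
    if hp(lo) * hp(mid) <= 0:
        hi = mid
    else:
        lo = mid
a = (lo + hi) / 2

print(a)                            # -> approximately 2/3
print(f(t)*gp(a), g(t)*fp(a))       # equal, as Rolle's theorem predicts
```

For these particular functions $g'(a) = 3a^2 \neq 0$, so the quotient form happens to hold; the question is whether that is guaranteed in general.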
The result above says that I can divide by $g(t)$ and by $g'(a)$. I know $g(t) > 0$, so that isn't a problem, but I am not fully convinced that $g'(a)$ cannot be $0$. One "intuitive" argument I can think of is: $g(t) > 0$, $g$ is continuous, and $a$ lies strictly between $0$ and $t$, so it seems strange for the graph of $g$ to flatten out or turn back toward the $x$-axis in between. But continuity alone doesn't rule that out. I also tried writing out the definition of the derivative $g'(a)$, but it didn't give any insight.
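To see why the intuitive argument is shaky on its own, note that $g' > 0$ near $0$ need not persist on all of $(0,t)$. The concrete choice $g(x) = x(1-x)^2$ with $t = 1/2$ (my own illustrative assumption) satisfies $g(0) = 0$ and $g(t) > 0$, yet $g'$ vanishes at $x = 1/3 \in (0,t)$:

```python
# Illustrative (assumed) example: g(x) = x*(1-x)**2 on (0, t) with t = 1/2.
# g(0) = 0 and g(t) > 0, yet g'(1/3) = 0, so continuity and positivity
# at the endpoint do not rule out a zero of g' strictly inside (0, t).
def g(x):  return x * (1 - x)**2
def gp(x): return (3*x - 1) * (x - 1)  # g'(x) = 3x^2 - 4x + 1, factored

t = 0.5
print(g(0))            # 0.0
print(g(t))            # 0.125, which is > 0
print(abs(gp(1/3)))    # 0.0: the derivative vanishes inside (0, t)
```

This does not by itself settle whether the Rolle point $a$ can land exactly where $g'$ vanishes, but it shows the intuitive picture is not enough to exclude $g'(a) = 0$.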
Is there another assumption needed in this problem, or am I missing something?