Let $f(x)$ be a univariate continuous function, some of whose roots are double roots. For any interval $[a,b]$, can you tell how many roots of $f(x)$ lie in that interval? The specific function I had in mind was $$f(x)=\sin^2\left(\frac{n\pi}{x}\right)+\sin^2(\pi x)$$ where $n$ can be any integer.
EDIT: I will also accept an algorithm that tells whether at least one root exists in a given interval $[a,b]$.
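For this specific $f$ the problem reduces to arithmetic: since $f$ is a sum of squares, $f(x)=0$ forces $\sin(\pi x)=0$ and $\sin(n\pi/x)=0$ simultaneously, i.e. $x$ must be a nonzero integer that divides $n$. Here is a minimal Python sketch along those lines (the names `roots_in_interval` and `has_root` are illustrative, and it simply scans the integers in $[a,b]$ rather than enumerating divisors):

```python
import math

def roots_in_interval(n, a, b):
    # f(x) = sin^2(n*pi/x) + sin^2(pi*x) is a sum of squares, so f(x) = 0
    # requires sin(pi*x) = 0 (x is an integer) and sin(n*pi/x) = 0
    # (n/x is an integer, x != 0): the roots are exactly the nonzero
    # integer divisors of n.  For n = 0 every nonzero integer is a root.
    lo, hi = math.ceil(a), math.floor(b)
    if n == 0:
        return [x for x in range(lo, hi + 1) if x != 0]
    return [x for x in range(lo, hi + 1) if x != 0 and n % x == 0]

def has_root(n, a, b):
    # "Is there at least one root in [a, b]?" becomes a divisor check.
    return bool(roots_in_interval(n, a, b))

print(roots_in_interval(12, 0.5, 7))  # [1, 2, 3, 4, 6]
print(has_root(7, 2.5, 6.5))          # False: 7 has no divisor in [3, 6]
```

For very wide intervals one would instead enumerate the divisors of $n$ in $O(\sqrt{|n|})$ time and keep those lying in $[a,b]$; the scan above is only meant to make the reduction explicit.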
It is guaranteed that if $f'$ has $k$ zeros in a given interval, then $f$ has at most $k+1$ zeros there (by Rolle's theorem, any two distinct zeros of $f$ are separated by a zero of $f'$). Beyond that, I don't think there is a general rule that applies in all contexts, apart from Bolzano's theorem, which only detects roots where $f$ changes sign and therefore misses double roots.
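A rough numerical sketch of this Rolle-type bound, assuming `f` is vectorized and smooth on $[a,b]$: it replaces $f'$ by forward differences and counts its sign changes on a grid, so it can undercount zeros where $f'$ touches zero without crossing, making the result a heuristic estimate rather than a certified bound:

```python
import numpy as np

def rolle_upper_bound(f, a, b, m=100_000):
    # By Rolle's theorem, distinct zeros of f are separated by zeros of f',
    # so #zeros(f) <= #zeros(f') + 1 on the interval.  Here f' is replaced
    # by forward differences and its zeros by sign changes on a grid, so
    # the result is a heuristic estimate, not a rigorous certificate.
    x = np.linspace(a, b, m)
    df = np.diff(f(x))                 # finite-difference surrogate for f'
    s = np.sign(df)
    sign_changes = np.count_nonzero(s[:-1] != s[1:])
    return sign_changes + 1

f = lambda x: np.sin(6 * np.pi / x) ** 2 + np.sin(np.pi * x) ** 2
print(rolle_upper_bound(f, 0.5, 6.5))  # estimate for n = 6 on [0.5, 6.5]
```

A certified count would need something stronger, e.g. interval arithmetic to prove $f' \neq 0$ on subintervals, or a Sturm-sequence argument when $f$ is polynomial; the grid version above only estimates.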