Suppose that $X$ is a discrete random variable with a support of size $n$, e.g., $X$ could be an index in the range $\{1,\dots,n\}$. Our goal is to guess the value of $X$. Initially, $X$ is uniformly distributed, so its entropy is $H[X] = \log n$. Since $X$ is uniform, guessing its value requires $\Theta(n)$ tries in expectation.
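For concreteness: if we guess the values in any fixed order, the position of the correct value is uniform over $\{1,\dots,n\}$, so the expected number of tries is
$$ \sum_{k=1}^{n} k\cdot\frac{1}{n} = \frac{n+1}{2} = \Theta(n). $$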
Now assume that we get some knowledge by observing a random variable $Y$ such that $$ H[ X \mid Y ] = \frac{1}{2}\log n. $$
Does this reduction in entropy imply a smaller bound on the expected number of guesses required to find the value of $X$?
At first, I thought of $Y$ as restricting $X$ to a subset of its support, which would suggest that the range we need to guess over is smaller, but this is only true if the conditional distribution $p(X \mid Y)$ is uniform.
The expected number of guesses to find the value of $X$, when guessing the possible values in order of decreasing probability, is
$$ \sum_k k\,p_{(k)}\,, $$
where $p_{(k)}$ is the $k$-th largest probability. By Chebyshev's sum inequality, this sum is strictly smaller for a non-uniform distribution than for the uniform distribution on the same support: the factors $k$ are increasing while the weights $p_{(k)}$ are decreasing. Thus, since your conditional distribution is no longer uniform, you now have a lower expected number of guesses.
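As a quick numerical sanity check, here is a minimal sketch in Python/NumPy; the helper `expected_guesses` and the Dirichlet sampling are mine, the latter just a convenient way to generate arbitrary non-uniform distributions:

```python
import numpy as np

def expected_guesses(p):
    """Expected number of guesses when trying outcomes in order of
    decreasing probability: sum_k k * p_(k)."""
    p = np.sort(np.asarray(p, dtype=float))[::-1]  # p_(1) >= p_(2) >= ...
    return np.sum(np.arange(1, len(p) + 1) * p)

n = 10
print(expected_guesses(np.full(n, 1 / n)))  # uniform: (n + 1) / 2 = 5.5

rng = np.random.default_rng(0)
for _ in range(3):
    q = rng.dirichlet(np.ones(n))  # a random non-uniform distribution
    print(expected_guesses(q))     # strictly below 5.5
```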
However, it would be misleading to attribute this to the reduction in entropy. More generally, it is not the case that a distribution with lower entropy has a lower expected number of guesses. For instance, for $p_1=p_2=\frac12$ and $p_3=0$ we expect $\frac32$ guesses and the entropy is $\log 2\approx0.693$, whereas for $p_1=0.7$, $p_2=0.25$ and $p_3=0.05$ we expect only $1.35$ guesses but the entropy is about $0.746>\log2$.
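Checking these numbers with the same kind of sketch as above (entropy computed in nats, matching the $\log n$ convention used in the question):

```python
import numpy as np

def expected_guesses(p):
    # As above: sum_k k * p_(k) with probabilities sorted in decreasing order.
    p = np.sort(np.asarray(p, dtype=float))[::-1]
    return np.sum(np.arange(1, len(p) + 1) * p)

def entropy(p):
    """Shannon entropy in nats; zero-probability outcomes contribute nothing."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

for dist in ([0.5, 0.5, 0.0], [0.7, 0.25, 0.05]):
    print(expected_guesses(dist), entropy(dist))
# -> 1.5   0.693... (= log 2)
# -> 1.35  0.746...
```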