Let's say you have a number $x$, and a priori you know that $x \in [0, 1)$, with each value equally likely (i.e. $x$ is uniformly distributed). Then a wizard comes and tells you that $x \in [a, b) \subseteq [0, 1)$. How much information does this give you?
It would seem to be $-\log_2(b-a)$ bits, but I don't know how to prove this, since both the prior and the posterior distributions have an infinite amount of entropy.
The reason I think $-\log_2(b-a)$ bits is reasonable is that it agrees with examples:
$-\log_2(1-0) = 0$, which is true, since no information is conveyed.
$-\log_2(\frac12 - 0) = 1$, which seems reasonable, since it would give you the first binary digit of $x$.
In general, when $b-a=\frac1{2^n}$, $-\log_2(b-a)=n$ seems reasonable, as it gives you about $n$ binary digits.
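One way to sidestep the two infinite entropies is to discretize: chop $[0,1)$ into $2^k$ equal bins, so the prior has entropy $k$ bits and the posterior (uniform over the bins inside $[a,b)$) has entropy $\log_2(\text{count})$. The infinities cancel in the difference, which approaches $-\log_2(b-a)$ as $k$ grows. A minimal sketch (the function name `info_gain_bits` is my own):

```python
import math

def info_gain_bits(a, b, k=20):
    """Discretize [0,1) into 2^k equal bins and compare Shannon entropies.

    Prior: uniform over all 2^k bins, so entropy = k bits.
    Posterior: uniform over the bins whose left endpoint lies in [a, b),
    so entropy = log2(count). The difference k - log2(count)
    approximates -log2(b - a) as k grows.
    """
    n = 2 ** k
    count = sum(1 for i in range(n) if a <= i / n < b)
    return k - math.log2(count)

print(info_gain_bits(0.0, 0.5))   # 1.0
print(info_gain_bits(0.0, 0.25))  # 2.0
print(info_gain_bits(0.0, 1.0))   # 0.0
```

For dyadic intervals the answer is exact at any $k$; for general $[a,b)$ it converges to $-\log_2(b-a)$ up to an $O(2^{-k}/(b-a))$ discretization error.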
The usual definition of the conditional entropy of $A$ given $B$ works here. It is defined as the expected conditional entropy, i.e. the weighted expectation of the negative log conditional probability. This matches your intuition because in this case the weight function is uniform: every value in $[0,1)$ is equally likely. You can see more details at Wikipedia, which will also let you calculate the conditional entropy when the distribution is not uniform.
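To make this concrete in the continuous setting (a sketch using differential entropy, to which the definition above reduces for uniform densities): the prior and posterior entropies are each finite here, and their difference is exactly the conjectured value.

```latex
% Differential entropy of the prior, uniform on [0,1):
h(X) = -\int_0^1 1 \cdot \log_2 1 \, dx = 0.
% Differential entropy of the posterior, uniform on [a,b):
h\bigl(X \mid x \in [a,b)\bigr)
  = -\int_a^b \tfrac{1}{b-a} \log_2 \tfrac{1}{b-a} \, dx
  = \log_2(b-a).
% Information gained = drop in entropy:
h(X) - h\bigl(X \mid x \in [a,b)\bigr) = -\log_2(b-a).
```

Note that differential entropy can be negative (here $\log_2(b-a) < 0$ for $b-a < 1$), but the *difference* is a well-defined, nonnegative number of bits, in agreement with the examples in the question.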