I'm trying to understand the derivation of Shannon's formula for entropy and information content. I follow the proof up to here:
For any $t,s \in (0,1]$, $$\frac{I(t)}{I(s)}=\frac{\log(t)}{\log(s)}=\frac{-K\log(t)}{-K\log(s)},$$ where $K$ is an arbitrary positive constant.
It is also given that $I$ is a continuous function from $(0,1]$ to $\mathbb{R}$.
How can I show (I assume using some sort of continuity argument) that $I(x)=-K\log(x)$ for all $x \in (0,1]$?
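For context, here is one way the argument is usually sketched (my own attempt, assuming the ratio identity holds whenever $\log(s)\neq 0$, i.e. for $s\in(0,1)$, and that $I$ is positive on $(0,1)$ so that $K>0$):

```latex
% The ratio identity rearranges to show I(t)/log(t) is constant on (0,1):
\[
\frac{I(t)}{I(s)}=\frac{\log(t)}{\log(s)}
\;\Longrightarrow\;
\frac{I(t)}{\log(t)}=\frac{I(s)}{\log(s)}
\qquad\text{for all } t,s\in(0,1).
\]
% Fix any s_0 in (0,1) and name the common value of this ratio -K:
\[
-K := \frac{I(s_0)}{\log(s_0)}
\quad\Longrightarrow\quad
I(t) = -K\log(t) \quad\text{for all } t\in(0,1).
\]
% The point x = 1 is excluded above because log(1) = 0 makes the ratio
% undefined there; continuity of I closes the gap:
\[
I(1) = \lim_{t\to 1^-} I(t) = \lim_{t\to 1^-}\bigl(-K\log(t)\bigr) = 0 = -K\log(1).
\]
```

So on $(0,1)$ the identity alone pins down $I$, and continuity is only needed to extend the formula to the single boundary point $x=1$.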