As far as my understanding of condition numbers goes, they measure how much an error in the input can change the output.
What I don't understand is why the condition number $C(x) = 1/\log(x)$ of $\log(x)$ is so large for $x$ near $1$. Shouldn't $C(x)$ grow as $x \to 0$?
$\log 1 = 0$, so as $x \to 1$, $\log x \to 0$, and hence $C(x) = \frac 1{\log x} \to \infty$.
$\log 0$ is undefined, but as $x \to 0^+$, $\log x \to -\infty$, so $C(x) = \frac 1{\log x} \to 0^-$.
You seem to be confusing $x$ with $\log x$: $C(x) = \frac 1{\log x}$ is "very large" when $\log x$ is "very close" to $0$, which happens precisely as $x \to 1$, not as $x \to 0$.
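A quick numerical check makes this concrete. The sketch below (a minimal illustration; the function name `cond_log` is just for this example) evaluates $C(x) = 1/|\log x|$, the standard relative condition number of $\log$, at a point near $1$ and a point near $0$:

```python
import math

def cond_log(x):
    # Relative condition number of f(x) = log(x):
    # C(x) = |x * f'(x) / f(x)| = |x * (1/x) / log(x)| = 1 / |log(x)|
    return 1.0 / abs(math.log(x))

# Near x = 1, log(x) is close to 0, so C(x) blows up:
print(cond_log(1.001))   # ~1000: a tiny relative input error is amplified ~1000x

# Near x = 0+, log(x) -> -infinity, so C(x) shrinks toward 0:
print(cond_log(1e-6))    # ~0.07: the problem is well conditioned here
```

So the conditioning of $\log$ is worst near $x = 1$ (where the output passes through $0$) and actually improves as $x \to 0^+$, exactly as the limits above show.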