Bob can drive m miles in h hours. How long will it take him to drive h miles?
I came up with two concrete examples. Suppose he drove 6 miles in 1 hour. Then to drive 1 mile, it would take him a sixth of that time.
Now if he drove 6 miles in 2 hours, to drive 2 miles it would take him 1/3 of that time.
However, the answer book says that the answer is $h^2/m$. I can see how they arrive at this answer ($\frac mh\cdot x = h$, then solve for $x$), but is my first reasoning wrong? In that case it would simply have been $h/m$.
Your first example’s answer is $\displaystyle\frac{h^2}m$ hour $\displaystyle=\frac{1^2}6$ hour (since $h=1$ here, this happens to coincide with your $h/m$).
Your second example is wrong though: a constant speed of $6$ miles in $2$ hours instead means $\frac66=1$ mile in $\frac26=\frac13$ hour, which means $2\times1=2$ miles in $2\times\frac13=\frac23$ hour; this again corresponds to $\displaystyle\frac{h^2}m=\frac{2^2}6=\frac23$ hour.
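For the general case, here is a short derivation (assuming a constant speed throughout): the speed is $m/h$ miles per hour, and time is distance divided by speed, so driving $h$ miles takes
$$\frac{h}{\,m/h\,}=\frac{h^2}{m}\ \text{hours},$$
which matches the answer book and both numerical checks above.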