I have a homework question that asks:
"Consider the curve $\gamma : [1, \infty) \to \mathbb{R}^2$ defined by $\gamma (t) = \langle t \cos (\ln t), t \sin (\ln t) \rangle$. Show that this curve is not a bounded distance from a geodesic."
It looks a little cryptic, so here is how I decoded the statement. Take two points $p_1 = \gamma (t_1)$ and $p_2 = \gamma (t_2)$ on the curve, with $t_1 < t_2$. The straight line segment between them (the geodesic in $\mathbb{R}^2$) is $$ D = \{ (1-s) p_1 + s p_2 : s \in [0, 1] \}. $$ On the other hand, following the curve $\gamma$ gives the arc $$ G = \{ \gamma (t) : t \in [t_1, t_2] \}. $$ Now the question amounts to showing that $$ \sup_{g \in G} \operatorname{dist} (g, D) = \sup_{g \in G} \inf_{d \in D} \| g - d \| $$ is not bounded. I think this is the way to set up the question, but I don't know where to go from here.
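One reformulation that may make the geometry clearer (my own rewriting of the given $\gamma$, not part of the problem statement): substituting $\theta = \ln t$, so $t = e^{\theta}$ with $\theta \in [0, \infty)$, gives $$ \gamma (e^{\theta}) = \langle e^{\theta} \cos \theta,\; e^{\theta} \sin \theta \rangle, $$ i.e. in polar coordinates the curve is the logarithmic spiral $r = e^{\theta}$.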
EDIT: The supremum will of course be finite for fixed $p_1$ and $p_2$, but I want it to be unbounded over the choice of points $p_1$ and $p_2$. In other words, the curve $\gamma$ gets arbitrarily far away from the straight line segments connecting points of $\gamma$.
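To sanity-check this interpretation numerically, here is a sketch I wrote (the function names are my own, not from the problem). I pick $t_1 = 1$ and $t_2 = e^{2\pi k}$, so that both endpoints lie on the positive $x$-axis and the chord $D$ is a segment of the $x$-axis, while the curve spirals far away from it in between:

```python
import numpy as np

def gamma(t):
    """The curve gamma(t) = <t cos(ln t), t sin(ln t)>."""
    return np.array([t * np.cos(np.log(t)), t * np.sin(np.log(t))])

def dist_to_segment(p, a, b):
    """Euclidean distance from point p to the segment from a to b."""
    ab = b - a
    # Parameter of the orthogonal projection of p, clamped to [0, 1].
    s = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return np.linalg.norm(p - (a + s * ab))

def sup_dist(t1, t2, n=20000):
    """Approximate sup over t in [t1, t2] of dist(gamma(t), chord D)."""
    a, b = gamma(t1), gamma(t2)
    # Sample t so that theta = ln t is uniformly spaced along the spiral.
    ts = np.exp(np.linspace(np.log(t1), np.log(t2), n))
    return max(dist_to_segment(gamma(t), a, b) for t in ts)

# k = number of full turns; the chord endpoints gamma(1) and
# gamma(e^{2 pi k}) both sit on the positive x-axis.
for k in (1, 2, 3):
    print(k, sup_dist(1.0, np.exp(2 * np.pi * k)))
```

The printed suprema grow by roughly a factor of $e^{2\pi}$ per extra turn, consistent with the claim that no single bound works for all chords.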