I'm having some trouble with the following exercise:
Let $\alpha:(a,b)\to \mathbb R^2$ be a regular curve such that the normal lines of the curve at all points intersect in a single point. Prove that $\alpha((a,b))$ is a subset of a circle.
Let $Q\in \mathbb R^2$ be the point of intersection. I'm almost 100% sure that this will be the center of the circle, and we may assume $Q=(0,0)$: if it isn't, we can apply the translation $Q\to(0,0)$, prove the statement there, and then undo the translation. I tried proving this in the following ways:
- Proving that for any $t,t_0\in(a,b)$, $|\alpha(t)|=|\alpha(t_0)|$
- Proving that if $|\alpha(t)|<|\alpha(t_0)|$ then there is a point $t<c<t_0$ such that the normal line at $\alpha(c)$ does not pass through $(0,0)$
- Proving that the curvature at every point is equal to $1/R$ for some constant $R$
But I wasn't able to do so. Are any of these viable ways of solving this? If so, how?
It turns out I was able to solve it. I'll leave the answer here instead of deleting the post, so people with the same question can find it.
Saying that $|\alpha(t)|$ is constant is the same as saying that $\left<\alpha(t),\alpha(t)\right>$ is constant, so we differentiate: $$\frac{d}{dt} \left<\alpha(t),\alpha(t)\right>=2\left<\alpha'(t),\alpha(t)\right>$$
Now, $\alpha'(t)$ and $\alpha(t)$ are orthogonal: at every point $\alpha(t)$ of the curve, the normal line passes through the origin, so the normal vector at $\alpha(t)$ and the position vector $\alpha(t)$ are colinear; since $\alpha'(t)$ is orthogonal to the normal vector, it is orthogonal to $\alpha(t)$ as well. Thus $\frac{d}{dt} \left<\alpha(t),\alpha(t)\right>=0$, meaning that $\left<\alpha(t),\alpha(t)\right>$ is constant, and so $|\alpha(t)|=R$ for some constant $R>0$ and all $t$. In other words, $\alpha((a,b))$ is contained in the circle of radius $R$ centered at the origin.
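As a quick sanity check of the identity $\frac{d}{dt}\left<\alpha,\alpha\right>=2\left<\alpha',\alpha\right>$ (not part of the proof), here is a short symbolic computation with sympy on the one family of curves we expect to satisfy the hypothesis, a circle of radius $R$ about the origin; the parametrization is my choice, not something from the problem:

```python
import sympy as sp

t = sp.symbols('t', real=True)
R = sp.symbols('R', positive=True)

# A curve whose normal lines all pass through the origin:
# the circle of radius R, alpha(t) = (R cos t, R sin t).
alpha = sp.Matrix([R * sp.cos(t), R * sp.sin(t)])
alpha_p = alpha.diff(t)  # tangent vector alpha'(t)

# <alpha'(t), alpha(t)> vanishes identically...
print(sp.simplify(alpha_p.dot(alpha)))        # 0

# ...so <alpha(t), alpha(t)> = |alpha(t)|^2 is constant.
print(sp.simplify(alpha.dot(alpha)))          # R**2
```

Of course this only checks the circle itself; the argument above is what shows every curve satisfying the hypothesis looks like this.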