This is a claim in the book I am reading on Differential Geometry (Curves and Surfaces, 2nd Edition, by Montiel and Ros):
The eigenvalues of a symmetric bilinear form on a two-dimensional Euclidean vector space can be obtained as the maximum and the minimum values taken by the form on the circle of unit vectors.
How does one prove this claim?
Any hints or references would be most appreciated.
Thanks in advance.
I'll show the claim about the eigenvalues being attained as the minimum and maximum over the unit circle. (With respect to an orthonormal basis, a symmetric bilinear form on a two-dimensional Euclidean vector space is represented by a symmetric $2 \times 2$ matrix $S$, so we work with matrices throughout.) The argument requires a standard result: a symmetric $2 \times 2$ matrix always admits a pair $(u, v)$ of orthogonal eigenvectors. This result (or rather its generalisation to $n \times n$ symmetric matrices) is the spectral theorem; I include a short elementary proof of the $2 \times 2$ case below.
Let our symmetric $2 \times 2$ matrix $S$ have orthogonal unit eigenvectors $u, v \in \mathbb{R}^2$, with $S u = \lambda u$ and $S v = \mu v$. Since $u$ and $v$ are orthogonal unit vectors, we can describe the unit circle as the set of points $$ p = au + bv \text{ satisfying } a^2 + b^2 = 1. $$ The quadratic form associated to $S$ is $Q(p) = (p \cdot Sp)$, and for $p = au + bv$ we have $$ Q(p) = (au + bv) \cdot (a Su + b Sv) = (au + bv) \cdot (\lambda au + \mu bv) = \lambda a^2 + \mu b^2, $$ using $u \cdot v = 0$ and $|u| = |v| = 1$. The expression $\lambda a^2 + \mu b^2$ is a convex combination of $\lambda$ and $\mu$, hence always takes values between $\lambda$ and $\mu$. The extreme values are attained, since $Q(u) = \lambda$ and $Q(v) = \mu$, so the minimum and maximum of $Q$ on the unit circle are exactly $\min(\lambda, \mu)$ and $\max(\lambda, \mu)$.
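As a quick numerical sanity check (not part of the proof), the sketch below samples the unit circle and compares the extrema of $Q$ with the eigenvalues computed by NumPy. The particular matrix $S$ is an arbitrary example of my own choosing; any symmetric $2 \times 2$ matrix would do.

```python
import numpy as np

# A hypothetical symmetric 2x2 example matrix.
S = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# Eigenvalues of a symmetric matrix, returned in ascending order.
lam, mu = np.linalg.eigvalsh(S)

# Sample the unit circle and evaluate the quadratic form Q(p) = p . Sp.
theta = np.linspace(0.0, 2.0 * np.pi, 10001)
points = np.stack([np.cos(theta), np.sin(theta)])   # shape (2, N)
Q = np.einsum("in,ij,jn->n", points, S, points)     # Q(p) for each sample

print(lam, mu)            # the two eigenvalues
print(Q.min(), Q.max())   # should approximate lam and mu
```

The dense sampling means the observed minimum and maximum of `Q` agree with `lam` and `mu` up to the discretisation error of the grid.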
Proof of standard result:
If $S$ is a symmetric $2 \times 2$ matrix with $$ S = \begin{pmatrix} a & b \\ b & c \end{pmatrix}, $$ then the characteristic polynomial of $S$ is $\chi_S(t) = t^2 - t(a + c) + (ac - b^2)$. The discriminant of this quadratic is $$ \Delta = (a + c)^2 - 4(ac - b^2) = a^2 + 2ac + c^2 - 4ac + 4b^2 = (a - c)^2 + 4b^2 \geq 0, $$ showing that $\chi_S$ always has real roots, i.e. $S$ always has real eigenvalues. If $\Delta > 0$ then $S$ has two distinct real eigenvalues and hence is diagonalisable. If $\Delta = 0$ then we must have $a = c$ and $b = 0$, so $S$ is a scalar multiple of the identity matrix and is trivially diagonalisable.
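If it helps, here is a small numerical check of the discriminant identity above, using randomly generated (hypothetical) entries $a, b, c$ and comparing the resulting eigenvalues with NumPy's:

```python
import numpy as np

# Random hypothetical entries for the symmetric matrix S = [[a, b], [b, c]].
rng = np.random.default_rng(0)
for _ in range(1000):
    a, b, c = rng.normal(size=3)
    # Discriminant of chi_S(t) = t^2 - t(a + c) + (ac - b^2) ...
    delta = (a + c) ** 2 - 4 * (a * c - b ** 2)
    # ... equals the completed-square form, hence is non-negative.
    assert np.isclose(delta, (a - c) ** 2 + 4 * b ** 2)
    assert delta >= -1e-9  # non-negative up to floating-point rounding
    # The quadratic formula then gives two real roots, matching NumPy's
    # eigenvalues (eigvalsh returns them in ascending order).
    lam = ((a + c) - np.sqrt(max(delta, 0.0))) / 2
    mu = ((a + c) + np.sqrt(max(delta, 0.0))) / 2
    S = np.array([[a, b], [b, c]])
    assert np.allclose(np.linalg.eigvalsh(S), [lam, mu])
```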
In the case where $S$ is diagonalisable with distinct eigenvalues, choose eigenvectors $u, v$ satisfying $Su = \lambda u$ and $Sv = \mu v$ with $\lambda \neq \mu$. We have $$ \lambda (u \cdot v) = (Su \cdot v) = (u \cdot Sv) = \mu (u \cdot v) $$ and hence $(\lambda - \mu)(u \cdot v) = 0$; since $\lambda \neq \mu$, this forces $u \cdot v = 0$. (Here we used the symmetry of $S$ for the middle step $(Su \cdot v) = (u \cdot Sv)$.) Hence eigenvectors for distinct eigenvalues are orthogonal. In the case where $S$ is a scalar multiple of the identity, we can simply choose any two orthogonal vectors, which are immediately eigenvectors of $S$.
Therefore, a symmetric $2 \times 2$ matrix $S$ always admits two orthogonal eigenvectors, which may be normalised to unit length.
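To illustrate the conclusion numerically, the following sketch (using a randomly generated symmetric matrix of my own choosing) checks that the eigenvectors NumPy returns for a symmetric $2 \times 2$ matrix are indeed orthogonal unit eigenvectors:

```python
import numpy as np

# Build a random symmetric 2x2 matrix (hypothetical example data).
rng = np.random.default_rng(1)
A = rng.normal(size=(2, 2))
S = A + A.T  # A + A^T is always symmetric

# eigh is NumPy's routine for symmetric matrices; the columns of the
# second return value are orthonormal eigenvectors.
eigvals, eigvecs = np.linalg.eigh(S)
u, v = eigvecs[:, 0], eigvecs[:, 1]

print(np.dot(u, v))                        # orthogonality: dot product ~0
print(np.allclose(S @ u, eigvals[0] * u))  # checks S u = lambda u
print(np.allclose(S @ v, eigvals[1] * v))  # checks S v = mu v
```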