I am quite intrigued by the following well-known result:
Result 1: A real symmetric matrix has real eigenvalues.
Suppose $A$ is a symmetric matrix of size $n \times n$ with entries in the field of real numbers.
(1) Let $f$ be a linear operator on $\mathbb{R}^n$ whose matrix w.r.t. some basis (not necessarily the standard basis) is $A$. Does the fact that $A$ is symmetric say anything about $f$? As in, does $f$ have any 'special' properties which other operators do not have?
(2) Let $B$ be the matrix of $f$ w.r.t. some other basis which is distinct from the basis used for $A$ (again, the basis for $B$ need not be the standard basis). Is $B$ also always symmetric?
(3) Since the notion of eigenvalues (& eigenvectors) is basis-independent, I would like to know if the following statement is true: If $f$ has a symmetric matrix w.r.t. some basis of $\mathbb{R}^n$, then $f$ has at least one real eigenvalue. Moreover, all eigenvalues of $f$ are real.
(4) Most proofs of Result 1 above use the notion of inner product, and thus implicitly assume that we are working in an inner product space. However, the study of eigenvalues does not require the vector space to have an inner product. Therefore, I suspect it should be possible to prove Result 1 without using inner products. Is there such a proof? (If yes, a link would be sufficient.)
The well-known spectral theorem says even more than your Result 1: a real symmetric matrix is not merely diagonalizable, but orthogonally diagonalizable.
So if $f:\mathbb{R}^n\to \mathbb{R}^n$ is a linear operator such that w.r.t. some basis $\alpha$ we have $[f]_{\alpha}^{\alpha}=A$ with $A$ symmetric, then $f$ is a diagonalizable operator.
Moreover, if $\beta$ is any other basis, then the matrix of $f$ w.r.t. this new basis is $P^{-1}AP$, where $P$ is the transition matrix between the two bases. Notice that \begin{eqnarray*} (P^{-1}AP)^T &=& P^TA^T(P^T)^{-1}\\ &=& P^TA(P^T)^{-1}. \end{eqnarray*} So in general you cannot expect $P^{-1}AP$ to be symmetric. However, if $P$ is orthogonal, i.e. $P^{-1}=P^T$, then the right-hand side equals $P^{-1}AP$, so $P^{-1}AP$ is symmetric as well. Hence orthogonal base changes preserve symmetry.
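To see this numerically, here is a small sketch in NumPy (the particular matrices $A$, $P$, and the rotation $Q$ are hypothetical example values, not anything from the question):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 3.0]])   # a symmetric matrix
P = np.array([[1.0, 1.0], [0.0, 1.0]])   # invertible, but NOT orthogonal

B = np.linalg.inv(P) @ A @ P             # change of basis by P
print(np.allclose(B, B.T))               # False: symmetry is lost

# An orthogonal change of basis (a rotation): Q^{-1} = Q^T.
theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

C = Q.T @ A @ Q                          # same operator, orthonormal basis
print(np.allclose(C, C.T))               # True: symmetry is preserved
```

The first conjugation produces $B=\begin{pmatrix}1&-1\\1&4\end{pmatrix}$, which is visibly not symmetric, while the rotated version stays symmetric, matching the computation above.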
I guess you could ask whether the above property actually characterizes orthogonal matrices. That is, suppose $P$ is a matrix such that $P^{-1}AP$ is symmetric for every symmetric matrix $A$; must $P$ then be orthogonal? (EDIT: Fun fact, I just checked this. It turns out that $PP^T=\lambda Id$ for some scalar $\lambda$ already suffices, whereas people usually call a matrix orthogonal only if $PP^T=Id$. One should really refer to the latter as orthonormal matrices.)
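The sufficiency half of that fun fact is easy to check numerically: if $PP^T=\lambda Id$, then $P^{-1}=P^T/\lambda$, so $P^{-1}AP=P^TAP/\lambda$ is symmetric. A quick NumPy sketch (the scaled rotation and the matrix $A$ are hypothetical example values):

```python
import numpy as np

# A scaled rotation: P P^T = 4 I, so P is orthogonal only up to the scalar 4.
theta = 0.7
P = 2.0 * np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
print(np.allclose(P @ P.T, 4.0 * np.eye(2)))   # True

A = np.array([[1.0, 2.0], [2.0, 5.0]])   # an arbitrary symmetric matrix
B = np.linalg.inv(P) @ A @ P             # conjugation by the scaled rotation
print(np.allclose(B, B.T))               # True: symmetry is still preserved
```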
Point 3 becomes obsolete, as the spectral theorem implies that all eigenvalues of $A$ (equivalently, of $f$) are real.
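One can illustrate this with a random symmetric matrix: even a general-purpose (complex-capable) eigenvalue solver returns only real eigenvalues. A minimal sketch, with an arbitrary seed:

```python
import numpy as np

rng = np.random.default_rng(0)           # arbitrary seed for reproducibility
M = rng.standard_normal((5, 5))
A = (M + M.T) / 2                        # symmetrize: A = A^T

eigvals = np.linalg.eigvals(A)           # general solver, may return complex
print(np.allclose(eigvals.imag, 0.0))    # True: all eigenvalues are real
```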
Point 4: Every finite-dimensional real vector space admits an inner product, and any two such inner products are related by a change of basis, so essentially there is only one. Since you can always equip the space with this inner product, why not use it?
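Concretely, the standard construction is to pick any basis and simply declare it orthonormal:

```latex
% Given a basis e_1, \dots, e_n of V, write x = \sum_i x_i e_i
% and y = \sum_i y_i e_i, and define
\langle x, y \rangle \;=\; \sum_{i=1}^{n} x_i y_i .
% This is bilinear, symmetric, and positive definite, hence an
% inner product, and the chosen basis is orthonormal for it.
```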