If $\mathbf{A}\colon V \to V$ is self-adjoint for some finite-dimensional vector space $V$ over $\mathbb{C}$, then it follows immediately that every eigenvalue of $\mathbf{A}$ is real, since for any eigenvalue $\lambda$ and a corresponding eigenvector $\mathbf{v} \neq \mathbf{0}$ we have \begin{align*} \lambda \left\langle \mathbf{v}, \mathbf{v}\right\rangle = \left\langle \mathbf{A}\mathbf{v}, \mathbf{v}\right\rangle = \left\langle \mathbf{v}, \mathbf{A}\mathbf{v}\right\rangle = \overline{\lambda }\left\langle \mathbf{v}, \mathbf{v}\right\rangle ,\end{align*} and $\left\langle \mathbf{v}, \mathbf{v}\right\rangle \neq 0$ forces $\lambda = \overline{\lambda}$. Can one argue in a similar way if $V$ is a vector space over $\mathbb{R}$?
I thought about just considering the same vector space over $\mathbb{C}$, which, however, raises several questions: Is $\mathbf{A}$ again self-adjoint over $\mathbb{C}$? Can we even extend an inner product of $V$ over $\mathbb{R}$ to one over $\mathbb{C}$?
For the latter question I would say yes, since (if we assume we already had such an inner product over $\mathbb{C}$, linear in the first argument and conjugate-linear in the second, as in the computation above) \begin{align*} \left\langle \mathbf{v}, \mathbf{w}\right\rangle _{\mathbb{C}} &= \left\langle \mathbf{a} + i\mathbf{b}, \mathbf{c} + i\mathbf{d}\right\rangle _{\mathbb{C}} \\ &= \left\langle \mathbf{a}, \mathbf{c}\right\rangle _{\mathbb{C}} + \left\langle \mathbf{a},i \mathbf{d}\right\rangle _{\mathbb{C}} + \left\langle i \mathbf{b}, \mathbf{c}\right\rangle _{\mathbb{C}} + \left\langle i \mathbf{b}, i \mathbf{d}\right\rangle _{\mathbb{C}} \\ &\stackrel{\text{(1)}}{=} \left\langle \mathbf{a}, \mathbf{c}\right\rangle _{\mathbb{R}} - i \left\langle \mathbf{a}, \mathbf{d} \right\rangle _{\mathbb{R}} + i \left\langle \mathbf{b}, \mathbf{c}\right\rangle _{\mathbb{R}} + \left\langle \mathbf{b}, \mathbf{d}\right\rangle _{\mathbb{R}} \end{align*} so we can simply use (1) as our definition (which turns out to satisfy all the properties of a complex inner product). W.r.t. this inner product $\mathbf{A}$ is again self-adjoint (so we can borrow the proof from above). Is this argument valid?
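As a numerical sanity check of this definition, here is a small sketch (the matrix $\mathbf{A}$ and the vectors are randomly generated, and I take the convention that the inner product is linear in the first argument and conjugate-linear in the second): on $\mathbb{C}^n$ with the standard real inner product on $\mathbb{R}^n$, the formula reproduces the usual Hermitian product, is conjugate-symmetric, and keeps a real symmetric $\mathbf{A}$ self-adjoint.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# A real symmetric matrix: self-adjoint w.r.t. the standard real inner product.
A = rng.standard_normal((n, n))
A = A + A.T

def inner_c(v, w):
    """The complexified inner product: split v = a + ib, w = c + id and
    combine the four real inner products (linear in the first argument,
    conjugate-linear in the second)."""
    a, b = v.real, v.imag
    c, d = w.real, w.imag
    return (a @ c + b @ d) + 1j * (b @ c - a @ d)

v = rng.standard_normal(n) + 1j * rng.standard_normal(n)
w = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# The definition agrees with the standard Hermitian product sum_k v_k conj(w_k) ...
assert np.isclose(inner_c(v, w), v @ w.conj())
# ... it is conjugate-symmetric ...
assert np.isclose(inner_c(v, w), inner_c(w, v).conjugate())
# ... and A stays self-adjoint w.r.t. it.
assert np.isclose(inner_c(A @ v, w), inner_c(v, A @ w))
```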
Here is an alternative that does not involve changing the structure of the original vector space.
Let $M$ be the matrix of $A$ relative to any orthonormal basis of $V$. Then $M$ is a symmetric $n\times n$ matrix over ${\mathbb R}$, where $n$ is the dimension of $V$.
Next we forget about $V$, and consider the linear transformation $B:{\mathbb C}^n\to {\mathbb C}^n$ given by the matrix $M$ (this time viewed as a complex matrix).
Then $B$ is a self-adjoint complex-linear operator on ${\mathbb C}^n$, and the complex theory guarantees that the roots of its characteristic polynomial coincide with the set of eigenvalues of $B$, all of which are real.
Finally, we just have to notice that the characteristic polynomial of $B$ is the same as that of $A$, so every root of the characteristic polynomial of $A$ is real!
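This route can be illustrated numerically (a sketch with a randomly generated symmetric matrix standing in for $M$): viewing the real symmetric matrix as a complex one and computing its eigenvalues over $\mathbb{C}$, the imaginary parts all vanish up to floating-point error.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

# M: a real symmetric matrix (the matrix of A in an orthonormal basis).
M = rng.standard_normal((n, n))
M = (M + M.T) / 2

# View M as a complex matrix and compute the eigenvalues of B over C.
eigenvalues = np.linalg.eigvals(M.astype(complex))

# The complex theory says they must all be real.
assert np.allclose(eigenvalues.imag, 0.0, atol=1e-10)
```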