This is an exercise from the second chapter of Fulton and Harris's book *Representation Theory: A First Course*:
"Show that if we know the character $X_V$ of a representation $V$, then we know the eigenvalues of each element $g$ of $G$, in the sense that we know the coefficients of the characteristic polynomial of $g:V \rightarrow V$. Carry this out explicitly for elements $g\in G$ of orders $2, 3$, and $4$, and for a representation of $G$ on a vector space of dimension $2, 3$, or $4$ ."
This is a computational challenge.
We can work out the cases $\dim V=2$ and $\dim V=3$, but I am a little stuck on $\dim V=4$. In this case the characteristic polynomial is $$P(\lambda)= \lambda^4 - \chi_V(g)\lambda^3+ a_2\lambda^2- a_3\lambda + \det(A_g),$$
where $a_2= \sum_{1\le i<j\le 4} \lambda_i\lambda_j$ (all products of pairs) and $a_3= \sum_{1\le i<j<k\le 4} \lambda_i\lambda_j\lambda_k$ (all products of triples), with $\lambda_1, \lambda_2, \lambda_3, \lambda_4$ the eigenvalues of $g$. Also $\det(A_g)= \lambda_1\lambda_2\lambda_3\lambda_4$.
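Since this is a computational exercise, here is a small Python sketch of my own (the function names are not from the book) that computes these elementary symmetric functions, and hence the characteristic-polynomial coefficients, directly from a list of eigenvalues:

```python
from itertools import combinations
from functools import reduce
from operator import mul

def elem_sym(eigs, r):
    # e_r: sum of products over all r-element subsets of the eigenvalues
    return sum(reduce(mul, c) for c in combinations(eigs, r))

def char_poly_coeffs(eigs):
    # Coefficients of P(lambda) = lambda^4 - a1*lambda^3 + a2*lambda^2
    #                              - a3*lambda + det, highest degree first
    n = len(eigs)
    return [1] + [(-1) ** r * elem_sym(eigs, r) for r in range(1, n + 1)]

# e.g. an order-2 element with eigenvalues 1, 1, -1, -1:
print(char_poly_coeffs([1, 1, -1, -1]))  # -> [1, 0, -2, 0, 1]
```

The example output corresponds to $P(\lambda)=\lambda^4-2\lambda^2+1=(\lambda-1)^2(\lambda+1)^2$.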
We need to compute these three coefficients for elements of order $2$, $3$, and $4$.
- For elements of order $2$: using $g^2=e$ and $g= g^{-1}$ we get
$a_2= \frac{\chi_V(g)^2-4}{2}$, $a_3= \det(A_g)\,\chi_V(g)$, and $\det(A_g)=\pm 1$.
So is there a way to know whether $\det(A_g)= 1$ or $-1$?
- For elements of order $3$: using $g^3=e$ and $g^2= g^{-1}$ we get
$$a_2= \frac{\chi_V(g)^2}{2}- \frac{a_3}{2\det(A_g)}$$ and $$\left(6+ 3\frac{\chi_V(g)}{\det(A_g)}\right)a_3= 8+ \chi_V(g)^3.$$ Also, of course, $\det(A_g)=e^{2\pi ik/3}$ for some $k\in \{0,1,2\}$.
So is there a way to know $\det(A_g)$?
We also need $\chi_V(g)\neq -2\det(A_g)$ for the second formula to determine $a_3$.
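As a sanity check on the two order-$3$ identities above, the following brute-force Python sketch (my own, not part of the exercise) verifies them numerically over every multiset of four cube roots of unity:

```python
import cmath
from itertools import combinations, combinations_with_replacement
from functools import reduce
from operator import mul

w = cmath.exp(2j * cmath.pi / 3)  # primitive cube root of unity

def e(eigs, r):
    # elementary symmetric function e_r of the eigenvalues
    return sum(reduce(mul, c) for c in combinations(eigs, r))

# check both identities on every multiset of four cube roots of unity
for ks in combinations_with_replacement(range(3), 4):
    eigs = [w ** k for k in ks]
    chi, a3, det = e(eigs, 1), e(eigs, 3), e(eigs, 4)
    assert abs(e(eigs, 2) - (chi ** 2 / 2 - a3 / (2 * det))) < 1e-9
    assert abs((6 + 3 * chi / det) * a3 - (8 + chi ** 3)) < 1e-9
print("both order-3 identities hold for all 15 eigenvalue multisets")
```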
You can imagine how this goes for elements of order $4$! I must be missing some clue that makes the computations easier.
Assume that $\dim V=4$, that $\chi_V$ is known as a function $G\to\Bbb{C}$, and that $\operatorname{ord}_G(g)=m$. The eigenvalues of $g$ are then roots of unity satisfying $\lambda^m=1$, and we can use the character values to determine the multiplicity of each such root as an eigenvalue.
Case $m=2$. The eigenvalues of $g$ are roots of unity of order two. That is, $\lambda_j\in\{+1,-1\}$ for $j=1,2,3,4$. If $+1$ occurs $k$ times, then $-1$ occurs $4-k$ times, and the characteristic polynomial is $$P(\lambda)=(\lambda-1)^k(\lambda+1)^{4-k}.$$ The point is that we can solve for $k$ because we know $$\chi_V(g)=k\cdot1+(4-k)\cdot(-1)=2k-4.$$ Therefore the following table exhausts all the possibilities: $$ \begin{array}{c|c} \chi_V(g)&P(\lambda)\\ \hline 4&(\lambda-1)^4\\ 2&(\lambda-1)^3(\lambda+1)\\ 0&(\lambda-1)^2(\lambda+1)^2\\ -2&(\lambda-1)(\lambda+1)^3\\ -4&(\lambda+1)^4 \end{array} $$
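A tiny Python check (an illustration of my own, not from the book) confirms that $\chi_V(g)=2k-4$ recovers $k$ uniquely:

```python
# recover k, the multiplicity of the eigenvalue +1, from chi_V(g)
def k_from_chi(chi):
    assert chi in {-4, -2, 0, 2, 4}, "possible character values for m = 2"
    return (chi + 4) // 2

# round-trip: k eigenvalues +1 and 4 - k eigenvalues -1 give chi = 2k - 4
for k in range(5):
    assert k_from_chi(2 * k - 4) == k
print("chi_V(g) determines k for every k = 0, ..., 4")
```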
Case $m=4$. This time the eigenvalues of $g$ lie in the set $\{+1,i,-1,-i\}$. If $+1$ occurs with multiplicity $a$, $i$ with multiplicity $b$, $-1$ with multiplicity $c$, and $-i$ with multiplicity $d$, then we know that $a+b+c+d=4$ and that $$\chi_V(g)=(a-c)+(b-d)i.$$ This time the known value of $\chi_V(g)$ won't determine the numbers $a,b,c,d$ uniquely. But $g^2$ has $+1$ as an eigenvalue with multiplicity $a+c$ and $-1$ with multiplicity $b+d$, so, as in the first case, knowing $\chi_V(g^2)$ tells us what $a+c$ and $b+d$ are. The real (resp. imaginary) part of $\chi_V(g)$ tells us the value of $a-c$ (resp. $b-d$). Of course, once we know all of $a+c$, $a-c$, $b+d$, and $b-d$, we can solve for all four multiplicities.
In other words, only when the absolute values of the real and imaginary parts of $\chi_V(g)$ add up to $4=\dim V$ can we read off the multiplicities of the eigenvalues from $\chi_V(g)$ alone. If some pairs of eigenvalues cancel each other (either $i-i=0$ or $1-1=0$), then we need help from $\chi_V(g^2)$, as it tells us the number of real (resp. purely imaginary) eigenvalues.
For example $\chi_V(g)=1+i$ may come from either $1+1-1+i$ or $1+i+i-i$. In those two cases we have $\chi_V(g^2)=1+1+1-1=2$ or $1-1-1-1=-2$. So knowing $\chi_V(g^2)$ tells us which kind of a cancellation took place.
The number of cases is higher, and the table is a bit longer. $$ \begin{array}{c|c|c} \chi_V(g^2)&\chi_V(g)&P(\lambda)\\ \hline 4&4&(\lambda-1)^4\\ 4&2&(\lambda-1)^3(\lambda+1)\\ 4&0&(\lambda-1)^2(\lambda+1)^2\\ 4&-2&(\lambda-1)(\lambda+1)^3\\ 4&-4&(\lambda+1)^4\\ 2&3+i&(\lambda-1)^3(\lambda-i)\\ 2&3-i&(\lambda-1)^3(\lambda+i)\\ 2&1+i&(\lambda-1)^2(\lambda+1)(\lambda-i)\\ 2&1-i&(\lambda-1)^2(\lambda+1)(\lambda+i)\\ 2&-1+i&(\lambda-1)(\lambda+1)^2(\lambda-i)\\ 2&-1-i&(\lambda-1)(\lambda+1)^2(\lambda+i)\\ 2&-3+i&(\lambda+1)^3(\lambda-i)\\ 2&-3-i&(\lambda+1)^3(\lambda+i)\\ 0&2+2i&(\lambda-1)^2(\lambda-i)^2\\ 0&2&(\lambda-1)^2(\lambda-i)(\lambda+i)\\ 0&2-2i&(\lambda-1)^2(\lambda+i)^2\\ 0&2i&(\lambda-1)(\lambda+1)(\lambda-i)^2\\ 0&0&(\lambda-1)(\lambda+1)(\lambda-i)(\lambda+i)\\ 0&-2i&(\lambda-1)(\lambda+1)(\lambda+i)^2\\ 0&-2+2i&(\lambda+1)^2(\lambda-i)^2\\ 0&-2&(\lambda+1)^2(\lambda-i)(\lambda+i)\\ 0&-2-2i&(\lambda+1)^2(\lambda+i)^2\\ -2&1+3i&(\lambda-1)(\lambda-i)^3\\ -2&1+i&(\lambda-1)(\lambda-i)^2(\lambda+i)\\ -2&1-i&(\lambda-1)(\lambda-i)(\lambda+i)^2\\ -2&1-3i&(\lambda-1)(\lambda+i)^3\\ -2&-1+3i&(\lambda+1)(\lambda-i)^3\\ -2&-1+i&(\lambda+1)(\lambda-i)^2(\lambda+i)\\ -2&-1-i&(\lambda+1)(\lambda-i)(\lambda+i)^2\\ -2&-1-3i&(\lambda+1)(\lambda+i)^3\\ -4&4i&(\lambda-i)^4\\ -4&2i&(\lambda-i)^3(\lambda+i)\\ -4&0&(\lambda-i)^2(\lambda+i)^2\\ -4&-2i&(\lambda-i)(\lambda+i)^3\\ -4&-4i&(\lambda+i)^4 \end{array} $$
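The whole table can be double-checked programmatically: enumerating all multiplicity vectors $(a,b,c,d)$ with $a+b+c+d=4$ shows that the pair $(\chi_V(g^2),\chi_V(g))$ determines them uniquely, giving exactly the $35$ rows above. A brute-force verification sketch of my own:

```python
from itertools import product

# multiplicity vectors (a, b, c, d) for the eigenvalues 1, i, -1, -i
rows = {}
for a, b, c, d in product(range(5), repeat=4):
    if a + b + c + d != 4:
        continue
    chi_g = (a - c) + (b - d) * 1j  # chi_V(g)
    chi_g2 = (a + c) - (b + d)      # chi_V(g^2): all eigenvalues square to +/-1
    key = (chi_g2, chi_g)
    assert key not in rows          # the pair determines the multiplicities
    rows[key] = (a, b, c, d)
print(len(rows))  # -> 35, one entry per row of the table
```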