Let $A$ be a linear map from a finite-dimensional complex vector space to itself. If $A$ has finite order, then the trace of its inverse is the conjugate of its trace.
I know two proofs of this fact, but they both require linear algebra facts whose proofs are themselves quite involved.
Since $A^n=I$, the eigenvalues of $A$ are roots of unity. Hence they have unit norm, and so their reciprocals are their conjugates. Then the result follows from the following facts: (a) the eigenvalues of $A^{-1}$ are the reciprocals of the eigenvalues of $A$, (b) the dimensions of the eigenspaces of $A^{-1}$ equal the dimensions of the corresponding eigenspaces of $A$, (c) the trace equals the sum of the eigenvalues, counted with algebraic multiplicity. The proof of (a) is relatively easy, but (b) and (c) seem to require the existence of Jordan Normal Form, which takes a lot of work.
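The claim and facts (a)–(c) are easy to sanity-check numerically. Here is a small sketch (the specific matrix is a hypothetical example of mine, built as a conjugate of a diagonal of sixth roots of unity so that $A^6=I$):

```python
import numpy as np

# Build a finite-order matrix: conjugate a diagonal of 6th roots of unity
# by a random invertible S, so A**6 == I (up to rounding).
rng = np.random.default_rng(0)
zeta = np.exp(2j * np.pi / 6)
D = np.diag([zeta, zeta**2, 1.0 + 0j])      # eigenvalues are roots of unity
S = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
A = S @ D @ np.linalg.inv(S)

assert np.allclose(np.linalg.matrix_power(A, 6), np.eye(3))

# Eigenvalues of A^{-1} are reciprocals of those of A; since they have
# unit modulus, reciprocal == conjugate, and the traces match accordingly.
tr_inv = np.trace(np.linalg.inv(A))
assert np.isclose(tr_inv, np.conj(np.trace(A)))
```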
By Weyl's Unitary Trick, there is an inner product for which $A$ is unitary (this proof is itself a fair amount of work). So in an orthonormal basis (which we must construct with the Gram-Schmidt procedure) the inverse of $A$ is given by its conjugate transpose (one must also prove this). So the trace of the inverse is the conjugate of the trace.
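The endgame of this second proof can also be checked on a concrete unitary of finite order (the matrix below, a plane rotation by $2\pi/8$ together with a phase $i$, is my own hypothetical example):

```python
import numpy as np

# A unitary matrix of order 8: a rotation block of order 8 plus the
# phase 1j (order 4, which divides 8).
theta = 2 * np.pi / 8
A = np.array([[np.cos(theta), -np.sin(theta), 0],
              [np.sin(theta),  np.cos(theta), 0],
              [0,              0,             1j]])

assert np.allclose(np.linalg.matrix_power(A, 8), np.eye(3))
# For a unitary matrix the inverse is the conjugate transpose, so
# tr(A^{-1}) = tr(A^H) = conj(tr(A)) is immediate.
assert np.allclose(np.linalg.inv(A), A.conj().T)
assert np.isclose(np.trace(np.linalg.inv(A)), np.conj(np.trace(A)))
```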
Since the condition $A^n=I$ and the consequence $\mathrm{tr}(A^{-1})=\overline{\mathrm{tr}(A)}$ are both elementary statements, I'm wondering if there's a short proof from first principles (ideally without quoting any big linear algebra theorems). Can anyone find one?
Since the question quickly attracted four votes, I'll try to use minimal known linear algebra to get the result. (One more comment: since complex conjugation is involved, there is no purely algebraic proof, e.g. one that only uses polynomial/functional calculus in $A$.)
We start with $A$ a matrix, an endomorphism of a vector space $V$ of finite dimension $\ge 1$ over $\Bbb C$, such that for a suitable natural number $n$ we have $$A^n=I\ .$$
Let $v\ne 0$ be a vector in $V$. The sequence $v, Av, A^2 v,\dots, A^nv=v, \dots$ is periodic. Let $d$ be its period; $d$ is a divisor of $n$. If $d=1$ we record this $v$ and set $w=v$. Else, let $\xi$ be a primitive $d$-th root of unity in $\Bbb C$, e.g. $\xi=\exp\frac {2\pi\, i}d$ if we want to fix the ideas (and leave algebra). Consider the following vectors in $V$: $$ \begin{aligned} w_0 &=v +Av+\dots+A^{d-1}v\ ,\\ w_1 &=v +\xi Av+\dots+(\xi A)^{d-1}v\ ,\\ &\ \ \vdots\\ w_k &=v +(\xi^k A)v+\dots+(\xi^k A)^{d-1}v\ ,\\ &\ \ \vdots\\ w_{d-1} &=v +(\xi^{d-1} A)v+\dots+(\xi^{d-1} A)^{d-1}v\ . \end{aligned} $$ Note that $Aw_k=\xi^{-k}w_k$ for each $k$, since $A^dv=v$ merely shifts the sum cyclically. If at least one of these vectors is $\ne 0$, then we record it, and set $w$ to be one choice among them. Else?! Else we have the situation which is formally described by the following relation: $$ \underbrace{ \begin{bmatrix} 1 & 1 & 1 & \dots & 1\\ 1 & \xi & \xi^2 &\dots & \xi^{d-1}\\ 1 & \xi^2 & \xi^4 &\dots & \xi^{2(d-1)}\\ \vdots &\vdots &\vdots &\ddots &\vdots\\ 1 & \xi^{d-1} & \xi^{2(d-1)} &\dots & \xi^{(d-1)(d-1)} \end{bmatrix} }_{\text{Vandermonde}(1,\xi,\dots,\xi^{d-1})} % \begin{bmatrix} v \\ Av\\ A^2 v\\\vdots\\A^{d-1}v \end{bmatrix} = \begin{bmatrix} 0 \\ 0\\ 0\\\vdots\\0 \end{bmatrix} \ . $$ The Vandermonde matrix is invertible, so we formally multiply from the left with its inverse. To be exact, this amounts to building linear combinations of the given formulas for $w_0,w_1,\dots,w_{d-1}$ to isolate $0=v=Av=\dots$, which gives a contradiction. Doing this we have constructed a $w\ne 0$ such that $Aw=\zeta w$ for some root of unity $\zeta$.
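The averaging trick above can be watched in action numerically. A minimal sketch, assuming the simplest matrix with $A^dv=v$ for every $v$, namely a cyclic permutation $P$ of order $d=4$ (my choice of example, not from the argument itself):

```python
import numpy as np

# P is the cyclic shift on C^4, so P**4 == I and every orbit has period
# dividing 4.  Form w_k = sum_j (xi^k P)^j v and verify the eigenvector
# relation P w_k = xi^{-k} w_k derived above.
d = 4
P = np.roll(np.eye(d), 1, axis=0)          # cyclic permutation matrix
xi = np.exp(2j * np.pi / d)                # primitive d-th root of unity
v = np.array([1.0, 2.0, 3.0, 4.0], dtype=complex)

for k in range(d):
    w_k = sum((xi**k) ** j * np.linalg.matrix_power(P, j) @ v
              for j in range(d))
    if not np.allclose(w_k, 0):
        assert np.allclose(P @ w_k, xi**(-k) * w_k)
```

For this particular $v$ all four $w_k$ turn out nonzero, so each one is recorded as an eigenvector; the Vandermonde argument is only needed to rule out the case where all of them vanish.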
We consider $V'$, the quotient space, or some subspace of $V$ generated by "the other" vectors that extend the linearly independent system $\{w\}$ to a basis, and consider the same problem with the induced / restricted $A$ on $V'$.
Inductively we get a basis of $V$ in which $A$ acts diagonally (or is upper triangular, if taking quotients), and the entries $\xi_1,\xi_2,\dots$ on the diagonal are all roots of unity.
Then the inverse matrix has the same shape with diagonal $\xi_1^{-1},\xi_2^{-1},\dots$, and the equality involving traces can be equivalently traced back to: $$ \frac 1{\xi_1}+ \frac 1{\xi_2}+ \dots = \overline{\xi_1}+ \overline{\xi_2}+ \dots $$ which holds because each $\xi_i$ has modulus one, so $1/\xi_i=\overline{\xi_i}$.
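The last step, in isolation, is just the unit-modulus identity $1/\xi=\overline\xi$; a one-line check on an arbitrary (hypothetical) batch of roots of unity:

```python
import numpy as np

# For unit-modulus numbers, 1/xi equals conj(xi), so the sum of
# reciprocals equals the conjugate of the sum.
xis = [np.exp(2j * np.pi * k / 7) for k in (1, 3, 5, 6)]
assert np.isclose(sum(1 / x for x in xis), np.conj(sum(xis)))
```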