I am studying tensor algebra and came across tensor invariants, and how they are built from a tensor's eigenvalues. What is known is that these invariants are the coefficients of the characteristic polynomial; for a $3 \times 3$ matrix $A$, these are: $$I_1 = Tr(A) = \lambda_1 + \lambda_2 + \lambda_3$$ $$I_2 = (Tr(A)^2 - Tr(A^2))/2 = \lambda_1\lambda_2 + \lambda_2\lambda_3 + \lambda_3\lambda_1$$ $$I_3 = \det(A) = \lambda_1\lambda_2\lambda_3$$
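To make the question concrete, here is a small numeric check of the three invariants (a sketch in pure Python; the upper-triangular matrix $A$ is an arbitrary example of my own, chosen because a triangular matrix's eigenvalues are simply its diagonal entries):

```python
# Check I_1, I_2, I_3 on an upper-triangular 3x3 matrix, whose
# eigenvalues are its diagonal entries (example matrix, arbitrary).

def trace(M):
    return sum(M[i][i] for i in range(len(M)))

def matmul(M, N):
    n = len(M)
    return [[sum(M[i][k] * N[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[2.0, 1.0, 4.0],
     [0.0, 3.0, 5.0],
     [0.0, 0.0, 5.0]]
lam = [A[i][i] for i in range(3)]               # eigenvalues: 2, 3, 5

I1 = trace(A)                                   # lambda_1 + lambda_2 + lambda_3
I2 = (trace(A) ** 2 - trace(matmul(A, A))) / 2  # sum of pairwise products
I3 = lam[0] * lam[1] * lam[2]                   # det of a triangular matrix

print(I1, I2, I3)  # 10.0 31.0 30.0
```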
Looking at the eigenvalues, this pattern seems to generalize for $n \times n$ matrices: $I_i$ is the sum over all products of $i$ distinct eigenvalues, i.e. a sum of $\binom{n}{i}$ terms, one per size-$i$ combination. (Note that $I_0 = 1$ could also be considered an invariant, as it is the coefficient of $\lambda^n$.) Is this way of combining variables known to mathematicians? It looks like an exotic "mean" that interpolates between summing and multiplying.
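The generalization I have in mind can be sketched directly (the eigenvalues here are an arbitrary example of my own; each $I_i$ is the sum over all size-$i$ subsets of the eigenvalues, so there are $\binom{n}{i}$ terms):

```python
# Sketch: I_i as a sum over all size-i subsets (combinations) of the
# eigenvalues, which has binom(n, i) terms.  Example eigenvalues.
from itertools import combinations
from math import comb, prod

lam = [2, 3, 5, 7]                       # example eigenvalues, n = 4
n = len(lam)

def invariant(vals, i):
    return sum(prod(c) for c in combinations(vals, i))

invariants = [invariant(lam, i) for i in range(n + 1)]
print(invariants)                        # [I_0, I_1, ..., I_n]

# The term count in each sum is binom(n, i):
term_counts = [len(list(combinations(lam, i))) for i in range(n + 1)]
print(term_counts)                       # [1, 4, 6, 4, 1]
```

Note that $I_0$ comes out as the empty product's sum, $1$, and $I_n$ as the full product, matching the coefficient of $\lambda^n$ and the determinant respectively.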
On the other hand, focusing on the traces and determinants, $I_2$ resembles a "trace-variance", in analogy with the usual variance from statistics, which employs expected values. Here I have similar questions: does this trace-function crop up in other areas of mathematics? Furthermore, I recall that $\det(A)$ can in general be expressed as a sum of traces of powers of $A$, which means that all these terms are generalizations of this "variance". Is there more known about this? (Such as how to quickly derive the trace-dependent formulation of some $I_i$.) References would also be very appreciated.
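For what it's worth, the recursion I have found for the trace-dependent formulation (stated here as an assumption to be confirmed, and checked numerically on example eigenvalues of my own) is: with $p_k = Tr(A^k)$, one has $i\, I_i = \sum_{k=1}^{i} (-1)^{k-1} I_{i-k}\, p_k$. A sketch:

```python
# Sketch of the recursion i * I_i = sum_{k=1..i} (-1)^(k-1) I_{i-k} p_k,
# where p_k = Tr(A^k) is the k-th power sum of the eigenvalues.
# Eigenvalues are an arbitrary example; result is cross-checked against
# the direct subset-product definition of I_i.
from itertools import combinations
from math import prod

lam = [2, 3, 5]
n = len(lam)

# Power sums p_1, ..., p_n (index 0 unused, kept as a placeholder).
p = [None] + [sum(x ** k for x in lam) for k in range(1, n + 1)]

I = [1]                                  # I_0 = 1
for i in range(1, n + 1):
    I.append(sum((-1) ** (k - 1) * I[i - k] * p[k]
                 for k in range(1, i + 1)) / i)

direct = [sum(prod(c) for c in combinations(lam, i)) for i in range(n + 1)]
print(I)        # [1, 10.0, 31.0, 30.0]
print(direct)   # [1, 10, 31, 30]
```

In particular $I_2 = (p_1^2 - p_2)/2$, the "trace-variance" above, and $I_3$ recovers $\det(A)$ purely from traces of powers.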
Addendum: I noticed another pattern: $(Tr(A)^2 - Tr(A^2))/2$ is what the determinant of $A$ would be if $A$ were a $2 \times 2$ matrix; likewise, $Tr(A)$ equals $\det(A)$ for a $1 \times 1$ matrix; and this pattern generalizes. Does this mean anything?
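Here is the kind of check that leads me to say the pattern generalizes (a sketch on example eigenvalues of my own, using the $3 \times 3$ determinant-from-traces formula $\det = (p_1^3 - 3 p_1 p_2 + 2 p_3)/6$ with $p_k = Tr(A^k)$, applied to an $n = 4$ spectrum, where it appears to return $I_3$):

```python
# Check: the k x k trace formula for det, evaluated on an n x n matrix
# (n >= k), returns I_k.  Shown for k = 3 via power sums of example
# eigenvalues; p_k = Tr(A^k) = sum of k-th powers of the eigenvalues.
from itertools import combinations
from math import prod

lam = [1, 2, 4, 6]                                # n = 4 example eigenvalues

p = [sum(x ** k for x in lam) for k in range(4)]  # p[1], p[2], p[3] used
I3_from_traces = (p[1] ** 3 - 3 * p[1] * p[2] + 2 * p[3]) / 6
I3_direct = sum(prod(c) for c in combinations(lam, 3))

print(I3_from_traces, I3_direct)  # 92.0 92
```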