Suppose $A$ is a (possibly complex) $n \times n$ matrix. Then $A$ has, counted with multiplicity, $n$ eigenvalues $\lambda_1, \dots, \lambda_n$. The matrix exponential $e^{A}$, also an $n \times n$ matrix, has eigenvalues $\mu_1, \dots, \mu_n$. It is clear from the definition of the matrix exponential and its absolute convergence that the values $e^{\lambda_i}$ appear among $\mu_1, \dots, \mu_n$.
However, is it also true that the multiplicities are the same? In other words, are $(\mu_i)$ and $(e^{\lambda_i})$ (slightly abusing notation here) equal as multisets? If so, how can we see that? I guess part of the work is to show that for each $j$ there is some $i$ such that $\mu_j = e^{\lambda_i}$.
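As a quick numerical sanity check of the conjecture (using NumPy/SciPy, not part of the question itself), one can compare the sorted eigenvalues of $e^A$ with the exponentials of the sorted eigenvalues of $A$ for a non-diagonalizable test matrix:

```python
import numpy as np
from scipy.linalg import expm

# A non-diagonalizable matrix: eigenvalue 3 with algebraic multiplicity 2
# (a 2x2 Jordan block) together with a simple eigenvalue 1.
A = np.array([[3.0, 1.0, 0.0],
              [0.0, 3.0, 0.0],
              [0.0, 0.0, 1.0]])

eig_A = np.sort_complex(np.linalg.eigvals(A))
eig_expA = np.sort_complex(np.linalg.eigvals(expm(A)))

# The eigenvalues of e^A are the e^{lambda_i}, multiplicities included.
assert np.allclose(eig_expA, np.exp(eig_A))
```

Of course this only checks one example to floating-point precision; the answer below gives the actual proof.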
Quoting my comment:
This argument can be made fully rigorous as follows.
Step 1: Show that the algebraic multiplicities match for diagonalizable matrices: the exponential of a diagonal matrix $\text{diag} \{ \lambda_i \}$ is the diagonal matrix $\text{diag} \{ \exp(\lambda_i) \}$, and the identity $\exp(P D P^{-1}) = P \exp(D) P^{-1}$ reduces the general diagonalizable case to the diagonal one.
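Step 1 can be checked numerically (a sketch using SciPy's `expm`; the matrices here are arbitrary choices for illustration):

```python
import numpy as np
from scipy.linalg import expm

lam = np.array([2.0, -1.0, 0.5])
D = np.diag(lam)

# Exponential of a diagonal matrix: exponentiate the diagonal entrywise.
assert np.allclose(expm(D), np.diag(np.exp(lam)))

# The diagonalizable case reduces to this via exp(P D P^{-1}) = P exp(D) P^{-1}.
rng = np.random.default_rng(0)
P = rng.standard_normal((3, 3))  # invertible with probability 1
A = P @ D @ np.linalg.inv(P)
assert np.allclose(expm(A), P @ np.diag(np.exp(lam)) @ np.linalg.inv(P))
```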
Step 2: Show that the characteristic polynomial $\det(\lambda I - \exp(M))$ of the exponential of a matrix $M$ is a continuous function of $M$ (since it's a composition of two continuous functions). This says, concretely, that each coefficient of the characteristic polynomial is a continuous (and even smooth, and even analytic) function of the matrix entries of $M$.
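The continuity in Step 2 can be illustrated numerically: perturbing $M$ slightly perturbs the coefficients of $\det(\lambda I - \exp(M))$ only slightly (the random matrices and tolerance here are illustrative choices, not part of the proof):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4))
E = rng.standard_normal((4, 4))

# np.poly of a square matrix returns the coefficients of its
# characteristic polynomial, det(lambda I - exp(M)), highest degree first.
c = np.poly(expm(M))
c_eps = np.poly(expm(M + 1e-8 * E))

# A tiny perturbation of M changes the coefficients only slightly.
assert np.allclose(c, c_eps, atol=1e-4)
```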
Step 3: Show that a matrix $M$ can be approximated by a sequence of diagonalizable matrices $D_i$ whose eigenvalues converge to those of $M$. This can be done most explicitly by using the Jordan normal form of $M$ and perturbing its diagonal entries slightly so that they are all distinct.
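The perturbation in Step 3 is easy to make concrete. A sketch for a single Jordan block (for a general $M$ one would do this to the diagonal of its full Jordan form):

```python
import numpy as np

# A 3x3 Jordan block: eigenvalue 2 with algebraic multiplicity 3,
# not diagonalizable.
J = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 2.0]])

eps = 1e-6
# Shift the diagonal entries by distinct multiples of eps: the result is
# upper triangular with three distinct eigenvalues, hence diagonalizable,
# while staying O(eps)-close to J.
D = J + eps * np.diag([0.0, 1.0, 2.0])

assert len(set(np.diag(D))) == 3          # distinct eigenvalues
assert np.max(np.abs(D - J)) <= 2 * eps   # close to J
```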
Step 4: Approximate an arbitrary matrix $M$ by diagonalizable matrices $D_i$ as above and use the continuity of the characteristic polynomial to argue that, since the characteristic polynomials $\det(\lambda I - \exp(D_i))$ converge to $\det(\lambda I - \exp(M))$ and since, by construction, the eigenvalues of the $D_i$ converge to those of $M$, it follows that $\det(\lambda I - \exp(M)) = \prod_{i=1}^n (\lambda - \exp(\lambda_i))$ as desired.
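The conclusion of Step 4 can be verified numerically for a concrete non-diagonalizable $M$ (a sketch; `np.poly` applied to a list of roots builds the monic polynomial with those roots):

```python
import numpy as np
from scipy.linalg import expm

# A non-diagonalizable M: Jordan block, eigenvalue 2 with multiplicity 2.
M = np.array([[2.0, 1.0],
              [0.0, 2.0]])

lhs = np.poly(expm(M))                     # det(lambda I - exp(M))
rhs = np.poly([np.exp(2.0), np.exp(2.0)])  # prod_i (lambda - exp(lambda_i))
assert np.allclose(lhs, rhs)
```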
The point of writing the argument in this slightly indirect way is to avoid having to prove that the roots of a polynomial are a continuous function of its coefficients; since the roots don't come in a prescribed order, especially over $\mathbb{C}$, it's a little tricky to make this rigorous (although it can be made rigorous and it is true), and so it's easier to just argue with the characteristic polynomial instead.
This density argument is a powerful technique and makes precise the intuition that statements like this "have to be true" because they're true for diagonalizable matrices. The same or similar density arguments can prove lots of other facts, for example that if $A, B$ are two square matrices then $AB$ and $BA$ have the same characteristic polynomial (note that they aren't necessarily similar).
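For instance, the $AB$ versus $BA$ fact is easy to check numerically, including the standard example where the two products are not similar (a sketch with arbitrary test matrices):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# AB and BA share a characteristic polynomial.
assert np.allclose(np.poly(A @ B), np.poly(B @ A))

# They need not be similar: here AB = 0 while BA is a nonzero nilpotent,
# yet both have characteristic polynomial lambda^2.
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[1.0, 0.0], [0.0, 0.0]])
assert np.allclose(A @ B, np.zeros((2, 2)))
assert not np.allclose(B @ A, np.zeros((2, 2)))
assert np.allclose(np.poly(A @ B), np.poly(B @ A))
```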
This particular density argument shows more generally that if $f$ is a holomorphic function then the eigenvalues of $f(M)$ are $f$ applied to the eigenvalues of $M$, with matching algebraic multiplicities; this is a slight enhancement of the spectral mapping theorem for the holomorphic functional calculus.
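To illustrate the general spectral mapping statement with a holomorphic $f$ other than $\exp$, here is a sketch with $f = \cos$ (SciPy provides `cosm`; the test matrix is an arbitrary choice with a defective eigenvalue):

```python
import numpy as np
from scipy.linalg import cosm

# Triangular M: eigenvalue 0 with algebraic multiplicity 2 (defective)
# and a simple eigenvalue 3.
M = np.array([[0.0, 1.0, 2.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 3.0]])

# Eigenvalues of cos(M) should be cos(0), cos(0), cos(3),
# with matching algebraic multiplicities.
eig = np.sort_complex(np.linalg.eigvals(cosm(M)))
expected = np.sort_complex(np.cos(np.array([0.0, 0.0, 3.0])))
assert np.allclose(eig, expected)
```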