Consider two (Hermitian) matrices $A$ and $B$. Is there a nice expression for the following?
$$ \boxed{ \frac{\mathrm d}{\mathrm d x} \exp\left( A + x B \right)\big|_{x=0} = \; ? }$$
Of course, if $A$ and $B$ commute, this is simply $B \exp{(A)}$.
One thing I tried was the Suzuki-Trotter formula: \begin{align} \boxed{\frac{\mathrm d}{\mathrm d x} \exp\left( A + x B \right)\big|_{x=0}} &= \frac{\mathrm d}{\mathrm d x} \left. \left( \lim_{N \to \infty} \left[ \exp\left( \frac{A}{N} \right) \exp \left( x \frac{B}{N} \right) \right]^N \right) \right|_{x=0} \\ &= \lim_{N\to \infty} \sum_{n=1}^N \exp\left( \frac{n}{N} A \right) \frac{B}{N} \exp\left( \frac{N-n}{N} A \right) \\ &= \left( \lim_{N \to \infty} \frac{1}{N} \sum_{n=1}^N e^{\frac{n}{N}A }B\; e^{-\frac{n}{N}A } \right) e^A \\ &= \boxed{ \int_0^1 e^{t A} B \;e^{(1-t)A} \; \mathrm d t } \; . \end{align} Is this as close as it gets to a closed form?
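As a quick numerical sanity check (my own sketch, not part of the derivation), the integral formula can be compared against a central finite difference of $x \mapsto \exp(A + xB)$ at $x=0$, using `scipy.linalg.expm` and a midpoint-rule quadrature:

```python
# A sketch: verify the formula  ∫_0^1 e^{tA} B e^{(1-t)A} dt  numerically
# against a central finite difference of x -> exp(A + xB) at x = 0.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)

def random_hermitian(n):
    """A random complex Hermitian matrix."""
    M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return (M + M.conj().T) / 2

n = 4
A, B = random_hermitian(n), random_hermitian(n)

# Midpoint-rule quadrature of the integral formula.
N = 2000
integral = sum(
    expm(t * A) @ B @ expm((1 - t) * A) for t in (np.arange(N) + 0.5) / N
) / N

# Central finite difference of exp(A + xB) at x = 0.
h = 1e-6
fd = (expm(A + h * B) - expm(A - h * B)) / (2 * h)

print(np.max(np.abs(integral - fd)))  # agrees to quadrature accuracy
```

With these (arbitrary) test matrices the two results agree to well below $10^{-3}$ in max-norm.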
One thing we can do is go to the eigenbasis of $A$, where we can explicitly perform the integration over $t$. If we index the eigenvectors of $A$ by $i$, with corresponding eigenvalues $\lambda_i$, then we can express the answer in this basis: \begin{equation} \boxed{ \left( \frac{\mathrm d}{\mathrm d x} \exp\left( A + x B \right)\big|_{x=0} \right)_{ij} = \frac{e^{\lambda_i}-e^{\lambda_j}}{\lambda_i-\lambda_j} B_{ij}} \;, \end{equation} where $(\cdot)_{ij}$ are the entries of a matrix in the eigenbasis of $A$. (Note that if $\lambda_i = \lambda_j$, we replace $\frac{e^{\lambda_i}-e^{\lambda_j}}{\lambda_i-\lambda_j} \to e^{\lambda_i}$, which is also consistent with l'Hôpital's rule.)
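This entrywise recipe (it is the Daleckii–Krein formula, specialized to $f = \exp$) is easy to test numerically. Here is a sketch of my own: diagonalize $A$ with `scipy.linalg.eigh`, apply the divided-difference coefficients entrywise to $B$ in that basis, and compare against a finite-difference derivative:

```python
# A sketch: the divided-difference (Daleckii–Krein) recipe, checked against
# a finite-difference derivative of exp(A + xB) at x = 0.
import numpy as np
from scipy.linalg import expm, eigh

rng = np.random.default_rng(1)
n = 4
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (M + M.conj().T) / 2                     # Hermitian A
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = (M + M.conj().T) / 2                     # Hermitian B

lam, U = eigh(A)                             # A = U diag(lam) U†
Bt = U.conj().T @ B @ U                      # B in the eigenbasis of A

# Coefficients (e^{λ_i} - e^{λ_j})/(λ_i - λ_j), with e^{λ_i} when λ_i = λ_j.
Li, Lj = np.meshgrid(lam, lam, indexing="ij")
with np.errstate(divide="ignore", invalid="ignore"):
    coeff = (np.exp(Li) - np.exp(Lj)) / (Li - Lj)
equal = np.isclose(Li, Lj)
coeff[equal] = np.exp(Li[equal])

deriv = U @ (coeff * Bt) @ U.conj().T        # back to the original basis

h = 1e-6
fd = (expm(A + h * B) - expm(A - h * B)) / (2 * h)
print(np.max(np.abs(deriv - fd)))            # small
```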
Not an answer, but here is an alternative, algebraic, elementary approach to the final formula you have. It rests on the observation that if $T:V\to W$ is a linear transformation, $(v_{\alpha})$ a basis for $V$, $(w_{\beta})$ a basis for $W$, and $(w^*_{\gamma})$ the corresponding dual basis, then if $[T]$ is the matrix of $T$ with respect to our bases, $[T]_{ij}=w_i^*Tv_j$.
Using the product rule for matrices, $\frac{d}{dx}(M(x)N(x))=M'(x)N(x)+M(x)N'(x)$, together with induction, we get the formula
$$\frac{d}{dx}(M(x)^n)=\sum_{\substack{0\leq i,j \\i+j=n-1}}M(x)^iM'(x)M(x)^j.$$
Therefore $$\left.\frac{d}{dx}(A+Bx)^n\right\rvert_{x=0}=\sum_{i+j=n-1}A^iBA^j,$$ and so $$C:=\left.\frac{d}{dx}e^{A+Bx}\right\rvert_{x=0}=\sum_n\sum_{i+j=n-1}\frac{A^iBA^j}{n!}.$$
Let $u,v$ be left and right eigenvectors of $A$ so that $uA=\mu u$, $Av=\lambda v$. (Sidedness isn't necessary when working with symmetric matrices over $\mathbb R$, but I want to leave open the possibility of working with not-necessarily symmetric matrices). Then
$$uCv=(uBv)\sum_n\sum_{i+j=n-1}\frac{\mu^i\lambda^j}{n!}.$$
By the identity $a^n-b^n=(a-b)\displaystyle \sum_{i+j=n-1}a^ib^j$, we have
$$(\mu-\lambda)uCv=(uBv)\sum_n\frac{\mu^n-\lambda^n}{n!}=(uBv)(e^{\mu}-e^{\lambda}).$$
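Here is a numerical check of this identity for a non-symmetric real $A$ (my own sketch; `scipy.linalg.eig` returns left eigenvectors $v_\ell$ with $v_\ell^* A = \mu v_\ell^*$, and $C$ is approximated by central finite differences):

```python
# A sketch: check  (μ - λ) u C v = (u B v)(e^μ - e^λ)  for non-symmetric A.
import numpy as np
from scipy.linalg import expm, eig

rng = np.random.default_rng(3)
d = 4
A = rng.standard_normal((d, d))
B = rng.standard_normal((d, d))

# C ≈ d/dx exp(A + xB) at x = 0, via central finite differences.
h = 1e-6
C = (expm(A + h * B) - expm(A - h * B)) / (2 * h)

w, VL, VR = eig(A, left=True, right=True)
mu, lam = w[0], w[1]                 # generically distinct (possibly complex)
u = VL[:, 0].conj()                  # left eigenvector:  u A = μ u
v = VR[:, 1]                         # right eigenvector: A v = λ v

lhs = (mu - lam) * (u @ C @ v)
rhs = (u @ B @ v) * (np.exp(mu) - np.exp(lam))
print(abs(lhs - rhs))                # small
```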
Actually, here is an extension of this idea which sort of gives a formula. Given $A\in \operatorname{End}(V)$, define $L_A,R_A\in \operatorname{End}(\operatorname{End}(V))$ by $L_A(B)=AB$, $R_A(B)=BA$. Then $L_A$ and $R_A$ commute. We can then write
$$C=\sum_n\sum_{i+j=n-1}\frac{A^iBA^j}{n!}=\left(\sum_n\sum_{i+j=n-1}\frac{L_A^iR_A^j}{n!}\right)B.$$
If we multiply this on the left by $\operatorname{ad}_A=L_A-R_A$, the same algebra as above yields $$[A,C]=e^{L_A}(B)-e^{R_A}(B)=e^AB-Be^A=[e^A,B].$$
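The commutator identity $[A,C]=[e^A,B]$ is also easy to verify numerically (my sketch, with $C$ again approximated by central finite differences):

```python
# A sketch: verify  [A, C] = [e^A, B]  numerically, with C approximated by
# a central finite difference of exp(A + xB) at x = 0.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(4)
d = 4
A = rng.standard_normal((d, d))
B = rng.standard_normal((d, d))

h = 1e-6
C = (expm(A + h * B) - expm(A - h * B)) / (2 * h)

eA = expm(A)
lhs = A @ C - C @ A                  # [A, C]
rhs = eA @ B - B @ eA                # [e^A, B]
print(np.max(np.abs(lhs - rhs)))     # small
```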
This formula only determines $C$ up to a map commuting with $A$, but there may be another way to make use of this to find a formula for $C$ itself.