Take some $d,C\in\mathbb{C}$ and let
$$A_{n}=\sum_{m=0}^{n}\binom{n}{m}\left(1+\frac{m}{n}\right)^{d}C^{m}.$$
I want to calculate the asymptotics as $n$ tends to $\infty$. Numerical experiments suggest, and I have proven for $d\in\mathbb{Z}_{\geq0}$, that the leading order is given by
$$A_{n}\sim (C+1)^{n}\left(1+\frac{C}{C+1}\right)^{d}.$$
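For what it's worth, here is the kind of numerical experiment I mean (a quick Python sketch; the complex values of $d$ and $C$ are arbitrary test values):

```python
from math import comb

def A(n, d, C):
    """Direct evaluation of A_n = sum_m binom(n, m) (1 + m/n)^d C^m."""
    return sum(comb(n, m) * (1 + m / n) ** d * C ** m for m in range(n + 1))

# arbitrary complex test values
d, C = 1.5 + 0.5j, 0.3 + 0.2j

def leading(n):
    """Conjectured leading order (C + 1)^n (1 + C/(C+1))^d."""
    return (C + 1) ** n * (1 + C / (C + 1) ) ** d

for n in (25, 100, 400):
    # the relative error shrinks roughly like 1/n
    print(n, abs(A(n, d, C) / leading(n) - 1))
```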
My proof for $d\in\mathbb{Z}_{\geq0}$ used
$$A_{n}=(C+1)^{n}\sum_{k=0}^{n}\binom{d}{k}\left(\frac{C}{C+1}\right)^{k}\sum_{r=0}^{\infty}\frac{\binom{d-k}{r}(n)_{k}S(r+k,k)}{\binom{r+k}{r}n^{r+k}}$$
where $(n)_k$ is the falling factorial and $S(r+k,k)$ are the Stirling numbers of the second kind; since $d\in\mathbb{Z}_{\geq0}$, both sums are finite, running over $k=0,\ldots,d$ and $r=0,\ldots,d-k$.
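For reference, this closed form is easy to check numerically for small integer $d$ (a quick sketch with arbitrary $n$ and $C$; the Stirling numbers are computed by the standard recurrence):

```python
from math import comb
from functools import lru_cache

@lru_cache(maxsize=None)
def S(n, k):
    """Stirling numbers of the second kind via S(n,k) = k S(n-1,k) + S(n-1,k-1)."""
    if n == k:
        return 1
    if k == 0 or k > n:
        return 0
    return k * S(n - 1, k) + S(n - 1, k - 1)

def falling(n, k):
    """Falling factorial (n)_k = n (n-1) ... (n-k+1)."""
    out = 1
    for i in range(k):
        out *= n - i
    return out

def A_direct(n, d, C):
    """A_n summed directly from the definition."""
    return sum(comb(n, m) * (1 + m / n) ** d * C ** m for m in range(n + 1))

def A_closed(n, d, C):
    """The finite double sum, valid for integer d >= 0."""
    total = 0
    for k in range(d + 1):
        inner = sum(comb(d - k, r) * falling(n, k) * S(r + k, k)
                    / (comb(r + k, r) * n ** (r + k)) for r in range(d - k + 1))
        total += comb(d, k) * (C / (C + 1)) ** k * inner
    return (C + 1) ** n * total

n, d, C = 20, 3, 0.4 + 0.7j
# the two evaluations agree to rounding error
print(abs(A_direct(n, d, C) - A_closed(n, d, C)) / abs(A_direct(n, d, C)))
```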
Any ideas would be great! Thanks in advance!
If you take out a factor $(C+1)^n$, this is
$$ (C+1)^n\sum_{m=0}^n\binom nm\left(1+\frac mn\right)^d\left(\frac C{C+1}\right)^m\left(\frac1{C+1}\right)^{n-m}\;. $$
Setting $p=\frac C{C+1}$ and $1-p=\frac1{C+1}$ shows that this is an expectation with a “complex-valued binomial distribution”:
$$ (C+1)^n\sum_{m=0}^n\left(1+\frac mn\right)^d\binom nmp^m(1-p)^{n-m}\;. $$
With increasing $n$, the distribution gets concentrated around the “mean” $np$, so we get a first approximation by substituting $p=\frac C{C+1}$ for $\frac mn$:
$$ (C+1)^n\left(1+\frac C{C+1}\right)^d\;. $$
This can be used to systematically derive the asymptotic expansion in inverse powers of $n$ by expanding the “random variable” in powers of the deviation from the mean:
\begin{eqnarray} \left(1+\frac mn\right)^d &=&\left(1+p+\frac mn-p\right)^d \\ &=& (1+p)^d\sum_{k=0}^\infty\binom dk\left(\frac{m-np}{n(1+p)}\right)^k\;, \end{eqnarray}
which yields the sum in terms of the central moments $\mu_k$ of the binomial distribution:
$$ (C+1)^n(1+p)^d\sum_{k=0}^\infty\binom dk(n(1+p))^{-k}\mu_k(n,p) \\ = (C+1)^n\left(1+\frac C{C+1}\right)^d\sum_{k=0}^\infty\binom dk\left(n\left(1+\frac C{C+1}\right)\right)^{-k}\mu_k\left(n,\frac C{C+1}\right)\;. $$
(For $d\in\mathbb{Z}_{\geq0}$ the series terminates at $k=d$.)
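As a sanity check, the binomial series used here converges for complex $d$ whenever $\left|\frac{m-np}{n(1+p)}\right|<1$; a quick numerical sketch (the values of $d$, $p$, $n$, $m$ are arbitrary test values):

```python
def binom_c(d, k):
    """Generalized binomial coefficient binom(d, k) for complex d."""
    out = 1.0
    for i in range(k):
        out = out * (d - i) / (i + 1)
    return out

# arbitrary test values with |(m - n p) / (n (1 + p))| < 1
d, p, n, m = 1.3 + 0.8j, 0.3, 10, 5
x = (m - n * p) / (n * (1 + p))
# the partial sums reproduce (1 + m/n)^d
series = (1 + p) ** d * sum(binom_c(d, k) * x ** k for k in range(60))
print(abs(series - (1 + m / n) ** d))
```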
The first six central moments of the binomial distribution are given at Wikipedia, and a recurrence relation for them is derived in *Closed-Form Expressions for the Moments of the Binomial Probability Distribution* by Andreas Knoblauch. As might be expected from the linear growth of the variance with $n$, the $k$-th central moment grows like the $\left\lfloor\frac k2\right\rfloor$-th power of $n$, so we need two further central moments to gain each inverse power of $n$ in the expansion. Up to $O \left(n^{-2}\right)$, with the well-known central moments $\mu_0=1$, $\mu_1=0$ and $\mu_2=np(1-p)$, the result is
$$ (C+1)^n\left(1+\frac C{C+1}\right)^d\left(1+\frac1n\binom d2\frac C{(2C+1)^2}+O\left(\frac1{n^2}\right)\right)\;. $$
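Numerically the $\frac1n$ correction behaves as expected (my own sketch; $d$ and $C$ are arbitrary complex test values, with $\binom d2=\frac{d(d-1)}2$ for complex $d$):

```python
from math import comb

def A(n, d, C):
    """Direct evaluation of A_n."""
    return sum(comb(n, m) * (1 + m / n) ** d * C ** m for m in range(n + 1))

# arbitrary complex test values
d, C = 2.5 - 0.5j, 0.6 + 0.3j

def lead(n):
    """Leading order (C + 1)^n (1 + C/(C+1))^d."""
    return (C + 1) ** n * (1 + C / (C + 1)) ** d

def corrected(n):
    """Leading order times the 1 + binom(d,2) C / ((2C+1)^2 n) correction."""
    return lead(n) * (1 + d * (d - 1) / 2 * C / ((2 * C + 1) ** 2 * n))

for n in (100, 200, 400):
    # the 1/n correction reduces the relative error from O(1/n) to O(1/n^2)
    print(n, abs(A(n, d, C) / lead(n) - 1), abs(A(n, d, C) / corrected(n) - 1))
```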
Note that while the binomial distribution is usually only used for $p\in[0,1]$, the central moments are purely algebraic quantities that have the same algebraic form for any $p\in\mathbb C$.
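To illustrate that last point, the central moments can be evaluated directly from the definition for any complex $p$, and they satisfy the same closed forms as in the real case (a quick sketch; $n$ and $p$ are arbitrary):

```python
from math import comb

def mu(k, n, p):
    """k-th central moment of Binomial(n, p), directly from the definition;
    this is a polynomial identity in p, so complex p is allowed."""
    return sum(comb(n, m) * p ** m * (1 - p) ** (n - m) * (m - n * p) ** k
               for m in range(n + 1))

n, p = 30, 0.3 + 0.4j  # a "probability" outside [0, 1]
print(abs(mu(0, n, p) - 1))                              # ~ 0
print(abs(mu(1, n, p)))                                  # ~ 0
print(abs(mu(2, n, p) - n * p * (1 - p)))                # ~ 0
print(abs(mu(3, n, p) - n * p * (1 - p) * (1 - 2 * p)))  # ~ 0
```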