I am currently working on trying to prove that $$\lim_{n \rightarrow \infty} \left( 1 + \frac{1}{n} \right)^n = \sum_{k=0}^\infty \frac{1}{k!}.$$
Here is my progress so far.
$$\lim_{n \rightarrow \infty} \left( 1 + \frac{1}{n} \right)^n = \lim_{n \rightarrow \infty} \sum_{k=0}^n \frac{n!}{k! (n-k)!} \left(\frac{1}{n} \right)^k$$
$$\lim_{n \rightarrow \infty} \left( 1 + \frac{1}{n} \right)^n = \lim_{n \rightarrow \infty} \sum_{k=0}^n\frac{1}{k!} \cdot \frac{n}{n} \cdot \frac{n-1}{n} \cdot \frac{n-2}{n} \cdots \frac{n-(k-1)}{n}$$
$$\lim_{n \rightarrow \infty} \left( 1 + \frac{1}{n} \right)^n = \lim_{n \rightarrow \infty} \sum_{k=0}^n\frac{1}{k!} \left( 1 - \frac{1}{n} \right )\left( 1 - \frac{2}{n} \right ) \cdots \left( 1 - \frac{k-1}{n} \right )$$
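Not part of the proof, but as a quick numerical sanity check: for a sample $n$, the binomial expansion and the rewritten product form above both agree with $(1+1/n)^n$ up to floating-point error. A minimal Python sketch (function names are mine):

```python
from math import comb, factorial

def binomial_form(n):
    """Sum of C(n, k) * (1/n)**k, i.e. the binomial expansion of (1 + 1/n)**n."""
    return sum(comb(n, k) * (1 / n) ** k for k in range(n + 1))

def product_form(n):
    """Sum of (1/k!) * (1 - 1/n)(1 - 2/n)...(1 - (k-1)/n) for k = 0..n."""
    total = 0.0
    for k in range(n + 1):
        term = 1.0
        for i in range(1, k):  # empty product (= 1) for k = 0 and k = 1
            term *= 1 - i / n
        total += term / factorial(k)
    return total

n = 50
assert abs(binomial_form(n) - (1 + 1 / n) ** n) < 1e-12
assert abs(product_form(n) - (1 + 1 / n) ** n) < 1e-12
```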
For all $k > n$, one of the factors of the form $\left( 1 - \frac{i}{n} \right )$ (namely $i = n$) is exactly zero, so $\left( 1 - \frac{1}{n} \right )\left( 1 - \frac{2}{n} \right ) \cdots \left( 1 - \frac{k-1}{n} \right ) = 0$. We can therefore extend the sum to infinity without affecting the result.
$$\lim_{n \rightarrow \infty} \left( 1 + \frac{1}{n} \right)^n = \lim_{n \rightarrow \infty} \sum_{k=0}^\infty \frac{1}{k!} \left( 1 - \frac{1}{n} \right )\left( 1 - \frac{2}{n} \right ) \cdots \left( 1 - \frac{k-1}{n} \right )$$
$$\lim_{n \rightarrow \infty} \left( 1 + \frac{1}{n} \right)^n = \lim_{n \rightarrow \infty} \lim_{m \rightarrow \infty} \sum_{k=0}^m \frac{1}{k!} \left( 1 - \frac{1}{n} \right )\left( 1 - \frac{2}{n} \right ) \cdots \left( 1 - \frac{k-1}{n} \right )$$
At this point, I'd like to swap the limits, which would let the sum reduce to $\sum \frac{1}{k!}$ as desired. To my knowledge, this is only justified when the hypotheses of the Moore-Osgood theorem are satisfied: writing $f(m,n) = \sum_{k=0}^m \frac{1}{k!} \left( 1 - \frac{1}{n} \right )\left( 1 - \frac{2}{n} \right ) \cdots \left( 1 - \frac{k-1}{n} \right)$, at least one of $\lim_{m\rightarrow \infty} f(m,n)$ or $\lim_{n\rightarrow \infty} f(m,n)$ must converge uniformly (in the other variable) in order to swap the $m \rightarrow \infty$ and $n \rightarrow \infty$ limits above.
If we define $g(n) = \sum_{k=0}^n \frac{1}{k!} \left( 1 - \frac{1}{n} \right )\left( 1 - \frac{2}{n} \right ) \cdots \left( 1 - \frac{k-1}{n} \right )$, it is fairly evident that $\lim_{m \rightarrow \infty} f(m,n) = g(n)$ pointwise, but only uniform convergence would be sufficient for the Moore-Osgood Theorem.
To show uniform convergence here, one would need to prove that $\forall \varepsilon > 0 : \exists M : \forall m \geq M : \forall n$ we have that $|f(m,n) - g(n)| < \varepsilon$. I managed to deduce that
$$|f(m,n) - g(n)| = \begin{cases} \displaystyle\sum_{k=m+1}^n \frac{1}{k!} \left( 1 - \frac{1}{n} \right )\left( 1 - \frac{2}{n} \right ) \cdots \left( 1 - \frac{k-1}{n} \right ) & \text{if } m < n, \\ 0 & \text{otherwise.} \end{cases}$$
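For concreteness (not a proof step), this claimed expression for $|f(m,n) - g(n)|$ can be checked numerically with a short Python sketch (function names are mine):

```python
from math import factorial

def prod_term(k, n):
    """(1 - 1/n)(1 - 2/n)...(1 - (k-1)/n); empty product = 1 for k = 0, 1."""
    p = 1.0
    for i in range(1, k):
        p *= 1 - i / n
    return p

def f(m, n):
    return sum(prod_term(k, n) / factorial(k) for k in range(m + 1))

def g(n):
    return sum(prod_term(k, n) / factorial(k) for k in range(n + 1))

def claimed_tail(m, n):
    # the stated formula: a tail sum when m < n, and zero otherwise
    if m >= n:
        return 0.0
    return sum(prod_term(k, n) / factorial(k) for k in range(m + 1, n + 1))

for m, n in [(3, 10), (10, 3), (7, 7)]:
    assert abs(abs(f(m, n) - g(n)) - claimed_tail(m, n)) < 1e-12
```

Note that for $m \geq n$ the difference really is zero, since every term of $f$ beyond $k = n$ contains the factor $1 - n/n = 0$.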
At this point I feel like I must be really close, but my brain is a bit fried trying to show that this is less than $\varepsilon$ for an arbitrary $\varepsilon > 0$, uniformly over all $m \geq M$ and all $n$ for a suitable choice of $M$. Is this even possible, and could anyone provide a next step or a hint of where to go from here?
To avoid confusion, the expression $\left( 1 - \frac{1}{n} \right )\left( 1 - \frac{2}{n} \right ) \cdots \left( 1 - \frac{k-1}{n} \right )$ should be replaced with
$$\alpha_{kn} = \begin{cases}1, &k=0,1\\ \left( 1 - \frac{1}{n} \right )\left( 1 - \frac{2}{n} \right ) \cdots \left( 1 - \frac{k-1}{n} \right ), &2\leqslant k \leqslant n\\0,&k > n\end{cases},$$ and using the binomial theorem we have
$$x_n :=\left(1+\frac{1}{n}\right)^n = \sum_{k=0}^n\frac{n!}{k!(n-k)!}\frac{1}{n^k} = 1+1 + \frac{1}{2!} \left(1 - \frac{1}{n} \right)+ \ldots +\frac{1}{n!} \left(1 - \frac{1}{n} \right)\cdots \left(1 - \frac{n-1}{n} \right)=\sum_{k=0}^n \frac{\alpha_{kn}}{k!}$$
You are trying to prove that $\lim_{n \to \infty} x_n = \sum_{k=0}^\infty \frac{1}{k!} = e$. Since you prefer viewing this in terms of a double limit, we define (using your notation)
$$f(m,n) = \sum_{k=0}^m \frac{\alpha_{kn}}{k!},$$
and since $\lim_{n\to \infty}\alpha_{kn} = 1$ for $k\leqslant m $ it follows that
$$\tag{1}\lim_{n \to \infty}f(m,n) = \sum_{k=0}^m \frac{1}{k!} $$
Also, since $0 \leqslant \alpha_{kn}\leqslant 1$, it follows for all $m < n$ that
$$\tag{2}f(m,n) \leqslant x_n = f(n,n) \leqslant \sum_{k=0}^n \frac{1}{k!}$$
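A short numerical check of inequality (2), using the $\alpha_{kn}$ defined above (a sketch for one sample pair $m < n$; the names are illustrative):

```python
from math import factorial

def alpha(k, n):
    """alpha_{kn} as defined above: 1 for k = 0, 1; zero for k > n."""
    if k <= 1:
        return 1.0
    if k > n:
        return 0.0
    p = 1.0
    for i in range(1, k):
        p *= 1 - i / n
    return p

def f(m, n):
    return sum(alpha(k, n) / factorial(k) for k in range(m + 1))

m, n = 10, 100
x_n = (1 + 1 / n) ** n
partial_sum = sum(1 / factorial(k) for k in range(n + 1))
assert f(m, n) <= x_n <= partial_sum  # inequality (2)
```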
Hence, one iterated limit of $f(m,n)$ is
$$ \tag{3}\lim_{m \to \infty}\lim_{n \to \infty}f(m,n) = \lim_{m \to \infty}\sum_{k=0}^m \frac{1}{k!} = \sum_{k=0}^\infty \frac{1}{k!}$$
No swapping of limits is needed to prove the desired result since from (2) we have
$$\lim_{n \to \infty} f(m,n) \leqslant \liminf_{n \to \infty}x_n \leqslant \limsup_{n \to \infty}x_n \leqslant \lim_{n \to \infty} \sum_{k=0}^n\frac{1}{k!} = \sum_{k=0}^\infty \frac{1}{k!},$$
and using (3) it follows that
$$\sum_{k=0}^\infty \frac{1}{k!} \leqslant \liminf_{n \to \infty}x_n \leqslant \limsup_{n \to \infty}x_n \leqslant \sum_{k=0}^\infty \frac{1}{k!}$$
This proves that the limit of $x_n$ exists and
$$\lim_{n \to \infty} \left(1+\frac{1}{n}\right)^n = \lim_{n \to \infty}x_n = \sum_{k=0}^\infty \frac{1}{k!}$$
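Numerically, both sequences do approach the same value; a small Python sketch just to illustrate the rate (the $e/(2n)$ comment is an informal observation, not part of the argument):

```python
from math import e, factorial

series = sum(1 / factorial(k) for k in range(25))  # = e to double precision
assert abs(series - e) < 1e-12

for n in (10, 100, 1000):
    x_n = (1 + 1 / n) ** n
    print(n, series - x_n)  # gap is roughly e / (2n), shrinking as n grows
```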
Regarding uniform convergence
Even though it is not needed, we can show that $f(m,n)$ converges uniformly as $m \to \infty$.
Since $\alpha_{kn} = 0$ for $k > n$, we have
$$\lim_{m \to \infty}f(m,n) = \sum_{k=0}^\infty \frac{\alpha_{kn}}{k!} = \sum_{k=0}^n \frac{\alpha_{kn}}{k!}$$
Hence, for $n \leqslant m$
$$\left|\sum_{k=m+1}^\infty\frac{\alpha_{kn}}{k!} \right|= 0$$
and for all $n > m$
$$\left|\sum_{k=m+1}^\infty\frac{\alpha_{kn}}{k!} \right| < \sum_{k=m+1}^\infty \frac{1}{k!}$$
Since the RHS is the tail of a convergent series and does not depend on $n$, uniform convergence follows.