Prove the following limit: $$ \lim_{n\to\infty} \left(1+\sum_{k=1}^n \frac{(-1)^k}{k!}\right) = {1\over e} $$
The tools I have at my disposal so far are: $$ \lim_{n\to\infty}\left(1+{1\over n}\right)^n = e \\ \lim_{n\to\infty}\left(1-{1\over n}\right)^n = {1\over e} \\ \lim_{n\to\infty}\sum_{k=0}^n {1\over k!} = e $$
I would like to avoid using $$ \lim_{n\to\infty} \sum_{k=0}^n {x^k \over k!} = e^x $$ because that is essentially what I want to prove. Derivatives (Taylor series) are not allowed either. This problem is from the limits section.
My argument is as follows. Define $x_n$: $$ x_n = 1 + \sum_{k=1}^n \frac{(-1)^k}{k!} $$ Define $y_n$: $$ y_n = \left(1-{1\over n}\right)^n $$
Let's see whether $x_n$ converges. Try the Cauchy criterion for $m>n\ge 1$: $$ \begin{align} |x_m - x_n| &= \left|\sum_{k=n+1}^m{(-1)^k\over k!}\right| \\ &\le \sum_{k=n+1}^m{1\over k!} \\ &\le \sum_{k=n+1}^m{1\over k(k-1)} && \text{since } k! \ge k(k-1) \text{ for } k\ge 2 \\ &= \sum_{k=n+1}^m\left({1\over k-1}-{1\over k}\right) \\ &= {1\over n} - {1\over m} < {1\over n} \end{align} $$ which is less than $\epsilon$ once $n > 1/\epsilon$.
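Not part of the proof, but here is a quick numerical sanity check of the telescoping bound $|x_m - x_n| \le {1\over n} - {1\over m}$ (the helper name `x` is my own):

```python
from math import factorial

def x(n):
    """Partial sum x_n = 1 + sum_{k=1}^n (-1)^k / k!."""
    return 1 + sum((-1) ** k / factorial(k) for k in range(1, n + 1))

# For m > n >= 2, the Cauchy tail should stay below 1/n - 1/m.
for n, m in [(2, 5), (3, 10), (5, 20), (10, 40)]:
    tail = abs(x(m) - x(n))
    bound = 1 / n - 1 / m
    assert tail <= bound
    print(f"n={n:2d}, m={m:2d}: |x_m - x_n| = {tail:.2e} <= {bound:.2e}")
```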
So $x_n$ converges, and $y_n$ is also known to converge. Consider the difference: $$ z_n = x_n - y_n $$ Rewrite $y_n$ using the binomial expansion: $$ y_n = \left(1-{1\over n}\right)^n \\ = 1 + \sum_{k=1}^n {n\choose k}\left(-{1\over n}\right)^k \\ = 1 + \sum_{k=1}^n{(-1)^k\over k!}\prod_{i=1}^{k-1}\left(1-{i\over n}\right)\\ = 1 - 1 + {1\over 2!}\left(1-{1\over n}\right) - {1\over 3!}\left(1-{1\over n}\right)\left(1-{2\over n}\right) + \cdots + {(-1)^n\over n!}\left(1-{1\over n}\right)\cdot \cdots \cdot \left(1-{n-1\over n}\right) $$ So: $$ z_n = {1\over 2!} - {1\over 2!}\left(1-{1\over n}\right) - {1\over 3!} + {1\over 3!}\left(1-{1\over n}\right)\left(1-{2\over n}\right) + \cdots $$
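To convince myself the expansion is right, one can check numerically that the product form $1 + \sum_{k=1}^n{(-1)^k\over k!}\prod_{i=1}^{k-1}(1-{i\over n})$ agrees with $(1-{1\over n})^n$ (function names below are my own):

```python
from math import factorial, prod

def y_direct(n):
    return (1 - 1 / n) ** n

def y_expanded(n):
    # Binomial expansion rewritten with products prod_{i=1}^{k-1} (1 - i/n);
    # for k = 1 the product is empty and equals 1, giving the -1 term.
    return 1 + sum(
        (-1) ** k / factorial(k) * prod(1 - i / n for i in range(1, k))
        for k in range(1, n + 1)
    )

for n in (2, 5, 10, 50):
    assert abs(y_direct(n) - y_expanded(n)) < 1e-12
```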
Now let $n\to\infty$. The $k$-th term of $z_n$ is ${(-1)^k\over k!}\left(1-\prod_{i=1}^{k-1}\left(1-{i\over n}\right)\right)$, and since $\prod_{i=1}^{k-1}\left(1-{i\over n}\right) \ge 1-\sum_{i=1}^{k-1}{i\over n}$, each bracket lies between $0$ and ${k(k-1)\over 2n}$. Hence $$ |z_n| \le \sum_{k=2}^n {1\over k!}\cdot{k(k-1)\over 2n} = {1\over 2n}\sum_{k=2}^n{1\over(k-2)!} \le {e\over 2n}, $$ so $z_n$ tends to $0$: $$ \lim_{n\to\infty}z_n = 0 $$
Together with the known limit $\lim_{n\to\infty}y_n = {1\over e}$, this means: $$ \lim_{n\to\infty}x_n = \lim_{n\to\infty}y_n = {1\over e} $$
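As a final numerical illustration (again, not part of the proof), both sequences approach $1/e$ and their difference $z_n$ shrinks roughly like $1/n$:

```python
from math import e, factorial

def x(n):
    """Partial sum x_n = 1 + sum_{k=1}^n (-1)^k / k!."""
    return 1 + sum((-1) ** k / factorial(k) for k in range(1, n + 1))

for n in (10, 100, 1000):
    y = (1 - 1 / n) ** n
    print(f"n={n:5d}: x_n - 1/e = {x(n) - 1/e:+.2e}, "
          f"y_n - 1/e = {y - 1/e:+.2e}, z_n = {x(n) - y:+.2e}")
```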
This one is monstrous and I have doubts about its validity. So I kindly ask you to confirm that my reasoning is fine and/or to point out mistakes. I would also appreciate it if someone could share a more elegant proof. Thank you!
Another way: using the Cauchy product (justified here because both series converge absolutely) we have $$\sum_{k \geq 0} \frac{(-1)^k}{k!} \cdot \sum_{k \geq 0} \frac{1}{k!} = \sum_{n \geq 0} \frac{1}{n!} \sum_{k = 0}^n (-1)^k {n \choose k}.$$ Now one only has to show that for $n > 0$ we have $$\sum_{k = 0}^n (-1)^k {n \choose k} = 0,$$ which follows easily from the binomial theorem, i.e. $$0 = (1 + (-1))^n = \sum_{k = 0}^n {n \choose k} 1^{n-k} (-1)^k = \sum_{k = 0}^n (-1)^k {n \choose k}.$$ For $n = 0$ the sum is obviously equal to $1$. Thus $$\sum_{k \geq 0} \frac{(-1)^k}{k!} \cdot e = \sum_{k \geq 0} \frac{(-1)^k}{k!} \cdot \sum_{k \geq 0} \frac{1}{k!} = 1,$$ which shows $$\sum_{k \geq 0} \frac{(-1)^k}{k!} = \frac{1}{e}.$$
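The key identity here, $\sum_{k=0}^n (-1)^k {n\choose k} = 0$ for $n>0$ (and $1$ for $n=0$), is easy to verify numerically as well:

```python
from math import comb

# Alternating binomial sums: every Cauchy-product coefficient except n = 0
# should vanish.
for n in range(20):
    s = sum((-1) ** k * comb(n, k) for k in range(n + 1))
    assert s == (1 if n == 0 else 0)
```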