It is common knowledge that problems from graph theory and combinatorics can often be solved by short, astonishing probabilistic arguments. By contrast, it is much harder to find such "out of the box"/"wow effect" solutions to problems in analysis, topology, or algebra. So, what are your favorite problems of this kind, and how do you prove them?
This question was inspired by a very nice analysis problem:
evaluate $\lim_{n\to \infty} e^{-n} \cdot \sum_{k=0}^{n}\frac{n^k}{k!}$,
with the following extremely beautiful solution:
This is $P[N_n\leqslant n]$ where $N_n$ is a random variable with Poisson distribution of parameter $n$. Hence each $N_n$ is distributed like $X_1+\cdots+X_n$ where the random variables $(X_k)$ are independent and identically distributed with Poisson distribution of parameter $1$.
By the central limit theorem, $Y_n=\frac1{\sqrt{n}}(X_1+\cdots+X_n-n)$ converges in distribution to a standard normal random variable $Z$, in particular, $P[Y_n\leqslant 0]\to P[Z\leqslant0]$.
Finally, $P[Z\leqslant0]=\frac12$ and $[N_n\leqslant n]=[Y_n\leqslant 0]$ hence $P[N_n\leqslant n]\to\frac12$, QED.
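As a quick numerical sanity check (not part of the argument above), one can compute the partial Poisson sum directly and watch it drift toward $\frac12$. A minimal Python sketch, working in log space to avoid overflow for large $n$ (the function name is just illustrative):

```python
import math

def partial_poisson(n):
    # e^{-n} * sum_{k=0}^{n} n^k / k!, accumulated in log space
    total = 0.0
    log_term = -n  # log of e^{-n} * n^0 / 0!
    for k in range(n + 1):
        total += math.exp(log_term)
        # advance from the k-th term to the (k+1)-th: multiply by n/(k+1)
        log_term += math.log(n) - math.log(k + 1)
    return total

for n in (10, 100, 1000):
    print(n, partial_poisson(n))
```

The values decrease toward $\frac12$ from above, at the slow $O(n^{-1/2})$ rate one expects from the central limit theorem.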
Both the question and the solution come from Evaluating $\lim\limits_{n\to\infty} e^{-n} \sum\limits_{k=0}^{n} \frac{n^k}{k!}$.
I think such a list could be useful for educational purposes. Problems like these fire the imagination of students and listeners and are much more likely to spark their interest.
Here are two nice (and somewhat related) applications of probabilistic proofs. They require more sophisticated machinery than your example though.
Fundamental Theorem of Algebra: Every non-constant polynomial $p\in\mathbb C[X]$ has a root in $\mathbb C$.
Consider a Brownian motion $B_t$ on $\mathbb C$, and suppose $p$ has no complex roots. Then $f=\frac{1}{p}$ is analytic and bounded. As $f$ is continuous and non-constant, we can find reals $\alpha<\beta$ such that $\{\Re f(z)\leq\alpha\}$ and $\{\Re f(z)\geq\beta\}$ both contain (disjoint) open discs.
Then $\Re f(B_t)$ is a bounded martingale, so by the martingale convergence theorem it converges almost surely. However, Brownian motion is recurrent on $\mathbb C$, so $B_t$ visits $\{\Re f(z)\leq\alpha\}$ and $\{\Re f(z)\geq\beta\}$ infinitely many times. So $$\liminf_{t\to\infty}\Re f(B_t)\leq\alpha<\beta\leq\limsup_{t\to\infty}\Re f(B_t),$$ which contradicts martingale convergence.
A discrete Liouville theorem: If $f:\mathbb Z^2\to[0,\infty)$ satisfies $$f(x,y)=\frac{f(x+1, y)+f(x, y+1) + f(x-1, y) +f(x, y-1)}{4}$$ for all $x,y\in\mathbb Z$, then $f$ is constant.
Let $X_n$ be a simple symmetric random walk on $\mathbb Z^2$. The condition implies that $M_n=f(X_n)$ is a martingale. As $f\geq0$, we know by the martingale convergence theorem that $M_n$ converges almost surely.
But $(X_n)$ is irreducible and recurrent, so almost surely it visits every point of $\mathbb Z^2$ infinitely often. Hence $M_n$ takes every value in the image of $f$ infinitely often, which contradicts the convergence of $M_n$ unless $f$ is constant.
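The mean-value condition above is easy to experiment with. A small Python check (the helper names are illustrative, not from the answer) verifies the discrete harmonicity of a few polynomials on a finite window of $\mathbb Z^2$:

```python
def neighbor_average(f, x, y):
    # average of f over the four lattice neighbors of (x, y)
    return (f(x + 1, y) + f(x - 1, y) + f(x, y + 1) + f(x, y - 1)) / 4

def is_discrete_harmonic(f, radius=5):
    # check the mean-value property on a finite window of Z^2
    return all(f(x, y) == neighbor_average(f, x, y)
               for x in range(-radius, radius + 1)
               for y in range(-radius, radius + 1))

print(is_discrete_harmonic(lambda x, y: x * y))        # True
print(is_discrete_harmonic(lambda x, y: x**2 - y**2))  # True
print(is_discrete_harmonic(lambda x, y: x**2))         # False: average exceeds f by 1/2
```

Note that neither non-constant harmonic example is nonnegative, which is consistent with the theorem: a nonnegative discrete harmonic function on $\mathbb Z^2$ must be constant.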
Away from probabilistic arguments, I will always be partial to the Gaussian integral. I particularly like two proofs: the first is the standard change of variables to polar coordinates; the second applies the Feynman trick (differentiation under the integral sign) to $$F(t)=\left(\int_0^te^{-x^2}\mathop{}\!\mathrm{d}x\right)^2.$$

I also like Axler's proof that every linear operator $T$ on a finite-dimensional vector space $V$ over $\mathbb C$ has an eigenvalue. He doesn't need to define determinants or characteristic polynomials; instead he just considers $v, Tv,\dots, T^nv$ for some $v\neq0$, where $n=\dim V$. These $n+1$ vectors are linearly dependent, so some nontrivial linear combination equals $0$; by the fundamental theorem of algebra the corresponding polynomial in $T$ factors into linear terms $T-\lambda_i I$, one of which must fail to be injective, and you are done.
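Axler's argument is concrete enough to run numerically. Here is a sketch with NumPy (the setup — a random $4\times 4$ matrix and the variable names — is my own, not from the answer): build the vectors $v, Tv, \dots, T^nv$, extract a null vector of the resulting $n\times(n+1)$ matrix, and check that a root of the corresponding polynomial really is an eigenvalue of $T$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
T = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
v = rng.normal(size=n) + 1j * rng.normal(size=n)

# Krylov matrix [v, Tv, ..., T^n v]: n+1 vectors in C^n, hence dependent
K = np.column_stack([np.linalg.matrix_power(T, k) @ v for k in range(n + 1)])

# A unit null vector c with K @ c = 0, i.e. sum_k c[k] T^k v = 0
_, _, Vh = np.linalg.svd(K)
c = Vh[-1].conj()
assert np.linalg.norm(K @ c) < 1e-8

# The polynomial p(z) = sum_k c[k] z^k factors over C; since p(T) v = 0,
# some factor T - lambda*I fails to be injective, so some root is an eigenvalue.
roots = np.roots(c[::-1])  # np.roots expects highest-degree coefficient first
eigs = np.linalg.eigvals(T)
print(min(abs(eigs - r).min() for r in roots))  # ~ 0
```

For a generic $v$ the polynomial here has degree $n$ and its roots are exactly the eigenvalues of $T$, but the proof only needs one root to work.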