I'm looking for examples of combinatorics problems which would be very difficult to solve by direct enumeration, but can be easily solved using ideas from probability, like independence, commuting sums and expectations, etc. I know I have seen such problems before, especially in the HMMT combinatorics subject tests, but I can't now recall any good examples.
I am NOT looking for probabilistic existence proofs (the so-called "probabilistic method" introduced by Erdős). The sort of problems I'm interested in are enumerative.
Graph enumeration problems are often solved using the theory of random graphs. For example, if you want to asymptotically count the number of (labeled) connected graphs on $n$ vertices, it is easiest to estimate the probability that the Erdős–Rényi graph $\mathcal G_{n,1/2}$ is connected, then multiply by $2^{\binom n2}$, the total number of labeled graphs on $n$ vertices.
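For very small $n$ the identity behind this strategy can be checked by brute force: the number of connected labeled graphs equals $P(\mathcal G_{n,1/2}\text{ is connected}) \cdot 2^{\binom n2}$, since $\mathcal G_{n,1/2}$ is uniform over all labeled graphs. A minimal sketch (the function names are my own):

```python
# Verify by enumeration that
#   #{connected labeled graphs on n vertices}
#     = P(G(n,1/2) is connected) * 2^C(n,2),
# which holds because G(n,1/2) is uniform over all labeled graphs.
from itertools import combinations

def count_connected(n):
    """Count labeled connected graphs on n vertices by enumerating
    all 2^C(n,2) edge subsets (feasible only for tiny n)."""
    edges = list(combinations(range(n), 2))
    total = connected = 0
    for mask in range(1 << len(edges)):
        adj = {v: set() for v in range(n)}
        for i, (u, v) in enumerate(edges):
            if mask >> i & 1:
                adj[u].add(v)
                adj[v].add(u)
        # depth-first search from vertex 0
        seen, stack = {0}, [0]
        while stack:
            u = stack.pop()
            for w in adj[u] - seen:
                seen.add(w)
                stack.append(w)
        total += 1
        connected += (len(seen) == n)
    return connected, total

conn, total = count_connected(4)
print(conn, total)   # 38 connected graphs out of 2^6 = 64
```

So for $n = 4$ the connectivity probability is $38/64$; the probabilistic approach estimates this probability asymptotically instead of enumerating.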
For a fancier example, the asymptotic number of labeled $d$-regular $n$-vertex graphs can be estimated by a similar probabilistic strategy. If $d$ grows sufficiently slowly as $n \to \infty$, we analyze the configuration model (which generates a random $d$-regular multigraph by picking a uniformly random matching on the $dn$ half-edges) and estimate the probability that the resulting multigraph is simple. For an example of the newest and fanciest results in this vein, see this paper by Liebenau and Wormald.
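The configuration model itself is a few lines of code. A minimal sketch (names and parameters are my own): shuffle the $dn$ half-edges, pair them off consecutively (a uniformly random perfect matching), and check whether the result has loops or repeated edges. The fraction of simple outcomes is a Monte Carlo estimate of the probability that converts multigraph counts into simple-graph counts.

```python
# A sketch of the configuration model: pair up the d*n half-edges
# uniformly at random, read off a d-regular multigraph, and estimate
# the probability that it is simple.
import random

def configuration_model(n, d, rng):
    """Edge list of a random d-regular multigraph on n vertices,
    via a uniform matching of half-edges (d*n must be even)."""
    half_edges = [v for v in range(n) for _ in range(d)]
    rng.shuffle(half_edges)                    # uniform random matching
    return [tuple(sorted(half_edges[i:i + 2])) # pair consecutive stubs
            for i in range(0, len(half_edges), 2)]

def is_simple(edges):
    """True if there are no loops and no repeated edges."""
    return all(u != v for u, v in edges) and len(set(edges)) == len(edges)

rng = random.Random(0)
trials = 2000
hits = sum(is_simple(configuration_model(20, 3, rng)) for _ in range(trials))
print(hits / trials)  # for fixed d, tends to e^{-(d^2-1)/4} as n grows
                      # (about e^{-2} ~ 0.135 for d = 3)
```

The classical result (for fixed $d$) is that the probability of simplicity tends to $e^{-(d^2-1)/4}$, coming from the asymptotically independent Poisson counts of loops and double edges.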
The fundamental ideas in such cases are often about proving asymptotic independence between many very unlikely events, which lets us make Poisson or Gaussian approximations to random variables that approach the truth as $n \to \infty$. This would be difficult to recast in a purely deterministic form, and the result would be very hard to understand.
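A standard illustration of this Poisson heuristic (my own choice of example, not from the paragraph above): in $\mathcal G_{n,c/n}$, each of the $\binom n3$ potential triangles appears with probability $(c/n)^3$, these rare events are asymptotically independent, and the triangle count converges in distribution to $\mathrm{Poisson}(c^3/6)$.

```python
# Monte Carlo check that the triangle count of G(n, c/n) has mean
# close to the Poisson parameter c^3/6 (exactly C(n,3)(c/n)^3 for
# finite n, which tends to c^3/6 as n -> infinity).
import random
from itertools import combinations

def triangle_count(n, p, rng):
    """Sample G(n, p) and count its triangles."""
    adj = [[False] * n for _ in range(n)]
    for u, v in combinations(range(n), 2):
        if rng.random() < p:
            adj[u][v] = adj[v][u] = True
    return sum(adj[u][v] and adj[v][w] and adj[u][w]
               for u, v, w in combinations(range(n), 3))

rng = random.Random(0)
n, c, trials = 40, 3.0, 300
counts = [triangle_count(n, c / n, rng) for _ in range(trials)]
mean = sum(counts) / trials
print(mean, c ** 3 / 6)  # empirical mean vs. limiting parameter 4.5
```

The point is that "number of triangles" is a sum of many weakly dependent indicator variables, which is exactly the situation where the Poisson approximation bites; a purely deterministic restatement of this would be unwieldy.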
By contrast, in a problem where the "trick" is independence or commuting sums and expectations, the probability is essentially flavor: you could recast such a proof as evaluating a sum in two ways, or something similar.