I'm still having trouble understanding why showing that uncountable sets are strictly larger than countable ones took as long as it did. It seems like there is no shortage of trivial ways to show that no bijection between them can exist, although that could be my naivete or the benefit of hindsight.
Some of the simpler approaches I've come up with do fail once you point out that you can do things like put $\mathbb N$ and $\mathbb Q$ in bijection using a little cleverness. However, the key insight seems to be that all of those cases pair off elements that each carry only a finite amount of information. Assigning a natural number to every possible infinitely long binary sequence, by contrast, would require a bijection between a finite amount of information and a potentially infinite amount.
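To make that "little cleverness" concrete: one standard way to enumerate the positive rationals without repetition is the Calkin–Wilf sequence (a sketch under that assumption; it may not be the enumeration you had in mind, and the names here are mine):

```python
from fractions import Fraction

def rationals():
    """Enumerate every positive rational exactly once (the Calkin-Wilf
    sequence): after a term q, the next term is 1 / (2*floor(q) + 1 - q).
    Pairing the n-th term with n gives a bijection between the naturals
    and the positive rationals."""
    q = Fraction(1)
    while True:
        yield q
        q = 1 / (2 * (q.numerator // q.denominator) + 1 - q)

gen = rationals()
print([next(gen) for _ in range(6)])
# [Fraction(1, 1), Fraction(1, 2), Fraction(2, 1), Fraction(1, 3), Fraction(3, 2), Fraction(2, 3)]
```

Each rational here is a finite object (one pair of integers), which is exactly why an enumeration like this can exist at all.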
This strikes me as an absurd proposition, so I'm surprised it took so long for a convincing proof to be constructed and for consensus to be reached on it. Off the top of my head, a valid bijection would let you encode or reference an infinite amount of information with a finite amount. Even if you could never do that in practice (you'd need a ledger containing infinitely many entries, each holding infinite information, among other pesky problems), it raises alarm bells even in theory: there's an immediate pigeonhole-style objection to cramming all that infinite data into finite holes, although I suppose information theory wasn't around then either.
If you were trying to argue that there were more rationals than naturals, you could say "let 1 map to 1, 2 map to 2, and so on" and then point out that no numbers were left over for all the numbers in between. This fails because the numbers you're mapping to are rationals themselves, and can be more efficiently encoded by mapping naturals to pairs of naturals.
The same argument doesn't seem to apply to mapping between $\mathbb N$ and $\mathbb R$, though. If we agreed that some number, like $\pi$, had no better representation than the number itself, then mapping $1$ to $\pi$, $2$ to $2\pi$, and so on would use up your supply of naturals. It needn't be $\pi$, or even a well-defined number, just so long as you posit that some real number fits that description. Specifically, it seems that any real number which can only be fully described using an infinite amount of information would suffice.
Or perhaps try the injection $n \mapsto \sqrt[n]{n}$. For that not to exhaust our supply of $\mathbb N$, wouldn't it require being able to fold $\sqrt[n]{n}$ into some other finitely-expressible form which includes... well, pretty much everything else commonly expressible?
To be fair, I do think Cantor's diagonal argument is a strikingly elegant way to have put this whole thing to bed, so credit where credit's due.
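The diagonal construction itself is almost a one-liner. Here is a sketch in Python (treating a binary sequence as a function from index to bit; the function names and the sample enumeration are mine, purely for illustration):

```python
def diagonalize(listing):
    """Given listing(n) = the n-th enumerated binary sequence (itself a
    function from index to 0/1), return a sequence that differs from the
    n-th listed sequence at position n -- so it appears nowhere in the
    enumeration."""
    return lambda n: 1 - listing(n)(n)

# A purported enumeration: sequence n is "bit k of the integer n".
# Every listed sequence is eventually all zeros, so much is missed.
listing = lambda n: (lambda k: (n >> k) & 1)

missed = diagonalize(listing)
# missed(n) != listing(n)(n) for every n, by construction.
print([missed(n) for n in range(8)])
# [1, 1, 1, 1, 1, 1, 1, 1]  (the all-ones sequence, which indeed is not listed)
```

The point is that this works against *any* listing handed to it, not just one cleverly chosen map.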
If someone can explain why this problem is harder than it appears, or at least why my arguments aren't sufficient (particularly the $\pi$ one), I'll accept it as an answer.
It is not really accurate to say that proving that there are uncountable sets took a long time. Instead, this is one of those situations where the definitions are the crux. The definition of cardinality and the proof that there are different infinite cardinalities came in the same publication. One could even argue that the first person to seriously think about the issue already came up with the diagonal argument -- so there is little basis for considering it difficult to find.
The second half of the 19th century is when mathematicians started to work out the foundations, so it's also not the case that set theory as a whole was late to the party.
As a side note, your arguments may indicate that you haven't seen the diagonal argument in its proper form (for sets):
Let $\phi : X \to \mathcal{P}(X)$ be a function from a set to its powerset. Then $\{x \in X \mid x \notin \phi(x)\} \in \mathcal{P}(X)$ is not in the range of $\phi$, hence $\phi$ cannot be a surjection.
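Spelling out the step that the one-line statement leaves implicit: write $D = \{x \in X \mid x \notin \phi(x)\}$ and suppose, toward a contradiction, that $D = \phi(x_0)$ for some $x_0 \in X$. Then

```latex
x_0 \in D \iff x_0 \notin \phi(x_0) \iff x_0 \notin D,
```

which is impossible, so no such $x_0$ exists and $\phi$ is not surjective. Note the contrast with your $\pi$-style arguments: exhibiting a hard-to-encode real only defeats one particular proposed map, whereas the set $D$ is built *from* an arbitrary $\phi$ and so defeats every map at once.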