The only approaches I have seen to answering this question have involved manipulating $e^{iH}$ like an ordinary exponential, i.e.
If $U=e^{iH}$, then $U^\dagger = (e^{iH})^\dagger = e^{-iH^\dagger}=e^{-iH}$
Therefore $UU^\dagger = e^{iH}e^{-iH}= e^0 = 1$
I have written the proof out above as I have seen it, although I presume it would be more correct in the last line to say $e^\textbf{0}=I$, where $\textbf{0}$ is the zero matrix.
I am not sure to what extent this constitutes a valid proof. It seems to me that we are assuming many things about exponentials of matrices which are not immediately obvious to me.
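For what it's worth, here is a quick numerical sanity check (a check, not a proof) of the claimed result, computing $e^{iH}$ via the spectral decomposition of a randomly generated Hermitian $H$:

```python
# Numerical sanity check (not a proof): for a random Hermitian H,
# e^{iH} should be unitary to machine precision.
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
H = (M + M.conj().T) / 2                       # Hermitian by construction

# e^{iH} via the spectral theorem: H = V diag(w) V^dagger with real w
w, V = np.linalg.eigh(H)
U = V @ np.diag(np.exp(1j * w)) @ V.conj().T

print(np.allclose(U @ U.conj().T, np.eye(4)))  # True: U U^dagger = I
```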
Since $e^A$ for some matrix $A$ is defined through the power series $\sum_{n=0}^{\infty} \frac{A^n}{n!}$, I am more inclined to prove this through the power series approach. Using this, I get (omitting a few trivial steps)
$$UU^\dagger = e^{iH}e^{-iH} = \sum_{n=0}^{\infty} \sum_{m=0}^{\infty} \frac{(iH)^n}{n!} \frac{(-iH)^m}{m!} = \frac{1}{2} \sum_{n=0}^{\infty} \sum_{m=0}^n \frac{(iH)^{n-m}}{n!\,m!}$$
Although I am not sure that the final step is correct (I have seen this replacement for a double summation in the context of electrostatics, but there the $n=m$ term was zero, which it is not here), so I am not quite sure how to deal with that.
My overall questions here are as follows:
a) Is the proof using only exponentials valid? Specifically, the step that assumes $e^{iH}e^{-iH}= e^\textbf{0}$? (The rest follows easily from the power series expansion.)
b) Is the last step of the summation in my incomplete proof correct? I realise this question isn't entirely related to the original question, so I am happy to ask it on a new post.
c) Whether or not the last line is correct, I am stuck with the summations and don't know how to show that they all cancel except for a single $I$ term... One thing I had in mind was considering the summation as a contraction between an antisymmetric tensor (the $(iH)^{n-m}$) and a symmetric tensor ($1/(n!\,m!)$), although I am not sure about this at all. I don't think the first is antisymmetric...
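For what it's worth, multiplying truncated versions of the two series numerically does suggest that everything cancels down to the identity (again a check, not a proof; `partial_exp` is just a helper I wrote for the truncated series):

```python
# Sanity check: the product of the truncated series for e^{iH} and e^{-iH}
# approaches the identity as the truncation order N grows.
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
H = (M + M.conj().T) / 2                       # Hermitian by construction

def partial_exp(A, N):
    """Partial sum  sum_{n=0}^{N} A^n / n!  of the exponential series."""
    term = np.eye(A.shape[0], dtype=complex)
    total = term.copy()
    for n in range(1, N + 1):
        term = term @ A / n
        total = total + term
    return total

for N in (2, 5, 20):
    P = partial_exp(1j * H, N) @ partial_exp(-1j * H, N)
    print(N, np.linalg.norm(P - np.eye(3)))    # error shrinks rapidly with N
```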
Instead of answering your questions directly, let me just add some comments on the proof in general that might be useful. The statement you want to prove hinges on two facts:

1. $e^{A+B} = e^A e^B$ whenever $A$ and $B$ commute.
2. $\left(e^A\right)^\dagger = e^{A^\dagger}$ for any square matrix $A$.
Depending on how carefully you want to write your proof, you might (or might not) want to include the proofs of these two facts.
The first property might seem obvious, but it does not hold when $A$ and $B$ don't commute. This property can be proved using the power-series definition of the matrix exponential, the binomial theorem for $(A+B)^n$, and the Cauchy formula for the product of two power series. A second way is to define $f(t) = e^{(A+B)t} - e^{At}e^{Bt}$ and show that $f'(t) = (A+B)f(t)$ with $f(0) = 0$, which forces $f \equiv 0$. And I'm sure there are other ways.
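As a sketch of that second approach (assuming throughout that $AB = BA$, which also gives $e^{At}B = Be^{At}$ term by term in the power series):

$$f'(t) = (A+B)e^{(A+B)t} - Ae^{At}e^{Bt} - e^{At}Be^{Bt} = (A+B)\left(e^{(A+B)t} - e^{At}e^{Bt}\right) = (A+B)f(t).$$

Together with $f(0) = 0$, uniqueness of solutions of this linear ODE forces $f(t) = 0$ for all $t$.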
The second property follows directly from the power-series definition of the matrix exponential and the fact that $(A^n)^\dagger = (A^\dagger)^n$, which can be proven by first showing it for $n=2$ and then using induction.
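A quick numerical illustration of the second property (again a check, not a proof), comparing a truncated series for $(e^A)^\dagger$ and $e^{A^\dagger}$ for a generic, non-Hermitian $A$:

```python
# Check (e^A)^dagger = e^{A^dagger} numerically for a generic (non-Hermitian)
# matrix, using a truncated power series for the exponential.
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

def series_exp(A, N=60):
    """Truncated power series  sum_{n=0}^{N} A^n / n!."""
    term = np.eye(A.shape[0], dtype=complex)
    total = term.copy()
    for n in range(1, N + 1):
        term = term @ A / n
        total = total + term
    return total

lhs = series_exp(A).conj().T       # (e^A)^dagger
rhs = series_exp(A.conj().T)       # e^{A^dagger}
print(np.allclose(lhs, rhs))       # True
```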
Here is one way of proving the first property in some detail. To do so, we introduce a parameter $t$ to help with book-keeping (we treat both sides as power series in $t$):
$$\matrix{e^{tA}e^{tB} &=& \sum_{n,m=0}^\infty \frac{A^n}{n!}\frac{B^m}{m!} t^{n+m} & \text{(definition of matrix exp.)}\\&=& \sum_{p=0}^\infty \frac{t^p}{p!}\left(\sum_{k=0}^p{p\choose k}A^k B^{p-k}\right) & \text{(rewrite as standard power-series)}\\ &=& \sum_{p=0}^\infty \frac{t^p}{p!}(A+B)^p & \text{(binomial theorem)}\\ &=& e^{t(A+B)} & \text{(definition of matrix exp.)}}$$
Note that the binomial theorem only holds if $A$ and $B$ commute (for example, $(A+B)^2 = A^2 + AB + BA + B^2$ equals $A^2 + 2AB + B^2$ only when $AB = BA$).
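To illustrate numerically (a check, not a proof; `series_exp` is a helper for the truncated exponential series): with $B = A^2$, which commutes with $A$, the identity $e^{A+B} = e^A e^B$ holds, while for a generic second matrix it fails:

```python
# e^{A+B} = e^A e^B when A and B commute (here B = A^2), but not in general.
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((3, 3)) / 2
B = A @ A                             # commutes with A
C = rng.standard_normal((3, 3)) / 2   # generically does NOT commute with A

def series_exp(X, N=80):
    """Truncated power series  sum_{n=0}^{N} X^n / n!."""
    term = np.eye(X.shape[0])
    total = term.copy()
    for n in range(1, N + 1):
        term = term @ X / n
        total = total + term
    return total

print(np.allclose(series_exp(A + B), series_exp(A) @ series_exp(B)))  # True
print(np.allclose(series_exp(A + C), series_exp(A) @ series_exp(C)))  # False for this non-commuting pair
```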