I'm trying to calculate in a program the number of possible unique subsets of a set of unique numbers, given the subset size, using the following formula:
$\dfrac{n!}{(n-r)!r!}$
The trouble is that, on the face of it, you may need an enormous data type just to hold the dividend $n!$. Is there a way of simplifying this calculation so that it can be done with something smaller, like $64$-bit integers, or do you really have to store numbers like $60!$?
Alternatively, is there a formula that is more suited to computing the aforementioned value?
As mentioned by others, the binomial coefficient
$$\binom{n}{r}=\frac{n!}{r!(n-r)!}$$
is already so fundamental a quantity that it is itself considered a canonical form. Nevertheless, you seem to be asking how one might compute it while avoiding unsavory events like integer overflow.
Since this is for programming purposes, if you can stand to use a two-dimensional array, you might be interested in the doubly-indexed recurrence relation
$$\binom{n}{r}=\binom{n-1}{r}+\binom{n-1}{r-1}$$
along with the initial conditions $\tbinom{n}{0}=\tbinom{n}{n}=1$, or the singly-indexed recurrences
$$\begin{align*} \binom{n}{r}&=\frac{n}{n-r}\binom{n-1}{r}\\ \binom{n}{r}&=\frac{n-r+1}{r}\binom{n}{r-1} \end{align*}$$
One other possibility is to work with the logarithms of the factorials instead, since $\log(ab)=\log\,a+\log\,b$ and $\log(a/b)=\log\,a-\log\,b$. If you take this route, you might want to look into Stirling's formula, though it is accurate only when $n$ and $r$ are large, and exponentiating at the end yields an approximation rather than an exact integer.
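In practice you may not even need Stirling's series: many standard libraries expose $\log\Gamma(x)$ directly (Python's `math.lgamma`), and since $\Gamma(k+1)=k!$ this gives the log-factorials at full floating-point accuracy. A possible sketch (the name `binom_log` is my own):

```python
import math

def binom_log(n, r):
    """Approximate C(n, r) via log-factorials:
        log C(n, r) = lgamma(n+1) - lgamma(r+1) - lgamma(n-r+1),
    using Gamma(k+1) = k!.
    The result is a float: it never overflows for practical
    n and r, but it loses exact integer precision."""
    log_c = (math.lgamma(n + 1)
             - math.lgamma(r + 1)
             - math.lgamma(n - r + 1))
    return math.exp(log_c)
```

This is the natural choice when you only need $\binom{n}{r}$ to a relative accuracy, or when the coefficient is far too large to fit in any integer type and only its magnitude (or logarithm) matters.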