A quick look at the Wikipedia entry on mathematical constants suggests that the most important fundamental constants all live in the immediate neighborhood of the first few positive integers. Is there some kind of normalization going on, or some other reasonable explanation for why we have only identified interesting small constants?
EDIT: I may have been too strong in some of my language, or unclear in my examples. Which constants are the most "important" or "interesting" is certainly debatable. Moreover, there are many important and interesting very large numbers. Therefore, I would like to make two revisions.
First, to give a clearer idea of the numbers I had in mind, please consider such examples as $\pi$, $e$, the golden ratio, the Euler–Mascheroni constant, the Feigenbaum constants, the twin prime constant, etc. Obviously numbers like $0$, $1$, $\sqrt2$, $\dots$, while on the Wikipedia list, are in some sense "too fundamental" for consideration.
This leads me to my second revision, which is that the constants I am trying to describe are (or appear to be) irrational. Perhaps this is a clue to what makes them interesting. At the very least, it leads me to believe that large integer counterexamples do not satisfy the question as I had intended.
Finally, if I could choose a better word to describe such numbers, it might be "auspicious" rather than interesting or important. But I don't really know if that's any better or worse.
Without waxing too metaphysical, I think that in addition to some "fundamental truths of nature" type answers, there are probably some anthropocentric reasons partially explaining this observation. We spend most of our waking hours dealing with numbers less than, say, a couple of thousand, so it's not surprising that most of our most amazing observations concern numbers in this range. It seems likely that as mathematics and technology progress, we will find ourselves discovering amazing properties of ever larger numbers. Indeed, one of the most amazing numbers ever,
$$808017424794512875886459904961710757005754368000000000,$$
had to wait until the 1970s before its significance was even conjecturally understood. (Edit to add that this number is the order of the monster group, the mathematics behind which couldn't possibly be properly addressed in this answer. But Wikipedia is a good start.)
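For the curious, that number is not arbitrary: the order of the monster group has the well-known prime factorization $2^{46}\cdot3^{20}\cdot5^{9}\cdot7^{6}\cdot11^{2}\cdot13^{3}\cdot17\cdot19\cdot23\cdot29\cdot31\cdot41\cdot47\cdot59\cdot71$. A quick sketch in Python (using its arbitrary-precision integers) to check that this factorization really does produce the 54-digit number above:

```python
# Order of the monster group, as quoted above.
MONSTER_ORDER = 808017424794512875886459904961710757005754368000000000

# Its known prime factorization: prime -> exponent.
factorization = {
    2: 46, 3: 20, 5: 9, 7: 6, 11: 2, 13: 3,
    17: 1, 19: 1, 23: 1, 29: 1, 31: 1, 41: 1, 47: 1, 59: 1, 71: 1,
}

# Multiply the factorization back out and compare.
product = 1
for prime, exponent in factorization.items():
    product *= prime ** exponent

assert product == MONSTER_ORDER
print(f"{product} has {len(str(product))} digits")
```

Nothing deep here, of course, but it does make the "largest sporadic simple group" feel a bit more concrete.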
Also edited to add a response to GregL that was becoming too long for the comments. I see your point but ultimately still disagree. It's hard to make this precise (and so much for not waxing metaphysical), but suppose we lived in a universe where the ratio of a circle's circumference to its diameter were on the order of a billion, instead of our universe's ratio of $\pi$. Then we might never even have noticed that this ratio is constant across all circles, so in a sense it's only because $\pi$ is small that we were led to observe and hence calculate it. (Okay, $\pi$ is not the best example of this, but you see the point.) So in response to the general claim "the important numbers just turn out to be small when we calculate them," my answer above is roughly the argument that it's instead the case that small numbers self-select to even be calculated in the first place! I think the order of the monster group fits perfectly into this narrative: it is a number representing a tremendous amount of fundamental truth, but it was impossible to know its significance before developing the mathematics and noticing the patterns that forced it to reveal itself.