Naively, the answer seems to be yes. Our representation of numbers in a base-ten system requires that large integers take more digits. Large numbers also take up more space when we write them in binary, and can be used to encode more information on a computer.
But since every integer is just one element of the set of all integers, they should all carry the same informational entropy, just as each face of a fair die carries the same information in a dice game.
So, do large integers carry more information than small integers, or does the interaction of integers with a base-ten system produce different amounts of information for different integers?
It will help to think of information as a measure of difficulty: roughly, "more information" means "harder to describe" (the connection being that in order to describe it you have to say more) or "harder to compress" (as a string).
In light of this it should be clear that size alone doesn't determine information: the number $$X={10}^{{10}^{10^{10^{10^{10}}}}}$$ is rather large, but quite simple to describe (I've just done it). By contrast, consider the number $$Y=4361748963429187634192343214123412345654678492734536.$$ This number is, relatively speaking, tiny, but it is (at a glance, at least) much harder to describe.
Now, regarding your comment that "large numbers also take up more space when we write them in binary": a key point here is that we get to choose how to describe the numbers in question. If I tried to write out $X$ in decimal notation, I'd be in trouble, but the point is that there is some way to write $X$ compactly. The issue with $Y$ is that there isn't any obvious way to "repackage" it in a simpler form. Maybe we're amazingly lucky and it happens to be, say, the smallest counterexample to Goldbach's conjecture, which would give a relatively simple way to define it ("the smallest counterexample to Goldbach's conjecture"), but barring such surprises, $Y$ is in fact harder to describe (= contains more information) than $X$.
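You can get a rough feel for this with a general-purpose compressor. Compressed length is only a crude stand-in for description length (Kolmogorov complexity is uncomputable, and `zlib` only detects certain kinds of structure), but the contrast is still visible: a sketch comparing a highly structured number of manageable size, $10^{1000}$, against a random-digit number of the same length.

```python
import random
import zlib

# A "large but simple" number: 10**1000, whose decimal form is a 1
# followed by 1000 zeros -- highly structured, so it compresses well.
structured = str(10**1000).encode()

# A number of the same length with (pseudo)random digits -- no obvious
# structure, so a general-purpose compressor can't shrink it much.
random.seed(0)
rand = ("1" + "".join(random.choice("0123456789") for _ in range(1000))).encode()

c_structured = len(zlib.compress(structured, 9))
c_random = len(zlib.compress(rand, 9))

# The structured number compresses to a handful of bytes; the
# random-digit number stays close to its raw information content
# (about log2(10) ~ 3.32 bits per digit).
print(c_structured, c_random)
```

Both inputs are 1001 digits long, yet one shrinks to almost nothing while the other barely compresses at all: size is the same, but describability is not.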