Why is it okay to write..."for $n\in\mathbb{Z}^+,n\ge 3.5$"?


So I've been playing around with some mathematical induction proofs and I usually open with a statement similar to: (example below)

Denote the statement involving $n$, for $n\in\mathbb{Z}^+,n\ge 4$, by $$S(n): 2^n<n!$$

Notice you could also write $n\ge 4$ as $n>3$ because we're working in the positive integers. Though there is no real difference between the two, they look different. Now, why is it okay to write...

for $n\in\mathbb{Z}^+,n\ge 3.5$

when 3.5 does NOT exist in the universe in which $n$ lives? It seems like 3.5 is sort of undefined.

Question: Why is it okay to write..."for $n\in\mathbb{Z}^+,n\ge 3.5$"?

There are 3 best solutions below


There is a canonical way to embed the integers into the rationals that preserves ordering.

That makes it okay to write $n \ge 3.5$. It can be interpreted as $f(n) \ge 3.5$ where $f : \mathbb Z \to \mathbb Q$ is the canonical embedding.
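Concretely, one way to write out this (standard) order-preserving embedding is:

```latex
f : \mathbb{Z} \to \mathbb{Q}, \qquad f(n) = \frac{n}{1},
\qquad \text{with} \qquad m \le n \iff f(m) \le f(n).
```

Under this reading, "$n \ge 3.5$" abbreviates "$f(n) \ge 3.5$", a perfectly well-formed comparison in $\mathbb{Q}$.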

Of course it's a little unusual to write that; it would be much clearer and kinder to your readers to write $n \ge 4$.


If $n \in \mathbb{Z}^+$, why is it okay to say that $n$ is positive? "Positive" means "greater than $0$" and $0 \notin \mathbb{Z}^+$, so we're comparing $n$ with something outside of its universe, just like in your example.

And if $a, b \in \mathbb{Z}^+$, why is it okay to say that $a$ is at least half of $b$, or $a \ge \frac{b}{2}$? $b$ might be odd.

The answer is that this type of notation is okay whenever the meaning is easily understood, especially if it saves space or somehow simplifies things for the reader. Your example $n \ge 3.5$ is awkward because it means the same thing as $n \ge 4$, so it makes interpreting the statement more difficult, although the meaning is clear enough.

If you program, it might help to think of it like implicit typecasting. In the C language, if $n$ is declared as an int, it is implicitly converted to a double when evaluating $n >= 3.5$.


Everything you're talking about lives in the universe of real numbers. Any real number $x$ has the property that it is either greater than or equal to $3.5$, or less than it. Every natural number is a real number. So there's nothing wrong with talking about those natural numbers which are greater than or equal to $3.5$.

I would also like to address what was mentioned in the comments about it 'wrong' to write something like $n \geq 3.5$. Sure, it's much more clear to write $n \geq 4$. But in number theory, it is very common to describe natural numbers as being bounded by irrational numbers: in any number field $K$, every member of the ideal class group has a representative by an integral ideal $I$ for which $$N_{K/\mathbb{Q}}(I) \leq \sqrt{\textrm{disc}(K/Q)} (\frac{4}{\pi})^{s} \frac{n!}{n^n}$$ where $n = [K : \mathbb{Q}]$ and $s$ is half the number of complex embeddings of $K$ into $\mathbb{C}$. Now, who wants to write that bound as $$\lfloor \sqrt{\textrm{disc}(K/Q)} (\frac{4}{\pi})^{s} \frac{n!}{n^n} \rfloor$$