Taken from Programming Languages: Principles and Paradigms, Chapter 1, by Tucker and Noonan:
... the assignment statement
$X = X + 1$
makes no sense either in functional programming or in mathematics.
Why does it make no sense in mathematics?
I wouldn't say that such a statement makes no sense in mathematics. If $X$ is restricted to finite values, the claim is more justifiable, since no finite number equals its own successor. However, in the setting of (transfinite) cardinal arithmetic, if we have $X=\aleph_0$, then $X=X+1$ is certainly true. In fact, more than that is true: $$\aleph_\alpha + \aleph_\beta = \max(\aleph_\alpha,\aleph_\beta) \text{ for } \alpha,\beta\in\mathrm{Ord},$$ where $\mathrm{Ord}$ denotes the class of all ordinals.
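A worked instance of the cardinal case, sketched as a short derivation (the choice of $\mathbb{N}$ and the marker element $\star$ are mine, just for illustration):

$$\aleph_0 + 1 = |\mathbb{N} \cup \{\star\}| = |\mathbb{N}| = \aleph_0,$$

where the middle equality holds because $\star \mapsto 0,\ n \mapsto n+1$ is a bijection from $\mathbb{N} \cup \{\star\}$ onto $\mathbb{N}$.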
In (transfinite) ordinal arithmetic, by contrast, the statement fails: $\alpha + 1 \neq \alpha$ for every $\alpha\in\mathrm{Ord}$. However, some care is still needed, since ordinal addition is not commutative: $1 + \omega = \omega$.
Context matters.
In the context of mathematics, the equation "$X=X+1$" means "$X$ equals $X+1$", which is false (for example, substitution of $X=0$ gives $0=1$; substitution of $X=1$ gives $1=2$; and that's more than enough to see the falsity with crystal clarity).
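The falsity under substitution can be checked mechanically; a minimal sketch in Python (the sample values are arbitrary):

```python
# Reading "X = X + 1" as a mathematical proposition and testing it
# by substitution: it reduces to 0 = 1, so it fails for every value.
for X in [0, 1, -3, 2.5]:
    print(X == X + 1)  # False each time
```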
In the context of programming languages, the assignment operator "$X=X+1$" means "take the numerical value stored in the memory position tagged $X$, copy that value into the arithmetic processor, add $1$ to the value in the arithmetic processor, and then copy the value in the arithmetic processor back into the memory position tagged $X$".
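The load/add/store reading of assignment can be made concrete with a minimal Python sketch (the variable name and starting value are mine, purely illustrative):

```python
# "X = X + 1" as an assignment: read the current value of x,
# add 1 to it, then store the result back under the name x.
x = 5        # the memory position tagged x holds 5
x = x + 1    # load x, add 1, store the sum back into x
print(x)     # the old value 5 has been overwritten by 6
```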
As you can see, these are incompatible meanings.