I'm reading a bit about Nelson's version of nonstandard analysis (IST), and in the notes it is said that [$n$ is standard]$\implies$[$n+1$ is standard]. Right after that it is mentioned that an inductive proof (using the fact that $0$ is standard as the base case) is impossible, because it would require induction over an external set: "$n$ is standard" is not an internal formula. I have two questions about this:
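For reference, here is the restriction as I understand it: the induction schema is only available for internal formulas $\varphi$,
$$\bigl[\varphi(0)\wedge\forall n\,(\varphi(n)\to\varphi(n+1))\bigr]\;\to\;\forall n\,\varphi(n),$$
whereas "$n$ is standard", i.e. $\operatorname{st}(n)$, is external, so the collection $\{n\in\mathbb{N}:\operatorname{st}(n)\}$ need not be a set of the internal universe and the schema cannot be applied to it.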
My proof attempt: we're given a standard $n\in\mathbb{N}$ and the internal formula $A(x)$ defined by $[x=n+1]$ (the free variable is $x$; $n$ is a standard parameter). By a version of the transfer principle, $\forall^{\text{st}}n\,[\exists^{\text{st}}x\,A(x)\iff \exists x\,A(x)]$, a witness $x$ can be chosen standard, and since $n+1$ is the unique $x$ satisfying $A(x)$, it must itself be standard. Is this correct?
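Spelled out, the intended chain of implications is (my own reconstruction):
$$\exists x\,[x=n+1]\;\overset{\text{transfer}}{\Longrightarrow}\;\exists^{\text{st}}x\,[x=n+1]\;\overset{\text{uniqueness}}{\Longrightarrow}\;\operatorname{st}(n+1).$$
The transfer step is legitimate because $x=n+1$ is internal and its only parameter $n$ is standard; the uniqueness step uses that any standard witness of $x=n+1$ must equal $n+1$.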
Since any object that can be uniquely described in internal mathematics is standard, why exactly is it untrue that every natural number is standard? After all, no matter what $n\in\mathbb{N}$ I'm given, I can uniquely describe it using the usual von Neumann encoding $\left\{ \emptyset,\left\{ \emptyset \right\},\dots \right\}$. What exactly is wrong with this approach of dealing with each natural number separately?
Your argument that $n+1$ is standard if $n$ is seems correct. I would suggest a different approach: show that evaluating a standard function at a standard value produces a standard value; the result then follows by taking $f(x) = x+1$, which is standard because it is uniquely defined by an internal formula with no parameters.
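A sketch of that lemma, using the same transfer argument as in the question (my phrasing): if $f$ and $x$ are standard, the formula $y=f(x)$ is internal with only standard parameters, so
$$\exists y\,[y=f(x)]\;\overset{\text{transfer}}{\Longrightarrow}\;\exists^{\text{st}}y\,[y=f(x)],$$
and since $f(x)$ is the unique such $y$, it is standard. Applying this with the standard function $f\colon\mathbb{N}\to\mathbb{N}$, $f(x)=x+1$, recovers the claim that $n$ standard implies $n+1$ standard.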
For your second question, are you sure every natural number can be written as $\{ \varnothing, \{ \varnothing \}, \ldots \}$?
You're making a level slip here: every natural number within an IST universe can indeed be expressed by a string over the symbols $\{$, $\}$, $\varnothing$ that one can construct within the IST universe. But that does not imply it can be expressed by an external string over the same symbols.
(In fact, depending precisely on your choice of foundations, you might even have standard natural numbers that can't be expressed by external strings.)
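To illustrate the gap (an example of my own, not part of the answer above): take a nonstandard $\omega\in\mathbb{N}$. Internally there is a string of the symbols $\{$, $\}$, $\varnothing$ denoting $\omega$, namely the fully written-out von Neumann set
$$\{\varnothing,\{\varnothing\},\{\varnothing,\{\varnothing\}\},\dots\}\quad(\omega\text{ elements}),$$
but this internal string has nonstandard length. An external string, by contrast, is something you can actually write down in the metatheory, so it has externally finite length; no such string denotes $\omega$. Internal expressibility quantifies over internal strings of arbitrary internal length, which is exactly the level slip.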