I've noticed certain arguments in analysis textbooks which rely on the principle of being able to pick elements infinitely many times. For example, an argument might go "Pick $x_1\in S$ such that $P(x_1)$. Having picked $x_1,x_2,\dots,x_k\in S$, pick $x_{k+1}$ such that $P(x_{k+1})$. This justifies the existence of a sequence in $S$ having properties X, Y, and Z." But this is not entirely induction; this is also recursion (and in some cases AC is used). And the recursion theorem/AC are totally absent from most textbooks. So how is this mathematical argument justified? Why doesn't the writer care about being explicit with the principles being used?
When I was an undergrad, this gave me anxiety.
This is basically a long comment: the thing is that constructing sequences in the manner you described ("Having picked $x_1,x_2,\ldots,x_k\in S$, pick $x_{k+1}$, ...") is much simpler to grasp than making every single statement formal. Basically we are using the following theorem (or some suitable variant):

Theorem. Let $S$ be a nonempty set, and suppose that for every finite sequence $(x_1,\ldots,x_n)$ in $S$ there exists $x\in S$ such that $P(x)$ holds. Then there exists an infinite sequence $(x_n)_{n\in\mathbb{N}}$ in $S$ such that $P(x_n)$ holds for every $n$.
The proof can be left as an exercise for anyone advanced enough to care about it, and as far as I know it depends on the Axiom of Choice. It can be done as follows (from memory, no reference): say we are working with real numbers. Recall that a finite sequence is simply a function $f:\left\{1,\ldots,N\right\}\to\mathbb{R}$, and an infinite sequence is a function $f:\mathbb{N}\to\mathbb{R}$.
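To sketch how the Axiom of Choice and the recursion theorem combine here (this is my own outline, not taken from a specific reference): the hypothesis guarantees that at every stage the set of admissible next elements is nonempty, AC selects one uniformly, and the recursion theorem glues the selections into a single infinite sequence.

```latex
\begin{align*}
&\text{For each finite sequence } s \text{ in } S, \text{ the set } E(s) = \{x \in S : P(x)\} \text{ is nonempty by hypothesis.}\\
&\text{By AC there is a choice function } g \text{ on finite sequences with } g(s) \in E(s) \text{ for every } s.\\
&\text{By the recursion theorem there is a unique } f:\mathbb{N}\to S \text{ satisfying}\\
&\qquad f(1) = g(\varnothing), \qquad f(n+1) = g\bigl(f\restriction\{1,\ldots,n\}\bigr).\\
&\text{Setting } x_n := f(n) \text{ gives an infinite sequence with } P(x_n) \text{ for all } n.
\end{align*}
```

Note that it is exactly the step "pick $g$" that uses AC; the recursion theorem itself is a theorem of ZF and only needs $g$ to already exist as a single function.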
This is one of those things some people would call "part of the folklore": something that basically everybody knows, but no one took the time to write down or really cared about the details.