I read this short article http://www.math.jhu.edu/~smahanta/Teaching/Spring10/Stillwell.pdf. The goal of the article is to show the unsolvability by radicals of certain polynomial equations of degree $\geq 5$. But at the bottom of page 23 it says that any permutation $\sigma$ of $x_1,\dots , x_n$ extends to a permutation of $\mathbb{Q}(x_1,\dots ,x_n)$ defined by
$$\sigma f(x_1,\dots ,x_n)=f(\sigma(x_1), \dots , \sigma(x_n))$$ for each rational function $f$. I'm not convinced that this is well defined; in fact, I'd say it isn't. For example, the right-hand side could involve division by zero.
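To make the division-by-zero worry concrete, here is the kind of example I have in mind (my own, not taken from the article). Take the fixed polynomial $(x^2-2)(x^2-3)$ with roots $x_1=\sqrt{2}$, $x_2=-\sqrt{2}$, $x_3=\sqrt{3}$, $x_4=-\sqrt{3}$, and let $\sigma$ be the transposition swapping $x_2$ and $x_3$. Then $f=\frac{1}{x_1+x_3}$ is a legitimate element of $\mathbb{Q}(x_1,\dots ,x_4)$, since $\sqrt{2}+\sqrt{3}\neq 0$, yet
$$\sigma f=\frac{1}{\sigma(x_1)+\sigma(x_3)}=\frac{1}{x_1+x_2}=\frac{1}{\sqrt{2}-\sqrt{2}}=\frac{1}{0},$$
which is not defined.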
Is anyone willing to read the first two pages of the article and think about this claim? I have several issues with this definition, and I believe it is simply wrong, even if one imposes additional conditions on it. The author is a very good mathematician, so I don't rule out that I'm having some kind of brainfart and misunderstanding what he is trying to explain.
Thank you in advance!
Edit: The $x_i$'s are the roots of some fixed polynomial, not independent variables. See the first two pages of the article (an easy read) for the precise notation.
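To illustrate why I also worry about well-definedness (again my own example, not from the article): with the roots $x_1=\sqrt{2}$, $x_2=-\sqrt{2}$, $x_3=\sqrt{3}$, $x_4=-\sqrt{3}$ of $(x^2-2)(x^2-3)$, the element $\sqrt{2}$ has two different expressions, namely $x_1$ and $-x_2$. If $\tau$ is the transposition swapping $x_1$ and $x_3$, then
$$\tau(x_1)=x_3=\sqrt{3},\qquad \tau(-x_2)=-x_2=\sqrt{2},$$
so the value of $\tau$ on the element $\sqrt{2}$ depends on which expression for it I choose.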