It is well known that there are many bases for the ring of symmetric polynomials in $k$ variables. For example, if we define $p_n = \sum_{i=1}^k x_i^n$, then every symmetric polynomial can be written as a finite sum of finite products of the $p_n$.
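For instance (a standard consequence of Newton's identities, added here for concreteness), with $k=2$ the elementary symmetric polynomial $x_1x_2$ is such a sum of products of power sums:
$$x_1x_2 = \frac{p_1^2 - p_2}{2} = \frac{(x_1+x_2)^2 - (x_1^2+x_2^2)}{2}.$$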
My question is: if we want to impose some polynomial conditions on the variables $x_1,\dots,x_k$, how does this change the basis of symmetric functions?
For example, if we take $k=2$ and impose the relation $x_1x_2=1$, then we see that every symmetric polynomial can be written as a finite sum of the $p_n$ (no products necessary). That is, if $f$ is a symmetric polynomial then $$f(x_1,x_2) = \sum_{1\leq i\leq j\leq N} a_{i,j}\left(x_1^ix_2^j + x_1^jx_2^i\right) = \sum_{1\leq i \leq j \leq N} a_{i,j} \left( x_1^{j-i} + x_2^{j-i} \right)= \sum_{1\leq i \leq j \leq N} a_{i,j} p_{j-i}(x_1,x_2)$$
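As a concrete instance of the identity above (an illustrative example, not from the original question), take $f(x_1,x_2) = x_1^3x_2 + x_1x_2^3$, i.e. $i=1$, $j=3$, $a_{1,3}=1$. Using $x_1x_2=1$:
$$x_1^3x_2 + x_1x_2^3 = x_1x_2\left(x_1^2 + x_2^2\right) = x_1^2 + x_2^2 = p_2(x_1,x_2),$$
so a degree-$4$ symmetric polynomial collapses to a single power sum, with no products needed.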
Is there a general theory that tackles these questions?
Edit: I am interested in the smallest possible (whatever "smallest" may mean) set of polynomials that additively generates the symmetric polynomials under a given condition. For example, with no condition we would need the products $\prod_j p_j^{n_j}$ for all choices of $n_j\geq0$, whereas with the example condition given we only need the $p_n$ for $n\geq0$ (with $p_0$ supplying the constants, as in the diagonal terms $i=j$ above).