Let $L=\{0,1,+,-, \cdot \}$.
Show: for every $L$-term $t=t(x_1,x_2, \cdots, x_n)$, there exists a polynomial $P_t \in \mathbb{Z}[X_1, \cdots, X_n]$ such that for every commutative unital ring $\mathcal{R}$ viewed as an $L$-structure, we have $t^{\mathcal{R}}=P_t^{\mathcal{R}}$ (that is, $t$ and $P_t$ induce the same function $R^n \to R$). And conversely, every polynomial $P \in \mathbb{Z}[X_1, \cdots, X_n]$ is of the form $P_t$ for some $L$-term $t=t(x_1,x_2, \cdots, x_n)$.
For the first part of the question, we are given an $L$-term $t=t(x_1,x_2, \cdots, x_n)$. By the definition of an $L$-term, we know:
1) every variable is an $L$-term,
2) every constant symbol of $L$ is an $L$-term,
3) if $f \in L^F$ is an $n$-ary function symbol and $t_1, \cdots, t_n$ are $L$-terms, then $f(t_1, \cdots, t_n)$ is an $L$-term.
But I'm not sure what exactly we are trying to show here. Is it clear that by 3), we are done?
And for the converse statement, I'm not sure what I'm supposed to show either.
Any hints or ideas would be greatly appreciated!
Thanks!
I think it's worth taking a second to discuss what "terms" are in this context. From your question it looks like you might be slightly confused about this, and it's causing you to overcomplicate the rest of the question.
A term in a language $\mathcal{L}$ is (as you said) built up from variables and the constant/function symbols of the language.
I feel the definition is best conveyed through examples, so let's take a second to look at some:
In $\mathcal{L}_\text{group} = (\cdot, {}^{-1}, e)$, we have terms like $e$, $x \cdot y$, $(x \cdot y)^{-1}$, and $x^{-1}y^{-1}xy$.
Notice terms coincide with a certain kind of function. Once we have a group $G$ in mind, then the term $x^{-1}y^{-1}xy$ defines a function $G^2 \to G$ where we simply substitute the actual group elements for the variables and simplify.
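To make "substitute and simplify" precise, one can define the function $t^G$ by recursion on the structure of $t$. Here is a sketch of the standard clauses, for a term whose variables are among $x_1, \dots, x_n$:

$$\begin{align*}
x_i^{\,G}(a_1, \dots, a_n) &= a_i \\
e^{\,G}(a_1, \dots, a_n) &= e^G \quad \text{(the identity element of } G \text{)} \\
(t_1 \cdot t_2)^{\,G}(a_1, \dots, a_n) &= t_1^{\,G}(a_1, \dots, a_n) \cdot t_2^{\,G}(a_1, \dots, a_n) \\
(t_1^{-1})^{\,G}(a_1, \dots, a_n) &= \left( t_1^{\,G}(a_1, \dots, a_n) \right)^{-1}
\end{align*}$$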
Now let's look at $\mathcal{L}_\text{ring} = (-, +, \times, 0, 1)$. Ideally, we should find that the terms are exactly the polynomials with coefficients in $\mathbb{Z}$. Here we have terms like $x \times x$, $(x + y) \times (x + (-y))$, and $1 + 1 + 1$.
Do you see that these are all polynomials? If you play around with some terms, you should quickly get an intuitive understanding for why you always get a polynomial. Of course, intuition does not make a proof, so how can we formally show that the terms are always polynomials with coefficients in $\mathbb{Z}$?
Since terms are defined recursively, we will prove things about them inductively. As you mentioned, terms are variables, constant symbols, or function symbols applied to (simpler) terms.
So let's try to show that each of these is a polynomial with integer coefficients!
The function defined by a variable $x$ is indeed a polynomial with coefficients in $\mathbb{Z}$: every variable is automatically a polynomial of degree $1$.
The constant symbols are $0$ and $1$, both of which correspond to (degree $0$) polynomials with coefficients in $\mathbb{Z}$.
Lastly, if $t_1$ and $t_2$ are terms, we must show that $t_1 + t_2$, $-t_1$, and $t_1 \times t_2$ are all polynomials too. By induction, $t_1$ and $t_2$ are both polynomials with coefficients in $\mathbb{Z}$. Then negating, adding, and multiplying polynomials gives another polynomial.
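These three cases can be packaged into an explicit recursive definition of $P_t$, which is one natural way to organize the induction:

$$\begin{align*}
P_{x_i} = X_i, \qquad P_0 = 0, \qquad P_1 = 1, \\
P_{-t} = -P_t, \qquad P_{t_1 + t_2} = P_{t_1} + P_{t_2}, \qquad P_{t_1 \times t_2} = P_{t_1} \cdot P_{t_2}
\end{align*}$$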
So by induction on the structure of terms, we're done ^_^
What about the other direction, though? Why can every polynomial with coefficients in $\mathbb{Z}$ be written as a term?
Each polynomial looks like the sum of things of the form $a x_1^{n_1} x_2^{n_2} \ldots x_k^{n_k}$. Since we know we're allowed to take sums, it suffices to show that each of these is a term. But these all look like products of variables (which we can make) and some integer $a$.
So it suffices to show we can find a term for every integer $a$... Of course, we have access to a constant $1$ and an operation for negation. Do you see how to finish the proof from here?
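(If you want to check your answer: one standard choice of a term $t_a$ representing the integer $a$ is

$$t_a = \begin{cases} 0 & \text{if } a = 0 \\ \underbrace{1 + 1 + \cdots + 1}_{a \text{ times}} & \text{if } a > 0 \\ -\,t_{-a} & \text{if } a < 0 \end{cases}$$

and an easy induction shows $P_{t_a} = a$ as a constant polynomial.)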
I hope this helps ^_^