Show that the affine algebraic set operation $V$ is decreasing


Let $k[x_{1},...,x_{n}]$ be the polynomial ring in $n$ variables over the field $k$ .

Define $V(J)=\{x\in{\mathbb A}^{n}: f(x)=0 \ \forall f\in J\}$, where ${\mathbb A}^{n}$ is affine space of dimension $n$ over $k$ and $J$ is an ideal of the polynomial ring $k[x_{1},...,x_{n}]$.

We show that $V$ is decreasing; that is, $V(J)\subseteq V(I)$ whenever $I\subseteq J$.

My attempt:

Suppose to the contrary that there is some point $x\in V(J)\setminus V(I)$; thus there are some polynomials $f\in J$ and $g\in I$ such that $f(x)=0$ and $g(x)\ne0$.

Note that $gf\in IJ\subseteq I$, since $I\subseteq J$ and $I,J$ are both ideals. Moreover, $(gf)(x)=g(x)f(x)=0$. Hence, by definition, $x\in V(I)$, a contradiction with $x\notin V(I)$.

Can anyone check my proof for validity? Any comment or advice will be appreciated. Thanks for considering my request.

Best answer:

The problem starts with “there are some polynomials $f$ and $g$ such that..”

Rather, there is a polynomial $g\in I$ such that $g(x) \neq 0$, and for every polynomial $f\in J$, $f(x) = 0$.

The second problem is with the conclusion. You found a polynomial in $I$ that vanishes at $x$. This doesn’t contradict the assumption $x \notin V(I)$.

There is a much simpler approach. Take $x \in V(J)$. Then for every polynomial $f\in J$, $f(x) = 0$. Since $J$ contains $I$, we get that for every polynomial $f\in I$, $f(x) = 0$ and therefore that $x \in V(I)$.
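The direct argument above is abstract enough that it can be stated for arbitrary sets of functions, with no ideal structure needed. A minimal sketch in Lean 4 (with Mathlib, using a hypothetical vanishing-set definition `Z` in place of $V$ to keep the example self-contained):

```lean
import Mathlib.Data.Set.Basic

-- Vanishing set of a set `S` of functions `α → k`: the points
-- where every function in `S` vanishes.
def Z {α k : Type} [Zero k] (S : Set (α → k)) : Set α :=
  {x | ∀ f ∈ S, f x = 0}

-- `Z` is inclusion-reversing: enlarging the set of functions
-- shrinks the common zero locus.
theorem Z_antitone {α k : Type} [Zero k] {S T : Set (α → k)}
    (h : S ⊆ T) : Z T ⊆ Z S := by
  intro x hx f hf
  -- `f ∈ S ⊆ T`, and `x` kills everything in `T`.
  exact hx f (h hf)
```

The proof term mirrors the prose exactly: given $x \in Z(T)$ and $f \in S$, membership $f \in T$ follows from $S \subseteq T$, so $f(x) = 0$.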

P.S. With the correct reasoning, your approach can be made into a proof by contraposition: if $x \notin V(I)$, pick $g \in I$ with $g(x) \neq 0$; since $g \in I \subseteq J$, this shows $x \notin V(J)$.