Prove a set $V$ is not algebraic

I need help showing that the set $V = \{(a,b) \in \mathbb{A}^2(\mathbb{C})\ \vert \ \vert a\vert^2 + \vert b\vert^2 = 1\}$ is not an algebraic subset of complex affine 2-space.

I believe that I can just choose two complex numbers $a$ and $b$ whose squared moduli sum to $1$ and show that $(a,b) \neq (0,0)$. For example, if I choose $a = i$ and $b=0$, then $$\vert a\vert^2 + \vert b\vert^2 = \vert i\vert^2 + \vert 0\vert^2 = 1^2 = 1,$$ but $(a,b) = (i,0) \neq (0,0)$, and we are done.

Is this approach correct? Thanks in advance.

Best answer:

If this set were algebraic, then its intersection with the line $\{b=0\}$ would be an algebraic subset of $\Bbb A^1(\Bbb C)$. But that intersection is $\{a\in\Bbb C:|a|^2=1\}$, the unit circle, which is infinite; and apart from $\Bbb A^1(\Bbb C)$ itself, the algebraic subsets of $\Bbb A^1(\Bbb C)$ are all finite.
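To see concretely why that intersection is too big, here is a small numerical sketch (Python, standard library only; the variable names are my own) that samples points on the unit circle and recalls the root bound for one-variable polynomials:

```python
import cmath

# V intersected with {b = 0} is the unit circle {a in C : |a|^2 = 1}.
# Sample a dozen of its (infinitely many) points a = e^{2*pi*i*k/12}.
circle_points = [cmath.exp(2j * cmath.pi * k / 12) for k in range(12)]
assert all(abs(abs(a) ** 2 - 1) < 1e-12 for a in circle_points)

# A nonzero one-variable polynomial of degree d has at most d roots,
# so any polynomial vanishing on all of the circle (not just these
# samples) must be the zero polynomial -- the circle cannot be a
# proper algebraic subset of A^1(C).
print(len(circle_points))
```

The sampling is only illustrative, of course; the actual argument uses the fact that the full circle is an infinite set.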

Another answer:

Whether or not $(a,b) = (0,0)$ has nothing to do with anything. For example $\{xy = 1\}$ is algebraic and doesn't contain $(0,0)$. Recall that a set $V$ is algebraic if it is the vanishing set of some collection of polynomials.

There are a couple of strategies to show that a set is not algebraic. First, you can take a polynomial that vanishes on $V$ and show that it must also vanish at some point not in $V$ (where "must" means this holds for every polynomial that vanishes on $V$, not just a specific one). Think about why this works: suppose $V = V(f_1,\dots,f_r)$ and there is a point $p \notin V$ at which every polynomial vanishing on $V$, in particular each $f_i$, vanishes; then $p \in V(f_1,\dots,f_r) = V$, a contradiction.

Another strategy is to consider dimensions, degrees, and the like. A common example for this is the set $\{(t,\sin(t)) : t \in \mathbf{R}\} \subseteq \mathbf{R}^2$. When you intersect it with the line $\{y = 0\}$ you get infinitely many points. But what sort of one-variable polynomial has infinitely many zeroes?
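The $(t,\sin t)$ example can be checked the same way; here is a quick sketch (Python, standard library only; names are my own) exhibiting the infinitely many intersection points:

```python
import math

# The curve {(t, sin t)} meets the line {y = 0} at every t = k*pi.
intersection_ts = [k * math.pi for k in range(-5, 6)]
assert all(abs(math.sin(t)) < 1e-9 for t in intersection_ts)

# If the curve were algebraic, some nonzero polynomial f(x, y) would
# vanish on it, and then f(x, 0) would be a one-variable polynomial
# with infinitely many roots x = k*pi.  That forces f(x, 0) = 0
# identically, so the vanishing set of f would contain the whole
# x-axis, which the curve does not.
print(len(intersection_ts))
```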

So, what sort of polynomials vanish on your set? That is the first question you want to ask.