Is any finite-dimensional extension of a field, say $F$, algebraic and finitely generated?


As in the title: is every finite-dimensional extension of a field, say $F$, algebraic and finitely generated?

Say $K/F$ is a finite extension, meaning $K$ is a finite-dimensional vector space over $F$. Clearly this implies that $K$ is finitely generated over $F$, since any basis is a finite generating set. So every finite extension is finitely generated.

So indeed they all are, is my logic correct?

There are 4 answers below.

BEST ANSWER

By definition, a field extension of finite degree is finitely generated: the degree is the dimension of $K$ as an $F$-vector space, so any basis is a finite spanning set and in particular a finite generating set.

Now let $K/F$ be a finite extension. Consider an arbitrary $a \in K$ and the evaluation homomorphism $\text{ev}_a:F[x] \rightarrow K$ defined by $g(x) \mapsto g(a)$. This map is $F$-linear and cannot be injective, because $F[x]$ is infinite-dimensional over $F$ while $K$ is finite-dimensional. So the kernel of this map must be a nonzero ideal. This is to say: for any $a \in K$ we can find nonzero polynomials in $F[x]$ with $a$ as a root, namely the nonzero elements of $\ker(\text{ev}_a)$. In other words, $K/F$ is an algebraic extension.
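Concretely, the kernel of $\text{ev}_a$ is generated by the minimal polynomial of $a$ over $F$. A small sketch with sympy, taking $a = 1 + \sqrt{2}$ in $K = \mathbb{Q}(\sqrt{2})$ as an illustrative choice:

```python
from sympy import sqrt, minimal_polynomial, expand
from sympy.abc import x

# For a = 1 + sqrt(2) in K = Q(sqrt(2)), the kernel of ev_a is the
# ideal of Q[x] generated by the minimal polynomial of a over Q.
a = 1 + sqrt(2)
p = minimal_polynomial(a, x)

# a is a root of p, i.e. p lies in ker(ev_a):
assert expand(p.subs(x, a)) == 0
```

Here $p = x^2 - 2x - 1$, a nonzero element of the kernel, witnessing that $a$ is algebraic over $\mathbb{Q}$.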


A similar but easier answer to the one given above is as follows.

To show that a finite extension $K/F$ is algebraic, we must show that every element of $K$ is a root of a nonzero polynomial over $F$.

Suppose $[K : F] = n$ and choose $\alpha\in K$. Then consider the elements $1,\alpha,\alpha^2,...,\alpha^n$.

This is a list of $n+1$ elements in an $n$-dimensional $F$-vector space, so it must be linearly dependent. Thus there exist $a_0,a_1,...,a_n\in F$, not all zero, such that $a_n \alpha^n + ... + a_2\alpha^2 + a_1\alpha + a_0 = 0$.

But then $\alpha$ is a root of the nonzero polynomial $a_nx^n + ... + a_2x^2 + a_1x + a_0$ over $F$.
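This dependence relation can be computed explicitly. A small sketch with sympy, taking $\alpha = \sqrt{2}$ and the basis $\{1, \sqrt{2}\}$ of $\mathbb{Q}(\sqrt{2})$ as illustrative choices:

```python
from sympy import Matrix, sqrt, expand

# K = Q(sqrt(2)) has F-basis {1, sqrt(2)}, so n = 2.  Write the
# coordinates of the n+1 powers 1, alpha, alpha**2 as columns:
alpha = sqrt(2)
M = Matrix([[1, 0, 2],    # coefficient of 1
            [0, 1, 0]])   # coefficient of sqrt(2)

# n+1 columns in an n-dimensional space must be linearly dependent:
(c,) = M.nullspace()      # a nonzero dependence relation (a_0, a_1, a_2)

# alpha is a root of a_0 + a_1*x + a_2*x**2:
assert expand(sum(ci * alpha**i for i, ci in enumerate(c))) == 0
```

Here the nullspace vector $(-2, 0, 1)$ recovers the expected polynomial $x^2 - 2$.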


Other answers provide nice proofs; here is a very short one based on the multiplicativity of the degree over field towers. If $ K/F $ is a finite extension and $ \alpha \in K $, then $ F(\alpha) $ is a subfield of $ K $, and we have a tower of fields $ F \subseteq F(\alpha) \subseteq K $. The Tower Law then asserts that

$$ [F(\alpha):F][K:F(\alpha)] = [K:F] $$

Since $ K/F $ is finite, the RHS is finite by definition. On the other hand, the equality shows that $ [F(\alpha):F] $ divides $ [K:F] $, so in particular it is at most $ [K:F] $ and therefore finite. Hence $ \alpha $ is algebraic over $ F $.


As others have said, by definition a field extension $K$ over $F$ has finite degree if and only if it is finitely generated as an $F$-vector space, because its degree $n$ is the minimum number of vectors $v_1,v_2,...,v_n \in K$ needed to form a spanning set over the underlying field $F$. That is, if the degree is finite, then finitely many vectors suffice, so the extension is finitely generated; conversely, if we can finitely generate the extension, then the vector space $K$ has finite degree, equal to the minimum number of vectors we need.

However, regarding the second part of your question about all finite extensions being algebraic:

I believe many of the common proofs of this statement that I have seen online rest on unjustified premises. So I will share what I think is an elegant, elementary and self-contained proof, adapted from:

https://proofwiki.org/wiki/Size_of_Linearly_Independent_Subset_is_at_Most_Size_of_Finite_Generator

Let $K$ be a finitely generated vector space / extension over $F$. This means we can find finitely many vectors in $K$ that span $K$ with coefficients drawn from $F$.

Let $ v_1, v_2, ..., v_n$ be such a spanning set for $K$ over $F$, with $n$ minimal; that is, $n$ is the smallest positive integer such that $n$ vectors suffice to span $K$ over $F$. (A minimal spanning set is in fact a basis for $K$ over $F$.)

Let $x_1, x_2, ..., x_r \in K $ be a sequence of non-zero vectors with $r>n$. We will prove by contradiction that these vectors must be linearly dependent. So let us begin by assuming that they are a linearly independent collection of vectors.

We know that there exist $a_1, a_2, ..., a_n \in F$ s.t.

$$ x_1 = a_1v_1 + a_2v_2 + ... + a_nv_n$$

Therefore the terms of the finite sequence $(x_1, v_1, v_2, v_3, ..., v_n)$ are linearly dependent. Now consider this beautiful idea: scanning from the left-most term rightwards, let $v_i$ be the first term of this ordered sequence that can be expressed as a linear combination of the terms strictly to its left. This term cannot be $x_1$, as we have positioned it as the first term of the sequence, so no entries precede it.

Note: since $x_1$ can be expressed as a linear combination of the terms $v_1, ..., v_n$ (which span $K$), such an index must exist: we will eventually encounter some $i \in \{1,2,3,...,n\}$ such that $v_i$ satisfies our criterion. Indeed, if $1 \leq j_1 < j_2 < ... < j_f \leq n$ are the indices such that $a_{j_1},a_{j_2},...,a_{j_f} \in F$ are all non-zero and $$ x_1 = a_{j_1}v_{j_1} + a_{j_2}v_{j_2} + ... + a_{j_f}v_{j_f}$$

holds, then exactly $i = j_f$ is the index we are looking for, as we can write $v_{j_f}$ as a linear combination of the preceding terms:

$$ a_{j_f}^{-1}(x_1 - a_{j_1}v_{j_1} - a_{j_2}v_{j_2} - ... - a_{j_{f-1}}v_{j_{f-1}})= v_{j_f} = v_i$$

We now throw this superfluous term $v_i$ out of our sequence and still have a spanning collection of $n$ vectors (one can check that this new sequence is again a basis). So we obtain $S_1 = \{x_1, v_1, ..., v_{i-1}, v_{i+1}, ..., v_n\}$.

Now, inductively, we repeat the same procedure. Consider the sequence of $n+1$ terms $(x_2, x_1, v_1, ..., v_{i-1}, v_{i+1}, ..., v_n)$. We repeat our argument: $x_2$ cannot be a linear combination of terms to its left, as no terms precede it. Moreover, by our assumption that $x_1, x_2, ..., x_r$ are independent, $x_1$ cannot be a linear combination of the terms before it either. We must, however, eventually come across the term we are searching for, as $x_2$ can be written as a linear combination of terms from $S_1$. So we drop the redundant term and get a spanning set $S_2$ consisting of $x_2$, $x_1$, and $n-2$ of the original vectors $v_j$; it is again $n$ terms long, as we have added two terms ($x_1$ and $x_2$) and dropped two of our original basis vectors.

Inductively, we repeat this argument until we reach the final spanning set $S_n = \{x_n,x_{n-1},...,x_2,x_1\}$. Now, recall that we assumed $r>n$, so we still have some vectors $x_{n+1},x_{n+2},...,x_{r}$ left. But since $S_n$ spans $K$, we can express each of these remaining vectors as a linear combination of $x_1,...,x_n$, directly contradicting our assumption that $x_1,...,x_r$ are linearly independent. We have therefore shown that in a finite extension of degree $n$, any collection of $r$ vectors with $r>n$ is linearly dependent. Now letting $\alpha \in K$, we see that a non-trivial solution to:

$0 = a_0 + a_1\alpha + a_2\alpha^2 + ... + a_n\alpha^n$

must exist by the linear dependence of $1,\alpha, \alpha^2, ..., \alpha^n$ (as here we have $n+1 > n$ terms). Hence every element of a field extension of finite degree is algebraic; i.e. a finitely generated extension / an extension of finite degree is algebraic.
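The exchange procedure above can be sketched in code. A minimal sketch using sympy matrices for exact arithmetic; the function name `steinitz_exchange` and the coordinate representation of the vectors are assumptions for illustration:

```python
from sympy import Matrix

def steinitz_exchange(spanning, new_vecs):
    """Insert each new vector at the front of the list, then discard the
    first term that is a linear combination of its predecessors, as in
    the argument above.  Each step preserves the span."""
    S = list(spanning)
    for v in new_vecs:
        S.insert(0, v)
        for j in range(1, len(S)):
            prefix = Matrix.hstack(*S[:j])
            # S[j] depends on S[0..j-1] iff appending it keeps the rank
            if Matrix.hstack(prefix, S[j]).rank() == prefix.rank():
                del S[j]
                break
    return S

# Example: exchange two independent vectors into the standard basis of F^2
basis = [Matrix([1, 0]), Matrix([0, 1])]
S = steinitz_exchange(basis, [Matrix([1, 1]), Matrix([1, -1])])
assert len(S) == 2 and Matrix.hstack(*S).rank() == 2
```

After both exchanges, the new vectors have entirely replaced the original basis while the list still spans the space, exactly as in the proof.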