Why is it that if $[L:K]$ is finite, then $L$ is algebraic over $K$?


If $[L:K]$ is finite, then $L$ is algebraic over $K$.

The author's proof is simply to state that if $[L:K] = n$, then $\{1, x, x^2, \ldots , x^n\}$, for any $x$ in $L$, contains $n+1$ elements, so it must be linearly dependent over $K$.

First of all, why must $\{1, x, x^2, \ldots , x^n\}$ be linearly dependent over $K$? I do understand that if $L=K(a_1, \ldots , a_m)$ and we pick an algebraic $a_i$, then $\{1, a_i, a_i^2, \ldots , a_i^n\}$ would be, by definition, linearly dependent over $K$. But how can we know that this works for all the elements of $L$?

Second of all, even if we find that $c_0+ c_1x+c_2x^2 + \cdots + c_{m-1}x^{m-1} \ne 0$, for some $x$ in $L$, $c_i$ in $K$, why would this mean that $[L:K]=\infty$?


I asked this question a couple of days ago, though I didn't explain my doubt in as much detail as in this post. Anyway, the replies asked if I understood the meaning of being 'linearly dependent over a field $F$'. My understanding is that a set $\{b_1, \ldots , b_n\}$ is linearly independent over $F$ if each element in $F$ can be written as $f_1b_1+\cdots+f_nb_n$ (so that the different combinations of the $b_i$ form the whole of $F$) and the only way $f_1b_1+\cdots+f_nb_n=0$ is for all the $b_i$ to be $0$ (so that we make sure all the elements in the set are needed to create $F$).

I guess they asked me that question because it is rather obvious why the author's proof works. Yet, I don't know why, it just isn't that obvious to me.


I would really appreciate any help/thoughts.

There are 4 answers below.

BEST ANSWER

Let me first rephrase the definition of linear (in)dependence.

Let $V$ be an $F$-vector space and let $s_0,s_1,\ldots, s_n$ be elements of $V$. Then the set $S=\{s_i\}$ is linearly dependent if and only if there exist $a_0,\ldots, a_n\in F$ that are not all zero (though some can be) such that $\sum a_is_i=0$. We define a set to be linearly independent when it is not linearly dependent.
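As a quick illustration (my own example, not part of the original answer): view $\mathbb{R}$ as a vector space over $F=\mathbb{Q}$. The set $\{1,\sqrt{2},\,1+\sqrt{2}\}$ is linearly dependent, because $$1\cdot 1+1\cdot\sqrt{2}+(-1)\cdot(1+\sqrt{2})=0$$ is a relation with not all coefficients zero, whereas $\{1,\sqrt{2}\}$ is linearly independent, since $a+b\sqrt{2}=0$ with $a,b\in\mathbb{Q}$ forces $a=b=0$ (otherwise $\sqrt{2}=-a/b$ would be rational).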

What you seem to be forgetting is that $L$ is a $K$-vector space of dimension $[L:K]$.

Fix any $\alpha\in L$, and recall that $\alpha$ is a vector in this $K$-vector space. It follows immediately that $\{1,\alpha,\ldots, \alpha^n\}$ is linearly dependent, because it is a list of $n+1$ vectors in an $n$-dimensional vector space. Therefore there exist $a_0,\ldots, a_n\in K$ such that $\sum a_i\alpha^i=0$, where at least one $a_i$ is non-zero. However, this statement tells us that there is a nonzero polynomial of degree at most $n$ in $K[x]$ that has $\alpha$ as a root, because that polynomial is simply $p(x)=\sum a_ix^i$. Since every element of $L$ is a root of a nonzero polynomial in $K[x]$, $L$ is an algebraic extension of $K$.
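To make this concrete, here is a small worked example (my own illustration, not part of the original answer). Take $K=\mathbb{Q}$ and $L=\mathbb{Q}(\sqrt{2})$, so $[L:K]=2$, and fix $\alpha=1+\sqrt{2}$. The three vectors $1,\alpha,\alpha^2$ sit in a $2$-dimensional $\mathbb{Q}$-vector space, so they must be dependent, and indeed $$\alpha^2=3+2\sqrt{2}=2\alpha+1, \qquad\text{so}\qquad \alpha^2-2\alpha-1=0,$$ which exhibits $\alpha$ as a root of $p(x)=x^2-2x-1\in\mathbb{Q}[x]$.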

ANSWER

First, we define the degree of $L$ over $K$ to be its dimension as a $K$-vector space. This means that any set of elements of size larger than the dimension must necessarily be linearly dependent. In particular, if $[L:K]=n$, take $a \in L$ and consider the set $\{1,a,a^2,\ldots,a^n\}$. This contains $n+1$ elements, so it must be linearly dependent over $K$. Using the coefficients from the linear dependence, define a nonzero polynomial in $K[x]$ which has $a$ as a zero.

ANSWER

You don't have the definition of "linear independence" right. Since you seem confused about that and about some other things, I'm providing a rather long-winded answer. There are other correct shorter ones here.

$b_1, \ldots, b_n$ in your "definition" below the rule are vectors in some (unnamed) vector space over the field $F$. Every linear combination of those vectors is in that vector space. No linear combination is in $F$. Independence means that the only combination that gives the $0$ vector is the one with all coefficients the $0$ element of $F$.

Above the rule, the field is $K$ and the unnamed vector space is $L$, which you can think of as a vector space over $K$. If it has dimension $n$ as a vector space then every set of $n+1$ vectors is linearly dependent. In particular, for each $a \in L$ the powers of $a$ from $a^0$ up to $a^n$ (those powers are using the field multiplication in $L$) must satisfy a linear relationship $$ k_0 + k_1a + \cdots + k_na^n = 0 $$ with the $k_i \in K$ not all zero. Then $a$ is a root of the nonzero polynomial $$ k_0 + k_1x + \cdots + k_nx^n $$ so it's algebraic over $K$. Of course the polynomial depends on the $a$ you started with. What the argument shows is that there is such a polynomial for every $a$. That settles your "first of all".
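For instance (an illustration of my own, not taken from the answer): with $K=\mathbb{Q}$, $L=\mathbb{Q}(\sqrt{2})$ and $n=2$, different elements of $L$ yield different polynomials, $$a=\sqrt{2}:\ a^2-2=0,\qquad a=1+\sqrt{2}:\ a^2-2a-1=0,\qquad a=3:\ a-3=0,$$ and that is all the argument requires: each $a\in L$ gets some nonzero polynomial in $K[x]$ of degree at most $n$ that it satisfies.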

Your "second of all" is ambiguous. If for some $x \in L$ it's true that for all $n$ there are no coefficients $c_i$ that work then $x$ is transcendental over $L$ and the powers of $x$ are an infinite independent set and $L$ is of infinite degree over $K$.

ANSWER

First question:

$[L:K]$ is the dimension of $L$ as a $K$-vector space. If this dimension is finite, it is the maximal number of linearly independent elements. Hence, if $x$ is not a root of unity in $L$ (if it were, say $x^k=1$, its powers would repeat and $x$ would already be algebraic), the infinite set $\{1,x,x^2, \dots, x^n,\dots\}$ cannot be linearly independent.

Second question:

It's not clearly formulated. I suppose you take as hypothesis that $c_0+c_1x+\dots+c_mx^m\ne 0$ for an $x$ and any finite list of coefficients $(c_0,c_1,\dots, c_m)$, not all zero. Well, this means you have finite lists of linearly independent elements $(1,x,\dots, x^m)$ of any length. This is exactly what one has to check to prove a vector space is infinite-dimensional.
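As a concrete illustration (my example, not the answerer's): take $L=K(t)$, the field of rational functions in a variable $t$ over $K$. For every $m$, $$c_0+c_1t+\cdots+c_mt^m=0\ \text{in } K(t)\iff c_0=c_1=\cdots=c_m=0,$$ so the list $(1,t,\dots,t^m)$ is linearly independent over $K$ for every length, which is exactly the check described above, and hence $[K(t):K]=\infty$.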