Problem with a proof where algebraic extensions are assumed to be finite extensions


I'm reading the article "Integration in Finite Terms" by Maxwell Rosenlicht and I have a problem with one step in a proof. Rosenlicht wants to prove the following: if $F$ is a differential field of characteristic zero and $K$ is an algebraic extension field of $F$, then the derivation on $F$ can be extended to a derivation on $K$, and this extension is unique. After proving uniqueness, Rosenlicht continues as follows: "We now show that such a [differential field] structure on $K$ exists. Using the usual field-theoretic arguments, we may assume that $K$ is a finite extension of $F$, so that we can write $K=F(x)$, for a certain $x\in K$."
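
For context, here is the standard computation in the finite case, summarized by me (it is not a quotation from the article): if $K=F(x)$ and $p(X)=\sum_{i=0}^{n}a_iX^i$ is the minimal polynomial of $x$ over $F$, then applying a candidate extension $D$ to the relation $p(x)=0$ forces
$$0=D\bigl(p(x)\bigr)=\sum_{i=0}^{n}(Da_i)\,x^i+p'(x)\,Dx,\qquad\text{i.e.}\qquad Dx=-\frac{\sum_{i=0}^{n}(Da_i)\,x^i}{p'(x)},$$
which makes sense because $p'(x)\neq 0$ in characteristic zero; this formula is then used to define the extension on $F(x)$.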

Not being an expert in the theory of fields, I don't understand which "usual" arguments he is talking about. An algebraic extension isn't necessarily finite, so why can we assume this here?

Best answer:

Let $K/F$ be an algebraic extension and let $\mathscr F$ denote the set of finite subextensions of $K/F$, that is, the set of subfields $E$ of $K$ containing $F$ such that $E/F$ is a finite extension.

For every $E\in\mathscr F$ your text proves the existence of one and only one derivation $d_E:E\to E$ extending the given derivation on $F$. Since $K=\bigcup\mathscr F$ and, by uniqueness, any two of these derivations agree wherever both are defined, there exists one (and only one) function $d:K\to K$ such that $d|_E=d_E$ for all $E\in\mathscr F$, and this is the required derivation.
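
To spell out the compatibility (my elaboration of the step above, not part of the original answer): for $a\in K$ and any $E\in\mathscr F$ containing $a$, applying $d_E$ to the minimal polynomial of $a$ shows $d_E(a)\in F(a)$, so $F(a)$ is stable under $d_E$ and $d_E|_{F(a)}$ is a derivation of $F(a)$ extending the one on $F$; by uniqueness it equals $d_{F(a)}$. Hence
$$d_E(a)=d_{F(a)}(a)\quad\text{for every }E\in\mathscr F\text{ with }a\in E,$$
so $d(a):=d_{F(a)}(a)$ is well defined, and for $a,b\in K$ one checks additivity and the Leibniz rule inside the finite subextension $F(a,b)\in\mathscr F$.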