Class field theory (generalization to infinite extensions), Neukirch, general reciprocity law


My question is about Exercise 4 in Neukirch, Chapter IV, §6 (General reciprocity law), page 305.

Let $G$ be a profinite group and $A$ a $G$-module.

Let $(d:G\to \hat{\mathbb Z}, v:A_k\to \hat{\mathbb Z})$ be a class field theory. We assume the kernel $U_K$ of $v_K:A_K\to\hat{\mathbb Z}$ is compact for every finite extension $K\mid k$. For an infinite extension $K\mid k$, put $\hat A_K=\varprojlim A_{K_\alpha}$, where $K_\alpha\mid k$ ranges over the finite subextensions of $K\mid k$ and the projective limit is taken with respect to the norm maps $N_{K_\beta\mid K_\alpha}:A_{K_\beta}\to A_{K_\alpha}$. Now I have to show: if $L\mid K$ is a finite extension, then there exists an inclusion map $i_{L\mid K}:\hat A_K\to\hat A_L$.
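For orientation (an illustration on my part, not part of the exercise): the prototype of this formalism is local class field theory, where for a local field $k$ one takes
\begin{equation*}
A=\overline k^{\,\times},\qquad A_K=K^\times,\qquad v_K=\text{the normalized valuation of }K,\qquad U_K=\mathcal O_K^\times,
\end{equation*}
$d:G\to\hat{\mathbb Z}$ is the projection onto $\mathrm{Gal}(\tilde k\mid k)\cong\hat{\mathbb Z}$ for the maximal unramified extension $\tilde k\mid k$, and the maps $N_{K_\beta\mid K_\alpha}$ are the usual field norms. Here $U_K=\mathcal O_K^\times$ is indeed compact, so the compactness hypothesis of the exercise is satisfied.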

I need a hint on how to define the map $i_{L\mid K}$. While it is quite easy to establish a norm map $N_{L\mid K}:\hat A_L\to \hat A_K$, I have no idea how to construct the inclusion map.

Accepted answer:

We define the inclusion map by $i_{L\mid K}(x)=y$ with
\begin{equation*}
y_{L_\alpha}=\frac{[L:K]}{[L_\alpha:K_\alpha]}\,x_{K_\alpha},
\end{equation*}
where $L_\alpha\mid k$ runs through the finite subextensions of $L\mid k$ and $K_\alpha:=L_\alpha\cap K$. Clearly $K_\alpha\mid k$ is a finite subextension of $K\mid k$.

To check that $y=(y_{L_\alpha})$ is a compatible family, let $L_\beta\mid k$ be a further finite subextension of $L\mid k$ containing $L_\alpha$. Then $K_\alpha\mid k$ is a subextension of $K_\beta\mid k$, where $K_\beta:=K\cap L_\beta$. We have to show $N_{L_\beta\mid L_\alpha}(y_{L_\beta})=y_{L_\alpha}$.

First note that for $z\in A_{K_\beta}\subset A_{L_\beta}$ we have
\begin{equation*}
N_{L_\beta\mid L_\alpha}(z)=N_{K_\beta L_\alpha\mid L_\alpha}\bigl(N_{L_\beta\mid K_\beta L_\alpha}(z)\bigr)=[L_\beta:K_\beta L_\alpha]\,N_{K_\beta\mid K_\alpha}(z),
\end{equation*}
since $z$ is fixed by $G_{K_\beta L_\alpha}$, so $N_{L_\beta\mid K_\beta L_\alpha}(z)=[L_\beta:K_\beta L_\alpha]\,z$, and $N_{K_\beta L_\alpha\mid L_\alpha}$ agrees with $N_{K_\beta\mid K_\alpha}$ on such $z$. Using $N_{K_\beta\mid K_\alpha}(x_{K_\beta})=x_{K_\alpha}$, we thus get
\begin{equation*}
N_{L_\beta\mid L_\alpha}(y_{L_\beta})=[L_\beta:K_\beta L_\alpha]\,N_{K_\beta\mid K_\alpha}(y_{L_\beta})=\frac{[L:K]\,[L_\beta:K_\beta L_\alpha]}{[L_\beta:K_\beta]}\,x_{K_\alpha}.
\end{equation*}

On the one hand $[L_\beta:L_\alpha]=[L_\beta:K_\beta L_\alpha][K_\beta:K_\alpha]$; on the other hand $[L_\beta:K_\beta][K_\beta:K_\alpha]=[L_\beta:L_\alpha][L_\alpha:K_\alpha]$, since both sides equal $[L_\beta:K_\alpha]$. Hence
\begin{equation*}
[L_\beta:K_\beta L_\alpha]=\frac{[L_\beta:K_\beta]}{[L_\alpha:K_\alpha]}.
\end{equation*}
This yields
\begin{equation*}
N_{L_\beta\mid L_\alpha}(y_{L_\beta})=\frac{[L:K]}{[L_\alpha:K_\alpha]}\,x_{K_\alpha}=y_{L_\alpha}.
\end{equation*}
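As a quick sanity check of the index identity $[L_\beta:K_\beta L_\alpha]=[L_\beta:K_\beta]/[L_\alpha:K_\alpha]$, here is a concrete configuration (an illustration, not taken from the exercise): take $k=\mathbb Q$, $K=K_\beta=\mathbb Q(i)$, $L_\beta=\mathbb Q(i,\sqrt 2)$, $L_\alpha=\mathbb Q(\sqrt 2)$, so that $K_\alpha=L_\alpha\cap K=\mathbb Q$. Then
\begin{equation*}
K_\beta L_\alpha=\mathbb Q(i,\sqrt 2)=L_\beta,\qquad [L_\beta:K_\beta L_\alpha]=1=\frac{[L_\beta:K_\beta]}{[L_\alpha:K_\alpha]}=\frac{2}{2},
\end{equation*}
and the two factorizations of $[L_\beta:K_\alpha]=4$ read $[L_\beta:K_\beta][K_\beta:K_\alpha]=2\cdot 2$ and $[L_\beta:L_\alpha][L_\alpha:K_\alpha]=2\cdot 2$, as in the computation above.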