In the following proof I'm finding it hard to understand why $\pi(x)$ is a root of $p(x)$. When $p$ is evaluated at $\pi(x)$, it gives the ZERO COSET and not zero itself. So how is this a root?
There are similar questions on this site but none of them explain this exact part.
Let $F$ be a field and $p(x)$ an irreducible polynomial in $F[x]$. Then we can find a field extension $L$ of $F$ such that $p(x)$ has a root in $L$.
Proof: Since $p(x)$ is irreducible, the ideal $\langle p(x)\rangle$ generated by $p(x)$ is maximal in $F[x]$. So $F[x]/\langle p(x)\rangle$ is a field; let us denote it by $L$. Then $L$ is a field extension of $F$, since we have $F \hookrightarrow F[x] \xrightarrow{\ \pi\ } L$ in a natural way, and the composite is injective ($F$ is a field, so any nonzero ring homomorphism out of it is injective).
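(A concrete instance, not part of the proof but perhaps useful: take $F=\mathbb{R}$ and $p(x)=x^2+1$, which is irreducible over $\mathbb{R}$. Then
$$L = \mathbb{R}[x]/\langle x^2+1\rangle \cong \mathbb{C},$$
and $\pi(x) = x + \langle x^2+1\rangle$ plays the role of $i$.)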
Writing $p(x)=\sum_{i=0}^n a_i x^i$:
$$p\bigl(\pi(x)\bigr) = p\bigl(x + \langle p(x)\rangle\bigr) = \sum_{i=0}^n a_i\bigl(x + \langle p(x)\rangle\bigr)^i = \sum_{i=0}^n a_i x^i + \langle p(x)\rangle = p(x) + \langle p(x)\rangle = 0 + \langle p(x)\rangle$$
Since $\pi$ is a ring homomorphism from $F[x]$ to $L$, we have $$p\bigl(\pi(x)\bigr)=\pi\bigl(p(x)\bigr)=p(x)+\langle\,p(x)\,\rangle=\langle\,p(x)\,\rangle. $$
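To make the step concrete in the example above (my own worked instance, not from the proof): with $p(x)=x^2+1$ over $\mathbb{R}$,
$$p\bigl(\pi(x)\bigr) = \bigl(x+\langle x^2+1\rangle\bigr)^2 + 1 = x^2+1+\langle x^2+1\rangle = \langle x^2+1\rangle,$$
which is exactly the zero coset of $L$, not the element $0 \in F$. That is the step I don't follow.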