We are given a matrix $$M=\begin{pmatrix}0&1&1\\1&1&0\\1&1&1\end{pmatrix}$$
- I need to show that $M$ represents multiplication by the element $\beta$ in the field $F = F_2[x]/\left\langle x^3 + x + 1 \right\rangle$, where $\beta(x) = x^2 + x$
- I need to find the inverse of $x^2 + x$ in $F$
- How can I calculate $A^{-1}$ by applying multiplication by $\beta(x)^{-1}$ in $F$?
I tried to solve part 1. by multiplying the matrix by the binary coordinate vector of $x^2 + x$, namely $\begin{pmatrix}0\\1\\1\end{pmatrix}$. As a result I got the polynomial $x \in F$. I'm not sure if that is correct and enough to solve part 1.
Got stuck on part 2. I know that I have to use the extended Euclidean algorithm (EEA), but it looks like I cannot get it to work in this case. What I do is set $f = x^3 + x + 1$ and $g = x^2 + x$. I need to find a polynomial $h$ such that $gh \equiv 1 \pmod f$, or equivalently $gh + kf = 1$ for some $k \in F_2[x]$. The Euclidean algorithm can be used to find $h$ and $k$:
$$\begin{aligned} f &= g \cdot (x + 1) + 1 \\ g &= 1 \cdot (x^2 + x) + 0 \end{aligned}$$
Now if I work backward I get (recall that $-1 = 1$ in $F_2$):
$$1 = f \cdot 1 - g \cdot (x + 1) = f \cdot 1 + g \cdot (x + 1)$$
But $(x + 1)$ is not an inverse of $x^2 + x$. So what am I doing wrong?
Finally, in the last part I know that the inverse of a matrix over a field can be calculated using the formula $M^{-1} = \frac{1}{\det(M)}\operatorname{Adj}(M)$. But first, I don't know how to carry that out over this field. And second, the task hints that it can be done through multiplication by $(x^2 + x)^{-1}$ in the field. I have no idea how to do that... Any help, suggestions and feedback would be welcome!
1) No, multiplying $M$ by that vector is pointless and incorrect here: that vector holds the coordinates of $x^2+x$ itself, so you're just computing the square $(x^2+x)^2$.
I suspect the problem also tells you the basis with respect to which $M$ is multiplication by $x^2+x$. It looks like it's $\{1,x,x^2\}$.
$F$ is a vector space over $F_2$, and it has a basis $\alpha = \{1, x, x^2\}$. Multiplication by $x^2+x$ is a linear transformation; let's call it $T$. The columns of the matrix of $T$ (with respect to $\alpha$) are the coordinates of the vectors you get by applying $T$ to each basis vector. So, for example, the middle basis vector is $x$: you apply $T$ to it, i.e. you multiply it by $x^2+x$, and the result is $x^3+x^2$. Now in $F$ we have $x^3 + x + 1 = 0$, so $x^3 + x^2 = x^2 - x - 1 = x^2 + x + 1$. The coordinates of this vector with respect to the basis $\alpha$ are $\begin{pmatrix}1 & 1 & 1\end{pmatrix}^T$, and that's exactly the middle column of $M$. You repeat this procedure to get the other two columns.
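If you want to sanity-check the column-by-column computation mechanically, here's a small Python sketch. The encoding is my own choice, not part of the problem: a polynomial over $F_2$ is stored as an integer bit mask, with bit $i$ holding the coefficient of $x^i$ (so $x^3+x+1$ is `0b1011`).

```python
MOD = 0b1011  # x^3 + x + 1 (bit i = coefficient of x^i)

def mul(a, b):
    """Multiply two polynomials over F_2, then reduce modulo x^3 + x + 1."""
    p = 0
    while b:
        if b & 1:
            p ^= a          # XOR = addition of coefficients in F_2
        a <<= 1
        b >>= 1
    while p.bit_length() >= 4:              # reduce using x^3 = x + 1
        p ^= MOD << (p.bit_length() - 4)
    return p

beta = 0b110  # x^2 + x
for j, name in enumerate(["1", "x", "x^2"]):
    image = mul(beta, 1 << j)               # T applied to basis vector x^j
    print(name, "->", [image >> i & 1 for i in range(3)])  # column j of M
```

The three printed coordinate vectors, $(0,1,1)^T$, $(1,1,1)^T$ and $(1,0,1)^T$, are exactly the three columns of $M$.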
The problem is a routine calculation of the matrix of a linear transformation with respect to a basis.
2) $1+x$ is definitely the inverse of $x^2+x$: since $x^3 = x + 1$ in $F$,
$$ (1+x)(x^2+x) = x^2 + x + x^3 + x^2 = x^3 + x = (x+1) + x = 1.$$
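For what it's worth, the EEA you ran by hand can also be run mechanically. A Python sketch (the bit-mask encoding is my own assumption: bit $i$ of an integer holds the coefficient of $x^i$, so $x^3+x+1$ is `0b1011` and $x^2+x$ is `0b110`):

```python
def poly_divmod(a, b):
    """Divide a by b over F_2 (bit-mask encoding); return (quotient, remainder)."""
    q = 0
    while a and a.bit_length() >= b.bit_length():
        s = a.bit_length() - b.bit_length()
        q ^= 1 << s
        a ^= b << s
    return q, a

def clmul(a, b):
    """Carry-less multiplication: polynomial product over F_2, no reduction."""
    p = 0
    while b:
        if b & 1:
            p ^= a
        a <<= 1
        b >>= 1
    return p

def inv_mod(g, f):
    """Extended Euclidean algorithm: find h with g*h = 1 (mod f) over F_2."""
    r0, t0, r1, t1 = f, 0, g, 1
    while r1:
        q, r = poly_divmod(r0, r1)
        r0, t0, r1, t1 = r1, t1, r, t0 ^ clmul(q, t1)
    assert r0 == 1, "g and f are not coprime"
    return t0

print(inv_mod(0b110, 0b1011))  # 3, i.e. x + 1
```

The single division step `poly_divmod(0b1011, 0b110)` returns quotient $x+1$ and remainder $1$, matching the hand computation $f = g\cdot(x+1) + 1$, and the back-substitution yields $h = x + 1$.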
3) What is $A$? Do you mean $M$? You know the transformation $T$ that was multiplication by $x^2+x$ has an inverse: multiplication by $1+x$. So the inverse of the matrix of $T$, i.e. $M^{-1}$, is the matrix of multiplication by $1+x$ (with respect to the basis $\alpha$). Now you do exactly what you did to calculate $M$ in part 1), except for $1+x$ instead of $x^2+x$.
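That procedure can be sketched in Python as well (same assumed bit-mask encoding, bit $i$ = coefficient of $x^i$): build the matrix of multiplication by $1+x$ column by column, exactly as in part 1), and check that it really is the inverse of $M$ over $F_2$.

```python
MOD = 0b1011  # x^3 + x + 1 (bit i = coefficient of x^i)

def mul(a, b):
    """Multiply in F = F_2[x]/(x^3 + x + 1)."""
    p = 0
    while b:
        if b & 1:
            p ^= a
        a <<= 1
        b >>= 1
    while p.bit_length() >= 4:              # reduce using x^3 = x + 1
        p ^= MOD << (p.bit_length() - 4)
    return p

def matrix_of(beta):
    """Matrix of multiplication by beta w.r.t. the basis {1, x, x^2}."""
    cols = [mul(beta, 1 << j) for j in range(3)]
    return [[cols[j] >> i & 1 for j in range(3)] for i in range(3)]

M     = matrix_of(0b110)  # x^2 + x
M_inv = matrix_of(0b011)  # 1 + x
# check M * M_inv = I, entries computed mod 2
prod = [[sum(M[i][k] & M_inv[k][j] for k in range(3)) % 2
         for j in range(3)] for i in range(3)]
print(M_inv)  # [[1, 0, 1], [1, 1, 1], [0, 1, 1]]
print(prod)   # [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```

So $M^{-1} = \begin{pmatrix}1&0&1\\1&1&1\\0&1&1\end{pmatrix}$, with no determinant or adjugate needed.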