Being an electrical engineer, I have mostly worked in continuous domains, with $\mathbb R$ or $\mathbb C$ as the field of scalars for functions and matrices. This question concerns operations over $\mathbb Z_2$.
Consider the matrix
$$A = \begin{bmatrix}1&1&1\\1&1&0\\0&1&1\end{bmatrix} \text{ it has inverse } A^{-1} = \begin{bmatrix}1&0&-1\\-1&1&1\\1&-1&0\end{bmatrix}$$
or over $\mathbb Z_2$:
$$A^{-1} = \begin{bmatrix}1&0&1\\1&1&1\\1&1&0\end{bmatrix}$$
If we multiply these together using ordinary matrix multiplication:
$$AA^{-1} = \begin{bmatrix}3&2&2\\2&1&2\\2&2&1\end{bmatrix} = \begin{bmatrix}1&0&0\\0&1&0\\0&0&1\end{bmatrix}(\text{mod } 2) = I$$
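This check is easy to reproduce numerically; a minimal sketch (using NumPy, which the question does not mention, purely for illustration):

```python
import numpy as np

# The matrix A and its inverse reduced entry-wise mod 2
A = np.array([[1, 1, 1],
              [1, 1, 0],
              [0, 1, 1]])
A_inv_mod2 = np.array([[1, 0, 1],
                       [1, 1, 1],
                       [1, 1, 0]])

# Ordinary integer matrix product, then reduce each entry mod 2
product = (A @ A_inv_mod2) % 2
print(product)  # prints the 3x3 identity matrix
```

Reducing the product mod 2 entry by entry gives the identity, matching the hand computation above.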
Which seems to work.
So if I have a three-bit number $x_1x_2x_3$, stuff it into a vector, and then, letting $\oplus$ denote addition modulo 2 (or, in the language of logic and electronics, "xor"), compute $$\cases{{\hat x_1}=x_1\oplus x_2 \oplus x_3\\{\hat x_2} = x_1 \oplus x_2\\{\hat x_3} = x_2\oplus x_3}$$
followed by
$$\cases{{\tilde x_1}={\hat x_1} \oplus {\hat x_3}\\{\tilde x_2} = {\hat x_1} \oplus {\hat x_2}\oplus {\hat x_3}\\{\tilde x_3} = {\hat x_1}\oplus {\hat x_2}}$$
then we can rest assured, thanks to our algebra, that we will have
$$\cases{{\tilde x_1} = x_1\\{\tilde x_2} = x_2\\ {\tilde x_3} = x_3}$$
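The two XOR systems above can be sketched directly and the round trip verified exhaustively over all eight 3-bit inputs (the function names `encode`/`decode` are my own, not from the original):

```python
from itertools import product

def encode(x1, x2, x3):
    # hat-x = A x over Z_2, written with XOR
    return (x1 ^ x2 ^ x3, x1 ^ x2, x2 ^ x3)

def decode(h1, h2, h3):
    # tilde-x = A^{-1} hat-x over Z_2, using the mod-2 inverse
    return (h1 ^ h3, h1 ^ h2 ^ h3, h1 ^ h2)

# Check that decode(encode(x)) == x for every 3-bit input
for bits in product((0, 1), repeat=3):
    assert decode(*encode(*bits)) == bits
print("all 8 inputs recovered")
```

Since both maps are linear over $\mathbb Z_2$ and the eight inputs exhaust the domain, this confirms $\tilde x = x$ for this particular $A$.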
Is this interpretation correct? Will this approach work in general, i.e. can I always obtain the inverse over $\mathbb Z_2$ by computing the inverse over a larger field (as above) and then reducing it entry-wise mod 2?