Proving (non-)invertibility of (nearly) tri-diagonal matrix?


While discussing a transformation for cryptographic keys over on Crypto.SE, I noticed that the transformation can be described by a matrix. I wrote it down and ran some basic online tools against it to check whether the function is a bijection, and found (empirically) the following conjecture, for which I'd like to ask for help with a proof.

Let $\mathbb F_2^n$ be the $n$-dimensional vector space over $\mathbb F_2$, the field with elements $0,1$ under addition and multiplication modulo $2$. Let $b\in\mathbb F_2$ (in the practical scenario, the value of $b$ indicates whether we use a cyclic or a logical shift).

My conjecture is now that the following matrix is invertible iff $b=0$: \begin{pmatrix} 1&0&0&\cdots&0&b\\ 1&1&0&\cdots&0&0\\ 0&1&1&\cdots&0&0\\ \vdots&\vdots&\vdots&\ddots&\vdots&\vdots\\ 0&0&0&\cdots&1&1 \end{pmatrix}

In words: the matrix has ones on the diagonal and on the subdiagonal, the value $b$ in its top-right entry, and zeros everywhere else.

My question: can the above conjecture be (easily?) proven, and if so, how?

I have verified the conjecture empirically for $n=2,3,4$, but I have no idea how to prove it in general.
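For reference, the empirical check extends easily to larger $n$. The following is a small pure-Python sketch (the helper names are my own, not from the original post) that builds the matrix and tests invertibility by Gaussian elimination over $\mathbb F_2$:

```python
def build(n, b):
    """The n x n matrix over F_2 (n >= 2): ones on the diagonal and
    subdiagonal, b in the top-right corner, zeros elsewhere."""
    M = [[0] * n for _ in range(n)]
    for i in range(n):
        M[i][i] = 1
        if i:
            M[i][i - 1] = 1
    M[0][n - 1] = b
    return M

def invertible_mod2(M):
    """Gaussian elimination over F_2; True iff M is invertible."""
    A = [row[:] for row in M]
    n = len(A)
    for col in range(n):
        pivot = next((r for r in range(col, n) if A[r][col]), None)
        if pivot is None:
            return False  # no pivot in this column: rank deficient
        A[col], A[pivot] = A[pivot], A[col]
        for r in range(n):
            if r != col and A[r][col]:
                A[r] = [x ^ y for x, y in zip(A[r], A[col])]
    return True

# Invertible exactly when b = 0, for every size tried.
for n in range(2, 9):
    for b in (0, 1):
        assert invertible_mod2(build(n, b)) == (b == 0)
```

Running this confirms the conjecture for all $n$ up to $8$ in a fraction of a second.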

There are 2 answers below.

Best answer:

Following up on the determinant comment: expanding along the first row (I omit the cofactor signs, since $-1 = 1$ in $\mathbb F_2$) gives

$$\left| \begin{matrix} 1&0&0&\cdots&0&b\\ 1&1&0&\cdots&0&0\\ 0&1&1&\cdots&0&0\\ \vdots&\vdots&\vdots&\ddots&\vdots&\vdots\\ 0&0&0&\cdots&1&1 \end{matrix} \right|= \left| \begin{matrix} 1&0&\cdots&0&0\\ 1&1&\cdots&0&0\\ \vdots&\vdots&\ddots&\vdots&\vdots\\ 0&0&\cdots&1&1 \end{matrix}\right|+b \left|\begin{matrix} 1&1&0&\cdots&0\\ 0&1&1&\cdots&0\\ \vdots&\vdots&\vdots&\ddots&\vdots\\ 0&0&0&\cdots&1 \end{matrix}\right| =1+b$$ since the two matrices on the right are lower and upper triangular, respectively, each with all-ones diagonal and hence determinant $1$. Therefore your matrix has determinant $0$ if and only if $b=1$, as you wanted.
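The $1+b$ formula can be sanity-checked numerically. Below is a sketch using a naive integer cofactor expansion (fine for the small sizes tested; `det` and `build` are my own helper names):

```python
def det(M):
    """Integer determinant via cofactor expansion along the first row."""
    n = len(M)
    if n == 1:
        return M[0][0]
    return sum(
        (-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
        for j in range(n) if M[0][j]
    )

def build(n, b):
    """Ones on the diagonal and subdiagonal, b in the top-right (n >= 2)."""
    M = [[0] * n for _ in range(n)]
    for i in range(n):
        M[i][i] = 1
        if i:
            M[i][i - 1] = 1
    M[0][n - 1] = b
    return M

# The determinant reduced mod 2 equals 1 + b, matching the expansion above.
for n in range(2, 8):
    for b in (0, 1):
        assert det(build(n, b)) % 2 == (1 + b) % 2
```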

Another answer:

When $b=1$: over the rational or real field, the all-ones vector is an eigenvector with eigenvalue $2$. Over the field of two elements, however, the same vector has eigenvalue $2=0$, so the matrix is singular.
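This is easy to verify directly: with $b=1$, every row of the matrix has exactly two ones, so applying it to the all-ones vector yields $2$ in every coordinate. A sketch (helper name `build` is mine):

```python
def build(n, b):
    """Ones on the diagonal and subdiagonal, b in the top-right (n >= 2)."""
    M = [[0] * n for _ in range(n)]
    for i in range(n):
        M[i][i] = 1
        if i:
            M[i][i - 1] = 1
    M[0][n - 1] = b
    return M

n = 6
M = build(n, 1)            # the b = 1 case
v = [1] * n                # all-ones vector
Mv = [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]

assert Mv == [2] * n                     # over Q or R: eigenvalue 2
assert [x % 2 for x in Mv] == [0] * n    # over F_2: eigenvalue 2 = 0
```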

If you subtract off the identity matrix, you get a companion matrix for $x^n - b$. Now that I think of it, $x^n - b$ is also the minimal polynomial of this shifted matrix: its minimal and characteristic polynomials coincide, so it is similar to a single companion block (and for $b=0$ it is a single nilpotent Jordan block).
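One checkable consequence: by Cayley–Hamilton, the shifted matrix $N = M - I$ satisfies $N^n = bI$ over $\mathbb F_2$ (for $b=0$, $N$ is the nilpotent shift; for $b=1$, it is a cyclic shift of order $n$). A sketch with my own helper names:

```python
def shift_matrix(n, b):
    """N = M - I over F_2: ones on the subdiagonal, b in the top-right."""
    N = [[0] * n for _ in range(n)]
    for i in range(1, n):
        N[i][i - 1] = 1
    N[0][n - 1] = b
    return N

def matmul2(A, B):
    """Matrix product over F_2."""
    m = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(m)) % 2
             for j in range(m)] for i in range(m)]

def power2(N, k):
    """k-th matrix power over F_2."""
    m = len(N)
    P = [[int(i == j) for j in range(m)] for i in range(m)]
    for _ in range(k):
        P = matmul2(P, N)
    return P

# Cayley-Hamilton for the companion matrix of x^n - b: N^n = b * I.
for n in range(2, 8):
    for b in (0, 1):
        expected = [[b * int(i == j) for j in range(n)] for i in range(n)]
        assert power2(shift_matrix(n, b), n) == expected
```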

If you stick with the field of two elements: when $b=1$, the polynomial $x^n - 1$ has $x=1$ as a root, so $1$ is an eigenvalue of the shifted matrix. Adding $I$ back in, the original matrix has the eigenvalue $1+1=0$, and is therefore singular.