Determinant of a special $4\times 4$ matrix


Let $f(x)=\sum_{k=1}^{4}a_{k}x^{k},\varepsilon =\cos\frac{\pi}{2}+i\sin\frac{\pi}{2}.$

$\qquad\qquad 4\times 4$ matrix $$T=\begin{bmatrix} 1& a_{2}& a_{3}& a_{4}\\ 1& a_{1}& a_{2}& a_{3}\\ 1& a_{4}& a_{1}& a_{2}\\ 0& \varepsilon^{2}& \varepsilon& 1\end{bmatrix}$$

Show that $$\det(T)=f(\varepsilon^{2})f(\varepsilon^{3})$$


Further, I can generalize this question:

Let $f(x)=\sum_{k=1}^{n}a_{k}x^{k},\varepsilon =\cos\frac{2\pi}{n}+i\sin\frac{2\pi}{n}.$

$\qquad\qquad n\times n$ matrix $$T=\begin{bmatrix} 1& a_{2}& a_{3} & \cdots & a_{n}\\ 1& a_{1}& a_{2}&\cdots & a_{n-1}\\ \vdots& \vdots& \vdots& \ddots & \vdots\\ 1& a_{4}& a_{5}& \cdots &a_{2}\\ 0& \varepsilon^{n-2}& \varepsilon^{n-3}& \cdots& 1\end{bmatrix}$$

Show that $$\det(T)=f(\varepsilon^{2})f(\varepsilon^{3}) \cdots f(\varepsilon^{n-1})$$
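The claim can be sanity-checked numerically before attempting a proof. Below is a short NumPy sketch; the helpers `f` and `build_T` are my own names, and the layout of `T` is my reading of the pattern above. To stay agnostic about possible unimodular factors between the two sides, it compares absolute values only:

```python
import numpy as np

def f(a, x):
    """f(x) = a_1 x + a_2 x^2 + ... + a_n x^n, with a[k] holding a_{k+1}."""
    return sum(a[k] * x ** (k + 1) for k in range(len(a)))

def build_T(a, eps):
    """Assemble the n x n matrix T following the pattern in the question."""
    n = len(a)
    T = np.zeros((n, n), dtype=complex)
    for j in range(n - 1):               # rows 1..n-1: a leading 1, then a cyclic shift
        T[j, 0] = 1
        for c in range(1, n):
            T[j, c] = a[(c - j) % n]     # row j+1, column c+1 holds a_{((c-j) mod n)+1}
    T[n - 1, 1:] = eps ** (n - 1 - np.arange(1, n))   # last row: 0, eps^{n-2}, ..., 1
    return T

# compare |det T| with |f(eps^2) ... f(eps^{n-1})| for random real coefficients
for n in range(3, 9):
    rng = np.random.default_rng(n)
    a = rng.standard_normal(n)
    eps = np.exp(2j * np.pi / n)
    lhs = np.linalg.det(build_T(a, eps))
    rhs = np.prod([f(a, eps ** j) for j in range(2, n)])
    assert np.isclose(abs(lhs), abs(rhs))
```

The moduli agree, so the two sides can differ at most by a factor of modulus one.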


Let $$A=\begin{bmatrix} 1& 0& a_{3}& a_{4}& a_{1}\\ 0& 1& a_{2}& a_{3}& a_{4}\\ 0& 1& a_{1}& a_{2}& a_{3}\\ 0& 1& a_{4}& a_{1}& a_{2}\\ 0& 0& \varepsilon^{2}& \varepsilon& 1\end{bmatrix};$$ then $\det(T)=\det(A)$. Now add $\varepsilon^{2}$ times row 4, $\varepsilon^{4}$ times row 3, and $\varepsilon^{6}$ times row 2 to row 1, which gives $$A\longrightarrow \begin{bmatrix} 1& 0& \varepsilon^{4}f(\varepsilon^{2})& \varepsilon^{2}f(\varepsilon^{2})& f(\varepsilon^{2})\\ 0& 1& a_{2}& a_{3}& a_{4}\\ 0& 1& a_{1}& a_{2}& a_{3}\\ 0& 1& a_{4}& a_{1}& a_{2}\\ 0& 0& \varepsilon^{2}& \varepsilon& 1\end{bmatrix}=A_{1}$$

How can I separate $f(\varepsilon^{2})$ from $\det(A_{1})$?

If you have another proof of this result, please give me some hints. Any help would be appreciated.

1 Answer

As we're looking for the determinant, we may as well work with the transpose

$$ T^T=\begin{bmatrix} 1& 1& 1& \cdots& 1& 0\\ a_{2}& a_{1}& a_{n}& \cdots& a_{4}& \varepsilon^{n-2}\\ a_{3}& a_{2}& a_{1}& \cdots& a_{5}& \varepsilon^{n-3}\\ \vdots& \vdots& \vdots& \ddots& \vdots& \vdots\\ a_{n}& a_{n-1}& a_{n-2}& \cdots& a_{2}& 1 \end{bmatrix} $$

Change the last column of $T^T$ to define $C$ as follows

$$ C=\begin{bmatrix} 1& 1& 1& \cdots& 1& 1\\ a_{2}& a_{1}& a_{n}& \cdots& a_{4}& a_{3}\\ a_{3}& a_{2}& a_{1}& \cdots& a_{5}& a_{4}\\ \vdots& \vdots& \vdots& \ddots& \vdots& \vdots\\ a_{n}& a_{n-1}& a_{n-2}& \cdots& a_{2}& a_{1} \end{bmatrix} $$

Now take a look at the system of linear equations in $x_j$

$$ Cx=\begin{bmatrix} 1& 1& 1& \cdots& 1& 1\\ a_{2}& a_{1}& a_{n}& \cdots& a_{4}& a_{3}\\ a_{3}& a_{2}& a_{1}& \cdots& a_{5}& a_{4}\\ \vdots& \vdots& \vdots& \ddots& \vdots& \vdots\\ a_{n}& a_{n-1}& a_{n-2}& \cdots& a_{2}& a_{1} \end{bmatrix} \begin{bmatrix} x_1\\ x_2\\ x_3\\ \vdots\\ x_n \end{bmatrix}= \begin{bmatrix} 0\\ \varepsilon^{n-2}\\ \varepsilon^{n-3}\\ \vdots\\ 1 \end{bmatrix}:=b $$

By Cramer's rule $$ \color{red}{x_n\det(C)=\det(T^T)=\det(T)} $$
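This step can be illustrated numerically. The sketch below (all helper names and the choice $n=5$ are mine) builds $T$, $C$, and $b$ from the definitions above, solves $Cx=b$, and confirms the highlighted relation:

```python
import numpy as np

# Numerical illustration of the Cramer's-rule step, for n = 5
n = 5
rng = np.random.default_rng(0)
a = rng.standard_normal(n)                 # a[k] plays the role of a_{k+1}
eps = np.exp(2j * np.pi / n)

# T as in the question: rows 1..n-1 are (1, cyclic shift of a), last row powers of eps
T = np.zeros((n, n), dtype=complex)
for j in range(n - 1):
    T[j, 0] = 1
    for c in range(1, n):
        T[j, c] = a[(c - j) % n]
T[n - 1, 1:] = eps ** (n - 1 - np.arange(1, n))

# C: transpose of T with its last column replaced by (1, a_3, a_4, ..., a_1)
C = T.T.copy()
C[0, n - 1] = 1
for k in range(1, n):
    C[k, n - 1] = a[(k + 1) % n]

b = T[n - 1, :].copy()                     # last column of T^T

# Cramer's rule: x_n * det(C) = det(C with last column b) = det(T^T) = det(T)
x = np.linalg.solve(C, b)
assert np.isclose(x[-1] * np.linalg.det(C), np.linalg.det(T))
```

Replacing the last column of $C$ by $b$ recovers $T^T$, which is exactly why Cramer's rule singles out $x_n$.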

For $y_j=\varepsilon^{-j}$ we have $(Cy)_1=0$, and for $k>1$
$$
(Cy)_k=(a_k\varepsilon^{-1}+a_{k-1}\varepsilon^{-2}+\cdots+a_1\varepsilon^{-k})+(a_n\varepsilon^{-k-1}+a_{n-1}\varepsilon^{-k-2}+\cdots+a_{k+1}\varepsilon^{-n})\\
=\varepsilon^{-k-1}f(\varepsilon)=\varepsilon^{n-k}{f(\varepsilon)\over\varepsilon}=b_k{f(\varepsilon)\over\varepsilon}\\
\therefore C\left({\varepsilon\over f(\varepsilon)}y\right)=b
$$
So we can choose (if $C$ is invertible this is the only choice)
$$
\color{red}{x_n={\varepsilon\over f(\varepsilon)}}
$$

Now the row operation $R_1\to\left(\sum_{k=1}^na_k\right)R_1-\sum_{k=2}^nR_k$, which multiplies the determinant by $f(1)=\sum_{k=1}^na_k$, turns $C$ into
$$
C'=\begin{bmatrix} a_{1}& a_{n}& a_{n-1}& \cdots& a_{3}& a_{2}\\ a_{2}& a_{1}& a_{n}& \cdots& a_{4}& a_{3}\\ a_{3}& a_{2}& a_{1}& \cdots& a_{5}& a_{4}\\ \vdots& \vdots& \vdots& \ddots& \vdots& \vdots\\ a_{n}& a_{n-1}& a_{n-2}& \cdots& a_{2}& a_{1} \end{bmatrix},\qquad \det(C')=f(1)\det(C)
$$
The matrix $C'$ is the circulant matrix whose associated polynomial is $f(x)/x$, and therefore its determinant is
$$
\det(C')=\prod_{j=1}^nf(\varepsilon^j)\varepsilon^{-j}=\begin{cases}\prod_{j=1}^nf(\varepsilon^j) & n\text{ odd}\\ \varepsilon^{-{n\over2}}\prod_{j=1}^nf(\varepsilon^j) & n\text{ even} \end{cases}
$$
So, assuming $f(1),f(\varepsilon)\neq0$ and using $f(\varepsilon^n)=f(1)$ to cancel those two factors,
$$
\color{blue}{\det(T)=x_n\det(C)=\begin{cases} \varepsilon\prod_{j=2}^{n-1}f(\varepsilon^j) & n\text{ odd}\\ \varepsilon^{1-{n\over2}}\prod_{j=2}^{n-1}f(\varepsilon^j) & n\text{ even} \end{cases}}
$$

Edit: As the original problem asks to prove a different result, I checked $\det(T)$ numerically for $2\le n\le10$, and the formula above is the one that works. The easiest way to see that the $f(\varepsilon^j)\varepsilon^{-j}$ are the eigenvalues of $C'$ is to write $C'=g(P)$, where $g(x)=f(x)/x$ and $P$ is the permutation matrix
$$
P_{ij}=\delta_{i,\,(j+1\bmod n)},
$$
whose eigenvalues are the $n$-th roots of unity. This argument works for any circulant matrix.
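The numerical check mentioned in the edit can be reproduced along the following lines (a sketch: `f`, `build_T`, the loop bounds, and the layout of `T` are my choices, based on my reading of the question):

```python
import numpy as np

def f(a, x):
    """f(x) = a_1 x + ... + a_n x^n, with a[k] holding a_{k+1}."""
    return sum(a[k] * x ** (k + 1) for k in range(len(a)))

def build_T(a, eps):
    """Assemble the n x n matrix T following the pattern in the question."""
    n = len(a)
    T = np.zeros((n, n), dtype=complex)
    for j in range(n - 1):
        T[j, 0] = 1
        for c in range(1, n):
            T[j, c] = a[(c - j) % n]
    T[n - 1, 1:] = eps ** (n - 1 - np.arange(1, n))
    return T

for n in range(3, 9):
    rng = np.random.default_rng(n)
    a = rng.standard_normal(n)
    eps = np.exp(2j * np.pi / n)

    # closed form: det T = eps * prod (n odd), eps^{1 - n/2} * prod (n even)
    factor = eps if n % 2 else eps ** (1 - n // 2)
    rhs = factor * np.prod([f(a, eps ** j) for j in range(2, n)])
    assert np.isclose(np.linalg.det(build_T(a, eps)), rhs)

    # eigenvalue claim: the circulant equals g(P), g(x) = f(x)/x,
    # P the cyclic permutation matrix, so its determinant is prod f(eps^j)/eps^j
    P = np.roll(np.eye(n), 1, axis=0)      # P[i, j] = 1 iff i = j+1 (mod n)
    gP = sum(a[k] * np.linalg.matrix_power(P, k) for k in range(n))
    assert np.isclose(np.linalg.det(gP),
                      np.prod([f(a, eps ** j) / eps ** j for j in range(1, n + 1)]))
```

Both assertions pass for $3\le n\le8$ with random coefficients, matching the claim that the closed form above (rather than the one in the problem statement) is the correct one.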