Is there a general way to get the QR decomposition of a companion matrix?


Is there a general way to get the QR decomposition of a companion matrix? Is it considered a sparse matrix? Is shifting always required in this case?


Here's a 2007 paper by Chandrasekaran et al. on a QR algorithm tailored to companion matrices:

A Fast QR Algorithm for Companion Matrices

The discussion there concerns repeated QR iterations converging to a Schur form; each individual QR decomposition is carried out using what the authors call "sequentially semi-separable" (SSS) representations.


There are several notions of "companion matrix", each realizing the property of having a specified characteristic polynomial. The version referred to in the comments (see the Wikipedia article) is the Frobenius companion matrix:

$$ C = \begin{bmatrix} 0 & 0 & \ldots & 0 & -c_0 \\ 1 & 0 & \ldots & 0 & -c_1 \\ 0 & 1 & \ldots & 0 & -c_2 \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & \ldots & 1 & -c_{n-1} \end{bmatrix} $$

having characteristic polynomial $p(t) = t^n + c_{n-1} t^{n-1} + \ldots + c_1 t + c_0$.

With this convention it is trivial to express $C$, which is upper Hessenberg, as the product $QR$ of an orthogonal matrix $Q$ and an upper triangular matrix $R$:

$$ C = \begin{bmatrix} 0 & 0 & \ldots & 0 & 1 \\ 1 & 0 & \ldots & 0 & 0 \\ 0 & 1 & \ldots & 0 & 0 \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & \ldots & 1 & 0 \end{bmatrix} \; \begin{bmatrix} 1 & 0 & \ldots & 0 & -c_1 \\ 0 & 1 & \ldots & 0 & -c_2 \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & \ldots & 1 & -c_{n-1} \\ 0 & 0 & \ldots & 0 & -c_0 \end{bmatrix} $$
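As a sanity check, here is a small NumPy sketch (with arbitrary example coefficients) that builds $C$, the cyclic-permutation $Q$, and the triangular $R$ above, and verifies $C = QR$:

```python
import numpy as np

# Example coefficients (c0, c1, c2) of p(t) = t^3 + c2 t^2 + c1 t + c0,
# chosen arbitrarily for illustration.
c = np.array([2.0, -3.0, 5.0])
n = len(c)

# Frobenius companion matrix: 1s on the subdiagonal, last column -c.
C = np.zeros((n, n))
C[1:, :-1] = np.eye(n - 1)
C[:, -1] = -c

# Q: cyclic permutation -- row 1 is e_n, row i is e_{i-1} for i >= 2.
Q = np.roll(np.eye(n), 1, axis=0)

# R: identity with last column replaced by (-c1, ..., -c_{n-1}, -c0).
R = np.eye(n)
R[:-1, -1] = -c[1:]
R[-1, -1] = -c[0]

assert np.allclose(Q @ R, C)              # C = QR
assert np.allclose(Q.T @ Q, np.eye(n))    # Q is orthogonal
assert np.allclose(R, np.triu(R))         # R is upper triangular
```

Note that $R$ here is permitted to have a negative entry $-c_0$ on its diagonal; QR decompositions do not require positive diagonal entries in $R$.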

This factorization by itself may not seem useful, but the $QR$ decomposition of a general matrix can be used to solve systems of equations "by elimination," in much the way the row-echelon form of an augmented matrix is used. The basic idea is to replace elementary row operations with orthogonal transformations (typically Givens rotations or Householder reflections acting on the rows of the matrix).
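A minimal NumPy sketch of that idea (for a generic matrix, not specifically a companion matrix): factor $A = QR$, then solve $Rx = Q^T b$, which only requires back-substitution since $R$ is triangular:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))   # a generic (almost surely invertible) matrix
b = rng.standard_normal(4)

# NumPy's qr uses Householder reflections internally.
Q, R = np.linalg.qr(A)

# Rx = Q^T b; np.linalg.solve works here, though a dedicated triangular
# solver (e.g. scipy.linalg.solve_triangular) would exploit the structure.
x = np.linalg.solve(R, Q.T @ b)

assert np.allclose(A @ x, b)
```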

A less trivial $QR$ factorization would address a different form of companion matrix (also upper Hessenberg) used in the paper linked at top:

$$ PC^TP^T = \begin{bmatrix} -c_{n-1} & -c_{n-2} & \ldots & -c_1 & -c_0 \\ 1 & 0 & \ldots & 0 & 0 \\ 0 & 1 & \ldots & 0 & 0 \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & \ldots & 1 & 0 \end{bmatrix} $$

where $P$ is an orthogonal matrix that reverses the ordering of rows:

$$ P = \begin{bmatrix} 0 & 0 & \ldots & 0 & 1 \\ 0 & 0 & \ldots & 1 & 0 \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 1 & \ldots & 0 & 0 \\ 1 & 0 & \ldots & 0 & 0 \end{bmatrix} $$
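Again as a sketch (with arbitrary example coefficients), one can check numerically that $PC^TP^T$ has the stated form: first row $(-c_{n-1}, \ldots, -c_0)$, 1s on the subdiagonal, zeros elsewhere:

```python
import numpy as np

c = np.array([2.0, -3.0, 5.0])   # (c0, c1, c2), arbitrary example coefficients
n = len(c)

# Frobenius companion matrix, as above.
C = np.zeros((n, n))
C[1:, :-1] = np.eye(n - 1)
C[:, -1] = -c

# P: the row-reversal permutation (orthogonal, and its own inverse).
P = np.eye(n)[::-1]

B = P @ C.T @ P.T

assert np.allclose(B[0], -c[::-1])          # first row: -c_{n-1}, ..., -c_0
assert np.allclose(B[1:, :-1], np.eye(n - 1))  # 1s on the subdiagonal
assert np.allclose(B[1:, -1], 0)            # zeros in the rest of the last column
```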