Back substitution and QR Factorisation via Householder


I'm having difficulties getting the same beta values from this OLS regression as I would without the QR decomposition. I believe it has to do with the shape of my Q block. I start as so:

$$A = \begin{pmatrix} 1 & 1 & 1 \\ 1 & 2 & 8 \\ 1 & 3 & 37 \\ 1 & 4 & 64 \\ 1 & 5 & 125 \\ \end{pmatrix}$$

$$b = \begin{pmatrix}1\\2\\2\\2\\3\end{pmatrix}$$

This gives me an R of:

$$R = \begin{pmatrix}-2.236&-6.708&-105.095\\0&3.16228&96.1332\\0&0&29.4686\\0&0&0\\0&0&0\end{pmatrix}$$

And a Q of:

$$Q = \begin{pmatrix}-0.447214&-0.632456&0.50223&0.112111&-0.36769166\\-0.447214&-0.316228&-0.29184&0.213736&0.75441948\\-0.447214&-1.39\times10^{-17}&-0.33934&-0.81059&-0.16672608\\-0.447214&0.3162278&-0.45472&0.531518&-0.45903965\\-0.447214&0.6324555&0.58367&-0.04678&0.2390379\end{pmatrix}$$
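For reference, a full (square-$Q$) Householder factorisation along these lines can be sketched as follows. This is an assumption about the process used, not the original code; numpy is used only for the array arithmetic:

```python
import numpy as np

def householder_qr(A):
    """Full QR factorisation A = QR via Householder reflections.

    Returns Q (m x m, orthogonal) and R (m x n, upper triangular).
    """
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    Q = np.eye(m)
    R = A.copy()
    for k in range(min(m, n)):
        x = R[k:, k]
        # Reflect x onto alpha * e1; the sign choice avoids cancellation.
        alpha = -np.copysign(np.linalg.norm(x), x[0])
        v = x.copy()
        v[0] -= alpha
        norm_v = np.linalg.norm(v)
        if norm_v == 0.0:  # column is already zero below the diagonal
            continue
        v /= norm_v
        # Apply the reflector H_k = I - 2 v v^T to the trailing rows of R
        # and accumulate it into Q from the right.
        R[k:, :] -= 2.0 * np.outer(v, v @ R[k:, :])
        Q[:, k:] -= 2.0 * np.outer(Q[:, k:] @ v, v)
    return Q, R

A = np.array([[1, 1, 1], [1, 2, 8], [1, 3, 37], [1, 4, 64], [1, 5, 125]], float)
Q, R = householder_qr(A)
```

With this sign convention, $R_{11} = -\sqrt{5} \approx -2.236$, matching the first entry of the $R$ shown above.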

It is my understanding that to get the beta values ($x$) you need to use back substitution. The thing is, with a square $Q$, the last two entries of $Q^{T}b$ go unused.

$$Rx = Q^{T}b$$

$$\begin{pmatrix}-2.236&-6.708&-105.095\\0&3.16228&96.1332\\0&0&29.4686\\0&0&0\\0&0&0\end{pmatrix}\begin{pmatrix}x_1\\x_2\\x_3\end{pmatrix} = \begin{pmatrix}-4.472136\\1.26491106\\0.08144254\\-0.158891\\0.60672956\end{pmatrix}$$
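Since the bottom two rows of $R$ are zero, only the top $3 \times 3$ block of $R$ and the first three entries of $Q^{T}b$ enter the back substitution. A sketch of that solve, assuming numpy's reduced-mode QR (which returns the thin $5 \times 3$ $Q$ directly):

```python
import numpy as np

def back_substitution(R, y):
    """Solve R x = y for upper-triangular R by back substitution."""
    n = len(y)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (y[i] - R[i, i + 1:] @ x[i + 1:]) / R[i, i]
    return x

A = np.array([[1, 1, 1], [1, 2, 8], [1, 3, 37], [1, 4, 64], [1, 5, 125]], float)
b = np.array([1, 2, 2, 2, 3], float)

# Reduced (economy) QR: Q is 5x3, R is the 3x3 upper-triangular block,
# so Q.T @ b is exactly the three entries the back substitution needs.
Q, R = np.linalg.qr(A, mode="reduced")
beta = back_substitution(R, Q.T @ b)
```

For a full-rank $A$, the $\beta$ obtained this way is the least-squares solution of $Ax = b$.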

From my back substitution I am getting betas of

$$\begin{pmatrix}0.9222\\0.3160\\0.0028\end{pmatrix}$$

When I should be getting:

$$\begin{pmatrix}0.8966\\0.3365\\0.0021\end{pmatrix}$$

If I use the standard OLS normal equations I get the correct betas without a problem, so it must be something in my factorisation?
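For comparison, the standard normal-equations OLS solve can be sketched as below, using the $A$ and $b$ from above. For a full-rank $A$, this must agree with the QR least-squares solution, so any disagreement points to the data or the solve rather than the method:

```python
import numpy as np

A = np.array([[1, 1, 1], [1, 2, 8], [1, 3, 37], [1, 4, 64], [1, 5, 125]], float)
b = np.array([1, 2, 2, 2, 3], float)

# Normal equations: solve (A^T A) beta = A^T b.
beta = np.linalg.solve(A.T @ A, A.T @ b)
```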

I'd greatly appreciate any help. I think the problem is the square shape of my Q matrix, but that is what following the guides below produces.

I based my process on this stackexchange page: How can I compute solution of a non-square matrix by QR Decomposition and Cholesky Factorization

And this Lecture: http://www.math.usm.edu/lambers/mat610/sum10/lecture9.pdf