QR decomposition in linear regression


I have an exercise where I am to use QR decomposition on a linear regression problem. As I understand it, the matrix A needs to be square for the decomposition to work. In linear regression this is not always the case, so to work around this I am tasked to apply the QR decomposition to B = Transpose(A) * A instead of A. I have been working in R and found this example.

I don't understand how using B instead of A affects the computation of the QR decomposition A = QR, or how the resulting factors are used to solve the regression.
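To make the question concrete, here is a small sketch of the two routes (written in Python/NumPy rather than R for illustration; the data and variable names are made up). The first computes the reduced QR of the rectangular A directly; the second computes the QR of the square matrix B = Transpose(A) * A and solves the normal equations. Both give the same least-squares coefficients:

```python
import numpy as np

# Made-up regression problem: A is tall (20 x 3), not square.
rng = np.random.default_rng(0)
A = rng.normal(size=(20, 3))   # 20 observations, 3 predictors
y = rng.normal(size=20)

# Route 1: reduced ("thin") QR of the rectangular A itself.
# A = Q R with Q (20 x 3) having orthonormal columns, R (3 x 3) upper triangular,
# so the least-squares solution is beta = R^{-1} Q^T y.
Q, R = np.linalg.qr(A)
beta_qr = np.linalg.solve(R, Q.T @ y)

# Route 2: QR of the square matrix B = A^T A (the normal equations
# B beta = A^T y), which is what the exercise asks for.
B = A.T @ A
Qb, Rb = np.linalg.qr(B)
beta_normal = np.linalg.solve(Rb, Qb.T @ (A.T @ y))

print(np.allclose(beta_qr, beta_normal))
```

Note that `np.linalg.qr` (like `qr()` in R) handles rectangular matrices fine via the reduced QR, so route 1 works without forming B; route 2 is usually discouraged in practice because forming Transpose(A) * A squares the condition number of the problem.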

Regards