In QR and LU factorizations what would the results be with transposed inputs?


I really wish column-major matrix order had never been invented. It very quickly stops making sense beyond two dimensions, and now I have to deal with it when interfacing with the cuSolver functions.

The NVIDIA CUDA cuBLAS and cuSolver library functions all accept matrices in column-major order. Up to now I have been able to adapt them for row-major matrices by rearranging the arguments I pass in, but with the factorization functions I am out of luck and will probably have to do an explicit transpose somewhere.
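To illustrate why the argument-swapping trick works up to this point: a row-major buffer, reinterpreted as column-major, holds the transpose of the matrix. A minimal NumPy sketch (not cuBLAS itself, just the storage-order arithmetic it relies on):

```python
import numpy as np

# A 2x3 matrix stored row-major; the same buffer read column-major
# is the 3x2 transpose.
A = np.arange(6.0).reshape(2, 3)         # row-major (C order)
buf = A.ravel(order="C")                 # the raw storage
A_cm = buf.reshape(3, 2, order="F")      # reinterpret as column-major
assert np.array_equal(A_cm, A.T)

# This is why a column-major GEMM can be fed row-major data by
# swapping the operands: computing C = A @ B on row-major buffers
# is the same as computing C^T = B^T @ A^T on column-major views
# of those buffers.
B = np.arange(12.0).reshape(3, 4)
C = A @ B
C_cm = C.ravel(order="C").reshape(4, 2, order="F")
assert np.allclose(C_cm, B.T @ A.T)
```

For a product, swapping the operands undoes the implicit transpose, which is why no data movement is needed; a factorization has no second operand to swap, which is where this trick breaks down.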

I am not at all familiar with how LU and QR decomposition work, so I am not sure whether I only need to transpose the input or whether I will also need to transpose the output.

Accepted answer:

For any two matrices,

$$(AB)^T=B^TA^T$$

so if you transpose the input matrix, you get back the factors transposed and in reverse order.

This works for $LU$: $U^T$ is lower triangular and $L^T$ is upper triangular, so $A^T = U^T L^T$ is again a lower-times-upper factorization. It does not work for $QR$: $Q^T$ is still orthogonal, but $R^T$ is lower triangular, so $A^T = R^T Q^T$ is an $LQ$ decomposition, which does not in general coincide with a $QR$ decomposition. So for QR you do need to transpose the input explicitly (or use an LQ routine), and then transpose the resulting factors back.
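The two cases can be checked numerically. A small sketch using SciPy/NumPy factorizations (standing in for the cuSolver ones) on a random matrix:

```python
import numpy as np
from scipy.linalg import lu

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

# LU: A = P L U  =>  A^T = U^T L^T P^T.
# U^T is lower triangular and L^T is upper triangular, so the
# transposed, order-swapped factors are again a lower-times-upper
# factorization of A^T (with the permutation on the right).
P, L, U = lu(A)
assert np.allclose(A.T, U.T @ L.T @ P.T)
assert np.allclose(U.T, np.tril(U.T))   # U^T is lower triangular
assert np.allclose(L.T, np.triu(L.T))   # L^T is upper triangular

# QR: A = Q R  =>  A^T = R^T Q^T.
# R^T is LOWER triangular, so this is an LQ factorization of A^T,
# not a QR factorization of it.
Q, R = np.linalg.qr(A)
assert np.allclose(A.T, R.T @ Q.T)
assert np.allclose(R.T, np.tril(R.T))               # lower triangular
assert np.allclose(Q.T @ Q, np.eye(4), atol=1e-12)  # Q^T still orthogonal
```

So for LU you can factor the column-major (i.e. transposed) data and reinterpret the factors, while for QR the transposed factorization is a genuinely different (LQ) decomposition.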