Pivot Matrix for Row Permutations in LU decomposition (C++/CUDA)


I'm trying to implement determinant routines for some CUDA C++ code I'm writing. The problem is that my code is returning NaNs and Infs, and it turns out my pivoting routine is the culprit.

Right now, my overall approach is taken straight from Rosetta Code's C implementation (minus those awful macros), but it seems their pivoting routine isn't robust enough for my needs.

My pivoting routine takes a column, starts at the row corresponding to the column index, searches below that row for the column's largest element, and swaps only if a larger value is found.

This is far too naive in practice. Is there a more robust pivoting algorithm I can/should be using?

Here's a link to the source code test: https://github.com/LeonineKing1199/cuda-stuff/blob/master/tests/matrix-tests.cu#L182

If you look at that matrix, that's the properly permuted one. In my actual data, the rows of the matrix can be in any order! I just needed a static test to make sure it would eventually work. So, what kind of pivoting algorithm would I need to produce that same matrix, assuming the rows start out in any order?

Sorry if I haven't given enough information. My main matrix class is here: https://github.com/LeonineKing1199/cuda-stuff/blob/master/include/math/matrix.hpp