As far as I know, LU decomposition lets you compute a matrix determinant with an easy and cheap formula: Det[A] = Det[L] Det[U] = Det[U], since L has ones on its diagonal.
Trying this out in Mathematica 7 gives me a result that is correct only in absolute value; i.e., negative determinants come out positive.
Sample code:
matrixA = {{-1, 2, 3, 5}, {-7, -4, 5, 4}, {-89, 7, 8, -6}, {8, 6, -1, 4}};
Det[matrixA] gives -2067,
but
{lu, p, c} = LUDecomposition[matrixA]
u = lu SparseArray[{i_, j_} /; j >= i -> 1, {4, 4}]
Det[u] is 2067
Well, the question is obvious: how do I get the correctly signed determinant in Mathematica using LU decomposition?
Well, this is because you forgot to take into account the permutation that is output by the LU decomposition routine. You have to remember two facts: (1) Gaussian elimination generally performs row interchanges for numerical-stability reasons, and (2) the determinant of a matrix changes sign if you interchange two of its rows.
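Fact (2) is easy to check symbolically; here is a tiny sketch (the 2x2 matrix m is just an illustration, not part of the original example):

```mathematica
(* Swapping two rows of a matrix flips the sign of its determinant. *)
m = {{a, b}, {c, d}};
Det[m]             (* equals a d - b c *)
Det[m[[{2, 1}]]]   (* rows swapped: equals b c - a d *)
```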
In Mathematica, at least, the function that will rescue you is Signature[], which gives the signature of the permutation required to turn the permutation output by LUDecomposition[] (a scrambled list of the numbers from 1 to n, where n is the size of your matrix) into the identity permutation {1, 2, ..., n}. So, to use your example in Mathematica:
we compare the determinant that Det[matrixA] computes directly with the one we assemble from LUDecomposition[matrixA].
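In code, that comparison looks like the following (matrixA, lu, p, and c are the same names used in the question):

```mathematica
matrixA = {{-1, 2, 3, 5}, {-7, -4, 5, 4}, {-89, 7, 8, -6}, {8, 6, -1, 4}};
Det[matrixA]                       (* -2067 *)
{lu, p, c} = LUDecomposition[matrixA];
```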
Now, LUDecomposition[] here outputs three things: the merged $\mathbf{L}$ and $\mathbf{U}$ matrices, the permutation, and the condition number. We can get the tentative determinant by multiplying together the diagonal elements of lu, e.g. with Times @@ Diagonal[lu]; for your matrix this gives 2067. Here is where Signature comes in. Note that to turn the permutation {1, 2, 4, 3} into {1, 2, 3, 4}, one needs one swap, namely of the third and fourth elements. Since 1 is odd, the signature is -1. Thus, multiplying the tentative determinant by Signature[p]
gives the correct answer, -2067.
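Putting it all together, a minimal sketch (variable names as in the question; the parentheses just make the grouping explicit):

```mathematica
matrixA = {{-1, 2, 3, 5}, {-7, -4, 5, 4}, {-89, 7, 8, -6}, {8, 6, -1, 4}};
{lu, p, c} = LUDecomposition[matrixA];

(* Product of U's diagonal, sign-corrected by the permutation's signature. *)
signedDet = Signature[p]*(Times @@ Diagonal[lu])

signedDet == Det[matrixA]   (* True *)
```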