Should the SVD of a square matrix always produce U and V that are inverses of each other?


I'm using a Scala library which (I believe) wraps a Java library that performs linear-algebra computations. It has a function documented to return THE singular value decomposition. First of all, I was under the impression that the SVD is not unique, but let me set that question aside for the moment.
The svd function in this library accepts a rectangular matrix, but I am always passing a square matrix. It returns (U, s, V), where U and V are n×n square matrices and s is a vector of length n containing the singular values. However, in some cases UV does not equal the identity.
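For reference, the decomposition I understand the function to be computing is the standard one,

$$ M = U \Sigma V^{\mathsf T}, \qquad U^{\mathsf T} U = V^{\mathsf T} V = I, \qquad \Sigma = \operatorname{diag}(s_1, \dots, s_n), $$

which, if I read it correctly, only guarantees that U and V are each orthogonal on their own; my expectation that UV = I may be an extra assumption on my part.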

Does this seem suspicious?

Sometimes the U and V given back are negative inverses of each other, i.e., UV = -I.

Does that sound like a bug in the svd function? Or does UV = I hold only when the matrix is positive definite?
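To make the question concrete, here is a hand-worked 2×2 case in plain Java (the factors are chosen by hand to satisfy the definition above; they are not output from the library I'm using), where U and V are each orthogonal yet UV is not the identity:

```java
public class SvdOrthogonalityCheck {
    // Naive matrix product, sufficient for this 2x2 sanity check.
    static double[][] multiply(double[][] a, double[][] b) {
        double[][] c = new double[a.length][b[0].length];
        for (int i = 0; i < a.length; i++)
            for (int j = 0; j < b[0].length; j++)
                for (int k = 0; k < b.length; k++)
                    c[i][j] += a[i][k] * b[k][j];
        return c;
    }

    static double[][] transpose(double[][] a) {
        double[][] t = new double[a[0].length][a.length];
        for (int i = 0; i < a.length; i++)
            for (int j = 0; j < a[0].length; j++)
                t[j][i] = a[i][j];
        return t;
    }

    public static void main(String[] args) {
        // M = diag(2, -3): one valid SVD is U = diag(1, -1), Sigma = diag(2, 3), V = I.
        double[][] u     = {{1, 0}, {0, -1}};
        double[][] sigma = {{2, 0}, {0, 3}};
        double[][] v     = {{1, 0}, {0, 1}};

        // U * Sigma * V^T reconstructs M = diag(2, -3).
        double[][] m = multiply(multiply(u, sigma), transpose(v));
        System.out.println(java.util.Arrays.deepToString(m));

        // U^T U = I and V^T V = I: each factor is orthogonal on its own...
        System.out.println(java.util.Arrays.deepToString(multiply(transpose(u), u)));

        // ...but U * V = diag(1, -1), not the identity.
        System.out.println(java.util.Arrays.deepToString(multiply(u, v)));
    }
}
```

Here M = diag(2, -3) is symmetric but not positive definite, and UV = diag(1, -1), which looks like the same phenomenon I'm observing.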

For example, when I test it with the 1×1 matrix M = [-4.034101137641814] (a value obtained from a random number generator), svd returns ([-1.0], [4.034101137641814], [1.0]) rather than ([1.0], [-4.034101137641814], [1.0]), as I would have expected.
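Double-checking the arithmetic on that 1×1 output (plain Java, with the values copied from above):

```java
public class Svd1x1Check {
    public static void main(String[] args) {
        double m = -4.034101137641814;   // the 1x1 matrix from my test
        double u = -1.0;                 // U as returned by the library
        double s = 4.034101137641814;    // the singular value it returned
        double v = 1.0;                  // V as returned by the library

        // Reconstruct M = U * Sigma * V^T; in the 1x1 case this is just u * s * v.
        System.out.println(u * s * v);   // -4.034101137641814, matches m exactly

        // Both u and v are valid 1x1 orthogonal "matrices", yet u * v = -1, not 1.
        System.out.println(u * v);
    }
}
```

So the returned triple does reconstruct M; the sign has simply been moved into U, leaving the middle value nonnegative.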

Does it sound to you like a bug, or is my understanding wrong?