Bowen and Wang's *Introduction to Vectors and Tensors I* (p. 168) states a general form of the polar decomposition theorem:
> Every automorphism $A$ has two unique multiplicative decompositions $$ A = R U \quad \text{and} \quad A = V R $$ where $R$ is unitary and $U, V$ are Hermitian and positive definite.
So being an automorphism is clearly sufficient for $A$ to admit a polar decomposition. But is it also a necessary condition?
If not, a counterexample would be welcome.
Thanks.
The decomposition always exists (at least for finite-dimensional spaces), regardless of whether $A$ is invertible, once positive-definite is weakened to positive-semidefinite; it can be constructed from the singular value decomposition, for example. The decomposition is no longer unique when $A$ is singular, though: for the zero matrix, $U = V = 0$ and $R$ can be any unitary matrix.
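To sketch the construction (with SVD factors written as $W$ and $X$ to avoid clashing with the Hermitian factor $V$): if $A = W \Sigma X^*$ is a singular value decomposition, set
$$ R = W X^*, \qquad U = X \Sigma X^*, \qquad V = W \Sigma W^*. $$
Then $RU = W X^* X \Sigma X^* = W \Sigma X^* = A$ and $VR = W \Sigma W^* W X^* = W \Sigma X^* = A$, with $R$ unitary and $U, V$ Hermitian positive semidefinite (positive definite exactly when all singular values are nonzero, i.e. when $A$ is invertible).

Here is a minimal NumPy sketch of that construction; the helper name `polar_decomposition` is my own, not from any library:

```python
import numpy as np

def polar_decomposition(A):
    """Polar decomposition A = R @ U = V @ R built from the SVD.

    R is unitary; U and V are Hermitian positive semidefinite.
    Works for any square complex matrix, invertible or not.
    """
    W, s, Xh = np.linalg.svd(A)           # A = W @ diag(s) @ Xh
    R = W @ Xh                            # unitary factor
    U = Xh.conj().T @ np.diag(s) @ Xh     # right Hermitian factor
    V = W @ np.diag(s) @ W.conj().T       # left Hermitian factor
    return R, U, V

# A singular matrix still admits a (non-unique) decomposition:
A = np.array([[1.0, 2.0], [2.0, 4.0]])   # rank 1, not invertible
R, U, V = polar_decomposition(A)
assert np.allclose(R @ U, A) and np.allclose(V @ R, A)
```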