Actually I am working with an overdetermined homogeneous system of linear equations $Ax=0$. In the literature I found that a least-squares solution can be obtained with the help of the SVD: the solution is the eigenvector corresponding to the smallest eigenvalue of $A^TA$.
So far so good. But in some literature I read that one should instead look at the eigenvalues/eigenvectors of $A$ itself. Unfortunately I cannot find that literature again, the one which says $A$ instead of $A^TA$.
Does the method change, if the system is not overdetermined?
Please help me to solve my confusion. Thanks.
If the system is not over-determined then there should be an exact solution, so you should just try to solve the original problem,
$$A x = 0.$$
More generally, it sounds like you want to do something like minimizing the norm of the vector $A x$. But of course then you'd just choose $x=0$. So do you want to instead minimize the norm of $A x$ for a fixed magnitude of $x$? If so, the way I'd do it is to minimize $\|A x\|^2 = x^T A^T A x$ subject to the condition $x^T x = 1$. I'd do that by adding a Lagrange multiplier term and enforcing the first-order conditions on:
$$x^T A^T A x + \lambda (x^T x - 1).$$
That is, I'd differentiate it with respect to the vector $x$ and set the result equal to zero, yielding (after dropping an overall factor of $2$):
$$A^T A x + \lambda x = 0.$$
Consequently, indeed, if this is what you want then your answer should be an eigenvector of $A^T A$, specifically one with the smallest eigenvalue. Equivalently, that is the right singular vector of $A$ corresponding to its smallest singular value, which is probably what the literature that speaks of the SVD of $A$ (rather than of $A^T A$) means: the two formulations give the same vector. Similar analysis applies in the complex case, with the transpose replaced by the conjugate transpose.
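A quick numerical check of this equivalence, sketched in NumPy (the matrix $A$ here is just random example data, not from your problem):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((10, 4))  # overdetermined: 10 equations, 4 unknowns

# Route 1: SVD of A. The minimizer of ||Ax|| subject to ||x|| = 1 is the
# right singular vector for the smallest singular value. np.linalg.svd
# returns singular values in descending order, so take the last row of Vt.
_, s, Vt = np.linalg.svd(A)
x_svd = Vt[-1]

# Route 2: eigendecomposition of A^T A. np.linalg.eigh returns eigenvalues
# in ascending order, so the first column is the smallest-eigenvalue vector.
w, V = np.linalg.eigh(A.T @ A)
x_eig = V[:, 0]

# Both routes give the same direction (the sign may differ),
# and the minimal residual norm equals the smallest singular value.
print(abs(float(x_svd @ x_eig)))          # ~1.0
print(np.linalg.norm(A @ x_svd) - s[-1])  # ~0.0
```

Numerically the SVD route is preferable in practice, since forming $A^T A$ explicitly squares the condition number of the problem.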