I know that the least squares problem for a system $Ax=y$ always has solutions.
(I'm interested in the case in which the Hessian matrix $A^TA$ is not positive-definite)
However, are these solutions minima, or can they be maxima or saddle points? Is there always an absolute minimum? How can this be proved?
The unconstrained ordinary least squares problem is a convex optimization problem. Therefore, every local minimum is a global minimum, and there are no saddle points or local maxima. This is true whether $A^TA$ is positive definite or singular; it is always positive semi-definite, which gives rise to a convex quadratic objective function.
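A quick numerical check of the semi-definiteness claim, using a deliberately rank-deficient matrix (the specific $A$ below is just an illustrative choice, with its second column twice the first):

```python
import numpy as np

# Rank-deficient A: column 2 is twice column 1, so A^T A is singular
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])

H = A.T @ A  # Hessian of the least-squares objective (up to a factor of 2)
eigvals = np.linalg.eigvalsh(H)
print(eigvals)  # all eigenvalues are >= 0 (one is 0 here), so H is PSD
```

Since $x^T A^T A x = \|Ax\|^2 \ge 0$ for every $x$, no eigenvalue can be negative, which is exactly what the computation shows.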
There will be infinitely many solutions to the ordinary least squares problem if $A^TA$ is singular. The minimum 2-norm solution is obtained using the pseudo-inverse, pinv, of $A$, i.e., $x = \operatorname{pinv}(A)\, y$. The other solutions may (or may not) have greater 2-norm, but they all have the same objective value.
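A small sketch of the singular case, reusing a rank-deficient $A$ (an illustrative choice, not from the question): adding any null-space vector of $A$ to the pseudo-inverse solution leaves the residual unchanged, but only the pseudo-inverse solution has minimum 2-norm.

```python
import numpy as np

# Rank-deficient A: column 2 is twice column 1
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])
y = np.array([1.0, 2.0, 3.0])

x_min = np.linalg.pinv(A) @ y       # minimum 2-norm least-squares solution

# Any null-space direction of A can be added without changing A x
null_dir = np.array([2.0, -1.0])    # A @ null_dir == 0
x_other = x_min + 5.0 * null_dir    # another least-squares solution

r_min = np.linalg.norm(A @ x_min - y)
r_other = np.linalg.norm(A @ x_other - y)
print(r_min, r_other)               # identical residuals (same objective value)
print(np.linalg.norm(x_min), np.linalg.norm(x_other))  # x_min has smaller norm
```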