Suppose I have a system of linear equations $X\beta = y$, with matrix $X \in \mathbb{R}^{n \times d}$ and vector $y \in \mathbb{R}^n$, where $n > d$ — in other words, an overdetermined system of equations. The OLS solution is:
$\beta = (X^T X)^{-1}X^Ty$
which coincides with the solution obtained from the thin SVD $X = U\Sigma V^T$:
$X\beta = y$
$U\Sigma V^T\beta = y$
$\beta = V\Sigma^{-1}U^T y$
(For $n > d$ the system generally has no exact solution, so the last line should be read as the least-squares solution obtained via the pseudoinverse, assuming $X$ has full column rank.)
My question is: if I use an iterative method to solve for $\beta$, for example the conjugate gradient method, am I going to approximate this solution, or a different one with different properties?
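As a concrete check that the two closed-form solutions agree, here is a small NumPy sketch (the data is random and purely illustrative; it assumes $X$ has full column rank, which holds with probability 1 for Gaussian entries):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 100, 5
X = rng.standard_normal((n, d))   # full column rank almost surely
y = rng.standard_normal(n)

# OLS via the normal equations: beta = (X^T X)^{-1} X^T y
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Same solution via the thin SVD: beta = V Sigma^{-1} U^T y
U, s, Vt = np.linalg.svd(X, full_matrices=False)
beta_svd = Vt.T @ ((U.T @ y) / s)

print(np.allclose(beta_ols, beta_svd))  # → True
```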
I believe you intend to apply the conjugate gradient method to the normal equations $A\beta = X^Ty$, where $A = X^TX$ is symmetric and positive semi-definite. According to the post linked below, the conjugate gradient method in this case converges to the minimum-norm solution of that system, which equals the OLS solution when $X$ has full column rank.
https://scicomp.stackexchange.com/questions/35239/what-happens-when-i-use-a-conjugate-gradient-solver-with-a-symmetric-positive-se
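To illustrate, here is a minimal textbook CG implementation applied to the normal equations (a sketch, not a production solver; the random data is illustrative and $X$ is assumed to have full column rank, so $A = X^TX$ is positive definite):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Textbook CG for a symmetric positive (semi-)definite A.
    Starting from x0 = 0, every iterate lies in the Krylov space
    span{b, Ab, A^2 b, ...} ⊆ range(A), which is why CG picks out
    the minimum-norm solution when A is singular."""
    x = np.zeros_like(b)
    r = b - A @ x          # initial residual
    p = r.copy()           # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

rng = np.random.default_rng(0)
n, d = 100, 5
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

# CG on the normal equations A beta = X^T y
beta_cg = conjugate_gradient(X.T @ X, X.T @ y)

# Reference least-squares solution
beta_ls, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.allclose(beta_cg, beta_ls))  # → True
```

In exact arithmetic CG terminates in at most $d$ iterations here, since $A$ is $d \times d$.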