A strictly convex quadratic function $f: \mathbb{R}^n \rightarrow \mathbb{R}$ is minimized by Newton's method in a single iteration, and by conjugate-gradient-type methods in at most $n$ iterations.
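To illustrate the finite-termination property I mean, here is a minimal sketch of the conjugate gradient method applied to $f(x) = \tfrac{1}{2}x^\top A x - b^\top x$; the matrix $A$ and vector $b$ are arbitrary illustrative data, not from any particular source:

```python
import numpy as np

def conjugate_gradient(A, b, x0):
    """Minimize f(x) = 0.5 x^T A x - b^T x for symmetric positive
    definite A; terminates in at most n steps in exact arithmetic."""
    x = x0.copy()
    r = b - A @ x          # residual = negative gradient of f at x
    p = r.copy()           # initial search direction
    for _ in range(len(b)):
        if np.linalg.norm(r) < 1e-14:   # already at the minimizer
            break
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)      # exact line search step
        x = x + alpha * p
        r_new = r - alpha * Ap
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p            # next A-conjugate direction
        r = r_new
    return x

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)   # symmetric positive definite
b = rng.standard_normal(5)
x = conjugate_gradient(A, b, np.zeros(5))
# After at most n = 5 iterations the gradient A x - b vanishes
# up to round-off:
print(np.linalg.norm(A @ x - b))
```

In floating point the final gradient norm is only zero up to round-off, which is exactly the finite-precision caveat in my question below.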
However, I have not found any similar results for general convex functions. Common unconstrained optimization tools seem to reduce such functions to a value that is essentially equal to the optimum in finite-precision computation, but in exact arithmetic these methods are only guaranteed to converge to the solution asymptotically, i.e., in the limit of infinitely many iterations.
Is there any method known to produce the exact (infinite-precision) minimizer in finite time for general strictly convex functions? Or is this known to be impossible in finite time?