Do we need steepest descent methods when minimizing quadratic functions?


I'm studying nonlinear programming and steepest descent methods for quadratic multivariable functions. I have a question about the passage highlighted in the following picture:

[image: textbook passage on the steepest descent method for quadratic functions]

My question is: if we can explicitly solve for the minimizing point $\textbf{x}^*$ of $f$, why do we still need steepest descent at all? This confused me, because it seems that we first solve for the minimizing point of the function, and then use that point to find the minimizing point.

Hope my question is clear, thank you for any help! =)

There is 1 answer below.

Best answer

In this passage, the goal is just to shed light on the behavior of the steepest descent method. If we really wanted to minimize this quadratic function, we would use a more efficient method -- for instance, a method from numerical linear algebra that solves the linear system $\nabla f(\mathbf{x}) = 0$ directly.
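To make the comparison concrete, here is a minimal sketch (not from the original passage) that assumes the standard form $f(\mathbf{x}) = \tfrac{1}{2}\mathbf{x}^T A \mathbf{x} - \mathbf{b}^T \mathbf{x}$ with $A$ symmetric positive definite, so the minimizer satisfies $A\mathbf{x}^* = \mathbf{b}$. The matrix and vector below are made-up illustrative values. It runs steepest descent with exact line search and checks that the iterates approach the same point a direct linear solve gives:

```python
import numpy as np

# Illustrative SPD matrix and right-hand side (hypothetical example data).
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 1.0])

# Steepest descent with exact line search on f(x) = 1/2 x^T A x - b^T x.
x = np.zeros(2)
for _ in range(100):
    g = A @ x - b                      # gradient of f at the current iterate
    if np.linalg.norm(g) < 1e-12:
        break
    alpha = (g @ g) / (g @ (A @ g))    # exact minimizer of f(x - alpha*g)
    x = x - alpha * g

# Direct approach: just solve the linear system A x = b.
x_star = np.linalg.solve(A, b)
```

The iterative `x` and the direct `x_star` agree to numerical precision here; the point of the textbook passage is only that steepest descent's *path* toward that point is instructive, not that it is the fastest way to get there.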

Note that solving a large linear system can be expensive. One of the popular methods for symmetric positive definite systems, the conjugate gradient method, can be interpreted as minimizing an associated quadratic function by an iterative method.
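For completeness, here is a bare-bones sketch of the conjugate gradient method mentioned above, written as an iterative solver for $A\mathbf{x} = \mathbf{b}$ with $A$ symmetric positive definite (equivalently, a minimizer of the associated quadratic). The helper name and test data are my own, not from the answer:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Minimal CG sketch: solves A x = b for SPD A, i.e. minimizes
    the quadratic f(x) = 1/2 x^T A x - b^T x iteratively."""
    n = len(b)
    x = np.zeros(n)
    r = b - A @ x            # residual = negative gradient of f
    p = r.copy()             # first search direction
    rs = r @ r
    for _ in range(max_iter or n):
        Ap = A @ p
        alpha = rs / (p @ Ap)        # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p    # new A-conjugate direction
        rs = rs_new
    return x

# Illustrative SPD system.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

In exact arithmetic CG reaches the minimizer in at most $n$ iterations, which is why it is attractive for large sparse SPD systems where a direct factorization would be too expensive.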