Have I correctly implemented the line search method using gradient descent?


I'm refreshing my memory regarding the line search optimization method on convex functions using gradient descent.

Running through the nuts and bolts of some manual calculations, let's say we have a simple paraboloid function $f(x,y)=\frac{(x-1)^2}{2}+\frac{(y-2)^2}{3}$. Starting at an arbitrary initial point $\vec x_0=\langle x_0,y_0 \rangle$, we descend along the steepest-descent direction $\vec p_0=-\nabla f(x_0,y_0)$, constrained to follow the surface of the function $f(x,y)$ along the parametric curve $h(\alpha)=f(\vec x_0+\alpha\vec p_0)$ until we reach the minimum along that line.

For $i=0$ through $M$ iterations:

Step 1. For iteration $i=0$, set initial point $\vec x_0=\langle 0,3 \rangle$.

Step 2. Find the steepest-descent direction at $(0,3)$: $\vec p_0=-\nabla f(0,3)=-\langle f_x(0,3),f_y(0,3)\rangle=\langle 1,-\frac{2}{3}\rangle$.

Step 3. Find the parametric form of the constrained curve being minimized: $h(\alpha)=f(\vec x_0+\alpha\vec p_0)=f(\langle 0,3 \rangle+\alpha\langle 1,-\frac{2}{3}\rangle)=f(\alpha,3-\frac{2\alpha}{3})=\frac{(\alpha-1)^2}{2}+\frac{(1-\frac{2\alpha}{3})^2}{3}$.

Step 4. Find the minimum of $h(\alpha)$ by solving for the value $\alpha_{min}$ where $\frac{dh}{d\alpha}=0$:

$\frac{dh}{d\alpha}=(\alpha-1)-\frac{4}{9}\left(1-\frac{2\alpha}{3}\right)=\alpha-1-\frac{4}{9}+\frac{8\alpha}{27}=0 \rightarrow \alpha_{min}=\frac{39}{35}$.

Therefore the minimizer along this line is $(x_{min},y_{min})=(\alpha_{min},3-\frac{2\alpha_{min}}{3})=(\frac{39}{35},3-\frac{2}{3}\cdot\frac{39}{35})=(\frac{39}{35},\frac{79}{35})$.
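To double-check the arithmetic, here is a quick Python sketch of this single step in exact rational arithmetic. It relies on the closed-form exact step size $\alpha=\frac{\vec p^{\,T}\vec p}{\vec p^{\,T}H\vec p}$, which holds for a quadratic with Hessian $H$ (here $H=\mathrm{diag}(1,\frac{2}{3})$); the helper names are my own:

```python
from fractions import Fraction as F

# f(x, y) = (x-1)^2/2 + (y-2)^2/3, with Hessian H = diag(1, 2/3)
def grad(x, y):
    return (x - 1, F(2, 3) * (y - 2))

def line_search_step(x, y):
    gx, gy = grad(x, y)
    px, py = -gx, -gy                       # p = -grad f (steepest descent)
    # exact minimizer of h(alpha) = f(x + alpha*p) for a quadratic:
    alpha = (px * px + py * py) / (px * px + F(2, 3) * py * py)
    return x + alpha * px, y + alpha * py, alpha

x1, y1, a = line_search_step(F(0), F(3))
print(a, x1, y1)   # 39/35 39/35 79/35
```

This reproduces $\alpha_{min}=\frac{39}{35}$ and the new point $(\frac{39}{35},\frac{79}{35})$ exactly.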

Step 5. Repeat Steps 1 through 4 for iteration $i=1$, using $(\frac{39}{35},\frac{79}{35})$ as the new starting point.
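Running the repetition programmatically shows the iterates approaching the global minimizer $(1,2)$. A minimal sketch (my own code, using the closed-form exact step size $\alpha=\frac{\vec p^{\,T}\vec p}{\vec p^{\,T}H\vec p}$, valid for a quadratic with Hessian $H=\mathrm{diag}(1,\frac{2}{3})$):

```python
from fractions import Fraction as F

# Gradient descent with exact line search on
# f(x, y) = (x-1)^2/2 + (y-2)^2/3  (Hessian diag(1, 2/3)).
x, y = F(0), F(3)
for i in range(12):
    px, py = -(x - 1), -F(2, 3) * (y - 2)   # p_i = -grad f(x_i)
    alpha = (px * px + py * py) / (px * px + F(2, 3) * py * py)
    x, y = x + alpha * px, y + alpha * py
    print(i, float(x), float(y))
# the iterates zigzag toward the global minimizer (1, 2)
```

Each step only minimizes $f$ along the current search line, so no single iteration lands on $(1,2)$; the sequence of iterates converges to it.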

Have I correctly implemented the line search method here? The first minimum value calculated in Step 4 doesn't seem right: shouldn't $x_{min}=1$, given that the global minimum of $f(x,y)$ is at $(1,2)$?