Exact line search - Boyd Convex Optimization exercise - How do I derive this?


I am solving a question from Stephen Boyd's Convex optimization: to minimize the function $f(x) = \dfrac{1}{2}(x_1^2 + \gamma x_2^2) $ using exact line search.

[image: textbook excerpt stating the exact line search example]

However, I am not able to arrive at the expression given in the textbook. I looked through the solutions and could not understand how to evaluate $x^{(k)} - t \nabla f(x^{(k)})$. The following is my attempt.

1) From $f(x)$, I compute the partial derivatives to get $\nabla f(x) = (x_1, \gamma x_2)$.

2) $x^{(k)} - t \nabla f(x^{(k)})$ is then $\bigl((1-t)\,x_1^{(k)},\; (1-t\gamma)\,x_2^{(k)}\bigr)$.

You can see that my expression matches the one highlighted in red below. What I don't get is how to arrive at the expression highlighted in green.

[image: textbook solution excerpt, with the expressions referred to above in red and green]
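For what it's worth, the exact line search step can be checked numerically. Writing $f(x) = \tfrac{1}{2} x^T P x$ with $P = \operatorname{diag}(1, \gamma)$, minimizing $\varphi(t) = f\bigl(x^{(k)} - t \nabla f(x^{(k)})\bigr)$ over $t$ gives the closed-form exact step size $t^\star = \dfrac{g^T g}{g^T P g}$ with $g = \nabla f(x^{(k)})$. The sketch below assumes the textbook's starting point $x^{(0)} = (\gamma, 1)$:

```python
import numpy as np

def exact_line_search_step(x, gamma):
    """One gradient-descent step with exact line search for
    f(x) = 0.5 * (x1**2 + gamma * x2**2).

    With f(x) = 0.5 * x^T P x, P = diag(1, gamma), the gradient is
    g = P x, and minimizing phi(t) = f(x - t*g) over t gives the
    closed-form exact step size t* = (g^T g) / (g^T P g).
    """
    P = np.diag([1.0, gamma])
    g = P @ x                      # gradient: (x1, gamma * x2)
    t = (g @ g) / (g @ P @ g)      # exact minimizer of phi(t)
    return x - t * g, t

# Assumed starting point from the textbook example: x0 = (gamma, 1).
gamma = 10.0
x = np.array([gamma, 1.0])
for k in range(1, 4):
    x, t = exact_line_search_step(x, gamma)
    print(k, x, t)
```

Running this, the iterates agree with the closed form $x^{(k)} = \bigl(\gamma r^k,\ (-r)^k\bigr)$ where $r = \frac{\gamma-1}{\gamma+1}$, and the exact step size comes out constant, $t^\star = \frac{2}{\gamma+1}$, which is where the green expression comes from.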