I am trying to derive the gradient descent iterates $x^k$, given as exercise 9.6 in the book *Convex Optimization* by Boyd and Vandenberghe. The iterates are computed for the following function: $$f(x) = \frac{x_{1}^{2} + \gamma x_{2}^{2}}{2}$$
The solution to the problem is: $$x_{1}^{k} = \frac{\gamma(\gamma-1)^{k}}{(\gamma + 1)^{k}}, \qquad x_{2}^{k} = \frac{(-(\gamma-1))^{k}}{(\gamma + 1)^{k}}$$
The solution manual of the book says something like this: for $k = 0$, we have the starting point $x^{0} = (\gamma, 1)$. My first question is: how did we choose the initial point to be $x^{0} = (\gamma, 1)$? Further, it writes: $$x^{k} - t\nabla f(x^{k}) = \begin{bmatrix}(1-t)x_{1}^{k}\\(1-\gamma t)x_{2}^{k}\end{bmatrix}.$$ This is clear, but the solution further says that the above is equal to: $$\frac{(\gamma -1)^{k}}{(\gamma +1)^{k}} \begin{bmatrix}(1-t)\,\gamma\\(1-\gamma t)(-1)^{k}\end{bmatrix}$$
How did we get here ?
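For context, here is a small numerical sanity check I would use to convince myself the closed form is right. It runs gradient descent with exact line search from $x^0 = (\gamma, 1)$ and compares against the stated iterates; the exact line-search formula $t^\ast = g^\top g / (g^\top H g)$ for a quadratic with Hessian $H = \operatorname{diag}(1, \gamma)$ is my own assumption about how the book's iterates are generated, not something taken from the solution manual:

```python
import numpy as np

def grad_descent_exact(gamma, k_steps):
    """Gradient descent on f(x) = (x1^2 + gamma*x2^2)/2 with exact line search.

    For a quadratic f(x) = x^T H x / 2 with H = diag(1, gamma), the exact
    line-search step is t* = (g^T g) / (g^T H g), where g is the gradient.
    (Assumed setup; the book's example uses exact line search.)
    """
    H = np.array([1.0, gamma])      # diagonal of the Hessian
    x = np.array([gamma, 1.0])      # starting point x^0 = (gamma, 1)
    for _ in range(k_steps):
        g = H * x                   # gradient: (x1, gamma * x2)
        t = (g @ g) / (g @ (H * g)) # exact minimizer of f(x - t*g) over t
        x = x - t * g
    return x

def closed_form(gamma, k):
    """The stated solution: x1^k = gamma*r^k, x2^k = (-r)^k, r = (g-1)/(g+1)."""
    r = (gamma - 1) / (gamma + 1)
    return np.array([gamma * r**k, (-r)**k])

gamma = 10.0
for k in range(8):
    assert np.allclose(grad_descent_exact(gamma, k), closed_form(gamma, k))
```

Every iterate matches, and along the way the exact step size comes out as $t = 2/(\gamma+1)$ at each iteration, which is consistent with the factorization in the manual.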