Projected Gradient Optimization Question


I was wondering if someone could give me a hint as to where to begin with solving part b)? I have shown in part a) that the update equation is $x_{k+1} = \frac{(I - \alpha Q)x_k}{\|(I - \alpha Q)x_k\|}$, but I am not sure where to begin with part b). I would really appreciate a hint (not the solution).

[Image: problem statement for parts a) and b)]

EDIT: This is the progress I have made for b).

Suppose $x_k$ is an eigenvector of $Q$, so that $Qx_k = \lambda x_k$, and substitute this into the solution for a). This gives $\frac{(1 - \alpha \lambda)x_k}{\|(1 - \alpha \lambda)x_k\|}$, where if $\alpha = \frac{1}{\lambda_{\max}}$ the scalar coefficient $1 - \alpha\lambda$ vanishes for $\lambda = \lambda_{\max}$. Since $Q$ is symmetric positive definite, the function is convex, so along a valid descent direction the iterates converge to a local optimum, which is also the global optimum.

I am not sure how to use the information that $x_0$ is not orthogonal to the eigenvector associated with $\lambda_{\min}$.
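To experiment with where this might lead, here is a small numerical sketch of the update from a) (my own toy example with made-up values, not the actual problem data): it iterates $x_{k+1} = \frac{(I - \alpha Q)x_k}{\|(I - \alpha Q)x_k\|}$ with $\alpha = 1/\lambda_{\max}$ and checks which eigenvector the iterates settle on when $x_0$ is generic (hence not orthogonal to the eigenvector of $\lambda_{\min}$).

```python
import numpy as np

# Toy sanity check (assumed random data, not the problem's Q):
# iterate x_{k+1} = (I - a Q) x_k / ||(I - a Q) x_k|| with a = 1/lambda_max
# and see which eigenvector of Q the iterates align with.

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
Q = A @ A.T + 4 * np.eye(4)           # a symmetric positive definite Q

eigvals, eigvecs = np.linalg.eigh(Q)  # eigenvalues in ascending order
v_min = eigvecs[:, 0]                 # eigenvector for lambda_min
alpha = 1.0 / eigvals[-1]             # alpha = 1 / lambda_max

x = rng.standard_normal(4)
x /= np.linalg.norm(x)                # generic x_0: not orthogonal to v_min
for _ in range(500):
    y = (np.eye(4) - alpha * Q) @ x   # the update from part a)
    x = y / np.linalg.norm(y)

# Up to sign, x ends up aligned with v_min: the eigenvalues of I - alpha*Q
# are 1 - lambda_i/lambda_max, and the largest one belongs to lambda_min.
print(abs(x @ v_min))
```

Running this, the printed overlap $|x^\top v_{\min}|$ is essentially 1, which suggests the role of the non-orthogonality assumption: it guarantees $x_0$ has a nonzero component along that dominant direction of $I - \alpha Q$.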