I am asked to solve the following problem:
Consider $k$ steps of Richardson's method with different parameters $\omega_1, \dots, \omega_k$. The error equation is then: $$ e_k = (I - \omega_k A)\cdots(I - \omega_1 A)\,e_0 $$
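A quick numerical check of this error equation (my own sketch, with an arbitrary SPD matrix and arbitrary trial parameters, not part of the problem statement):

```python
import numpy as np

# Verify that k Richardson steps x_{j+1} = x_j + w_j (b - A x_j)
# produce the error e_k = (I - w_k A) ... (I - w_1 A) e_0.
rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
A = A @ A.T + n * np.eye(n)          # SPD, so lambda_min(A) > 0
b = rng.standard_normal(n)
x_true = np.linalg.solve(A, b)

omegas = [0.05, 0.1, 0.02]           # arbitrary trial parameters
x = np.zeros(n)
e0 = x - x_true
for w in omegas:
    x = x + w * (b - A @ x)          # one Richardson step

# Build the product (I - w_k A) ... (I - w_1 A) and apply it to e_0
P = np.eye(n)
for w in omegas:
    P = (np.eye(n) - w * A) @ P
assert np.allclose(x - x_true, P @ e0)
```

The key step is that $b = Ax^*$, so one sweep maps $e_j \mapsto (I - \omega_j A)e_j$, and the factors accumulate in reverse order of application.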
Consider the optimization problem of choosing the $k$ parameters: $$ \min_{\omega_i \in \Bbb R,\; i=1,\dots,k} \;\; \max_{\lambda \in [\lambda_{\min}(A),\, \lambda_{\max}(A)]} \;\; |(1-\omega_k\lambda)\cdots(1-\omega_1\lambda)| $$ (Here $\lambda$ is a scalar, so the factors are $1-\omega_i\lambda$ rather than $I-\omega_i\lambda$.)
Find the solution and derive the convergence rate. This trick is known as Chebyshev acceleration.
Here is what I know about the spectrum and the convergence conditions for Richardson's method: $$ 0 \lt \lambda_{\min}(A) \le \lambda \le \lambda_{\max}(A), \qquad 0 \lt \omega\lambda_{\max}(A) \lt 2 $$
I also know the optimal value of $\omega$ for single-parameter Richardson's method: $$ \omega^* = \frac{2}{\lambda_{\max}(A) + \lambda_{\min}(A)} $$
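This $\omega^*$ is exactly the $k=1$ case of the min-max problem above, and a brute-force scan recovers it. A sketch (the spectrum bounds and grids are my own arbitrary choices):

```python
import numpy as np

# For k = 1 the inner maximum is max(|1 - w*lmin|, |1 - w*lmax|);
# scanning w should recover w* = 2 / (lmax + lmin).
lmin, lmax = 1.0, 10.0                         # assumed spectrum bounds
lam = np.linspace(lmin, lmax, 2001)
omegas = np.linspace(1e-3, 2 / lmax, 4001)
worst = np.array([np.max(np.abs(1 - w * lam)) for w in omegas])

w_best = omegas[np.argmin(worst)]
w_star = 2 / (lmax + lmin)
print(w_best, w_star)                          # agree up to grid resolution
```

The attained min-max value is $(\lambda_{\max}-\lambda_{\min})/(\lambda_{\max}+\lambda_{\min})$, i.e. $(\kappa-1)/(\kappa+1)$ in terms of the condition number, which is the classical Richardson rate.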
I am not sure how to proceed. I could start with the simplest case ($k=2$), so that the product has only two factors. Could I first treat the $\omega_i$ as fixed and find the value(s) of $\lambda$ that maximize the absolute value of the error polynomial, and then minimize that worst case over the choice of $\omega_1, \omega_2$?
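The $k=2$ experiment suggested above can be tried numerically. A sketch (grids, ranges, and the comparison target are my own choices): grid-search the pair $(\omega_1,\omega_2)$ minimizing $\max_\lambda |(1-\omega_2\lambda)(1-\omega_1\lambda)|$, and compare against the Chebyshev parameters, whose reciprocals are the roots of the degree-2 Chebyshev polynomial mapped onto $[\lambda_{\min},\lambda_{\max}]$.

```python
import numpy as np

lmin, lmax = 1.0, 10.0                         # assumed spectrum bounds
lam = np.linspace(lmin, lmax, 2001)

# Chebyshev parameters: 1/w_i = c + d*cos((2i-1)*pi/(2k)), here k = 2
c, d = (lmax + lmin) / 2, (lmax - lmin) / 2
roots = c + d * np.cos((2 * np.arange(1, 3) - 1) * np.pi / 4)
w_cheb = 1 / roots
cheb_val = np.max(np.abs((1 - w_cheb[0] * lam) * (1 - w_cheb[1] * lam)))

# Brute-force grid search over pairs (w1, w2)
grid = np.linspace(0.08, 0.5, 500)
F = np.abs(1 - np.outer(grid, lam))            # |1 - w*lam| for every grid w
brute_val = min(float(np.max(F[i] * F, axis=1).min())
                for i in range(len(grid)))

print(cheb_val, brute_val)                     # should agree up to grid error
```

For these bounds the attained value should be $1/T_2\!\big(\tfrac{\lambda_{\max}+\lambda_{\min}}{\lambda_{\max}-\lambda_{\min}}\big)$, where $T_2(x) = 2x^2 - 1$, which suggests the general answer: the optimal error polynomial is the scaled Chebyshev polynomial on $[\lambda_{\min},\lambda_{\max}]$, normalized to $1$ at $\lambda = 0$.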