I was given the following question as part of a homework assignment. Any help would be greatly appreciated!
The following image shows the steps of a preliminary version of the conjugate gradient algorithm and was taken from the book Numerical Optimization by Jorge Nocedal and Stephen J. Wright.
We know from a previous theorem that $p_{k+1} \in \text{span}\{r_0, A r_0, \dots, A^{k+1} r_0\}$, $A \in \mathbb{R}^{n \times n}$ is symmetric positive definite and $r_k \in \mathbb{R}^{n \times 1}$, therefore $$ p_{k+1} = \xi_0 r_0 + \xi_1 A r_0 + \dots + \xi_{k+1} A^{k+1} r_0 $$ Show that $\xi_{k+1} = (-1)^{k+1}(\alpha_0 \times \alpha_1 \times \dots \times \alpha_k)$ using the preliminary version of the conjugate gradient method (Algorithm 5.1).
I have tried to solve this problem by mathematical induction on $k$. The base step is easy to prove; however, I haven't figured out a way to successfully complete the induction step. I have tried using the following results:
- $r_{k+1} = r_k + \alpha_{k} A p_{k}$
- $r_k^T p_i = 0$ for $i = 0, 1, \dots, k-1$
- $p_k^T A p_i = 0$ for $i = 0, 1, \dots, k-1$
- $r_k^T r_i = 0$ for $i = 0, 1, \dots, k-1$
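These identities can also be sanity-checked numerically. Below is a minimal sketch (the random SPD test matrix, seed, and variable names are my own stand-ins); it assumes the preliminary-version formulas $\alpha_k = -r_k^T p_k / p_k^T A p_k$ and $\beta_{k+1} = r_{k+1}^T A p_k / p_k^T A p_k$, with $r_k = A x_k - b$:

```python
import numpy as np

# Sketch of the preliminary CG recurrences on a random SPD system,
# checking the three orthogonality/conjugacy identities listed above.
# A and b are arbitrary stand-ins, not from the original problem.
rng = np.random.default_rng(0)
n = 6
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)        # symmetric positive definite
b = rng.standard_normal(n)

x = np.zeros(n)
r = A @ x - b                      # r_0 = A x_0 - b
rs, ps = [r], [-r]                 # p_0 = -r_0
for _ in range(n - 1):
    p, Ap = ps[-1], A @ ps[-1]
    alpha = -(r @ p) / (p @ Ap)    # alpha_k = -r_k^T p_k / p_k^T A p_k
    x = x + alpha * p
    r = r + alpha * Ap             # r_{k+1} = r_k + alpha_k A p_k
    beta = (r @ Ap) / (p @ Ap)     # beta_{k+1} = r_{k+1}^T A p_k / p_k^T A p_k
    ps.append(-r + beta * p)       # p_{k+1} = -r_{k+1} + beta_{k+1} p_k
    rs.append(r)

for j in range(1, n):
    for i in range(j):
        assert abs(rs[j] @ ps[i]) < 1e-8        # r_k^T p_i = 0
        assert abs(ps[j] @ (A @ ps[i])) < 1e-8  # p_k^T A p_i = 0
        assert abs(rs[j] @ rs[i]) < 1e-8        # r_k^T r_i = 0
```

All three families of inner products come out at round-off level, as expected.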
Note: the instructions never stated this, but I assumed that $\xi_0 = -1$ since it's essentially a definition in Algorithm 5.1.

I managed to prove it (or at least something very close to it; see the note at the end about the sign) by induction. Here is the argument:
Base step
Let $k = 1$, then
$$p_k = p_1 = -r_1 + \beta_0 p_0 = -r_0 - \alpha_0 A p_0 + \beta_0 p_0$$
Since $p_0 = -r_0$, we get
$$p_1 = -r_0 + \alpha_0 A r_0 - \beta_0 r_0 = -(1 + \beta_0) r_0 + \alpha_0 Ar_0$$
Therefore $p_k = p_1 \in \text{span}\{r_0, A r_0\} = \text{span}\{r_0, \dots, A^k r_0\}$, and the leading coefficient is $\xi_1 = \alpha_0 = (-1)^{0+2}\,\alpha_0$, which matches the claimed formula for $k = 0$ (with the corrected exponent; see the note at the end).
Inductive step
Let $k$ be a natural number. Our induction hypothesis is as follows:
$$\xi_k = (-1)^{k+1} (\alpha_0 \times \alpha_1 \times \dots \times \alpha_{k-1})$$
We now show that this equation holds for $k+1$.
$$p_{k+1} = -r_{k+1} + \beta_k p_k = -r_k - \alpha_k Ap_k + \beta_k p_k$$
Both $r_k$ and $p_k$ lie in $\text{span}\{r_0, A r_0, \dots, A^k r_0\}$ (for $p_k$ this is the previous theorem; for $r_k$ it follows by induction from $r_{j+1} = r_j + \alpha_j A p_j$), so the only term on the right-hand side that can contribute a multiple of $A^{k+1} r_0$ is $-\alpha_k A p_k$. Writing $p_k$ in the Krylov basis,
$$p_k = \sum_{i = 0}^{k} \xi_i A^i r_0,$$
where $\xi_k$ is the leading coefficient that the induction hypothesis refers to, we get
$$-\alpha_k A p_k = -\alpha_k \sum_{i = 0}^{k} \xi_i A^{i+1} r_0,$$
whose leading term is $-\alpha_k \xi_k A^{k+1} r_0$. Since $\{r_0, A r_0, \dots, A^{k+1} r_0\}$ are linearly independent, the coefficient of $A^{k+1} r_0$ in the expansion of $p_{k+1}$ is uniquely determined, and therefore
$$\xi_{k+1} = -\alpha_k \xi_k$$
Using our induction hypothesis, we get
$$\xi_{k+1} = -\alpha_k \xi_k = -\alpha_k \Big( (-1)^{k+1} (\alpha_0 \times \alpha_1 \times \dots \times \alpha_{k-1}) \Big) = (-1)^{k+2} (\alpha_0 \times \alpha_1 \times \dots \times \alpha_{k-1} \times \alpha_k)$$
Note: I strongly suspect that the exponent on $-1$ in the original question is off by one, and that the general form of the coefficients is actually $\xi_{k+1} = (-1)^{k+2} (\alpha_0 \times \alpha_1 \times \dots \times \alpha_{k-1} \times \alpha_k)$, i.e. $(-1)^{k} (\alpha_0 \times \dots \times \alpha_k)$. Indeed, the base step already gives $\xi_1 = +\alpha_0$, whereas the stated formula would give $\xi_1 = -\alpha_0$.
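To double-check the corrected sign, here is a small numerical sketch (the random SPD test matrix, seed, and variable names are my own; it assumes the Algorithm 5.1 formulas $\alpha_k = -r_k^T p_k / p_k^T A p_k$ and $\beta_{k+1} = r_{k+1}^T A p_k / p_k^T A p_k$, with $r_k = A x_k - b$). It runs the recurrences, expands each $p_{k+1}$ in the Krylov basis, and compares the leading coefficient against $(-1)^{k+2}\,\alpha_0 \cdots \alpha_k$:

```python
import numpy as np

# Numerical check of the leading-coefficient claim on a random SPD system.
# A and b are arbitrary stand-ins, not from the original problem.
rng = np.random.default_rng(0)
n = 6
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)        # symmetric positive definite
b = rng.standard_normal(n)

x = np.zeros(n)
r0 = A @ x - b                     # r_0 = A x_0 - b
r, p = r0, -r0                     # p_0 = -r_0, i.e. xi_0 = -1
alphas = []
for k in range(n - 2):
    Ap = A @ p
    alpha = -(r @ p) / (p @ Ap)    # alpha_k
    x = x + alpha * p
    r = r + alpha * Ap             # r_{k+1} = r_k + alpha_k A p_k
    beta = (r @ Ap) / (p @ Ap)     # beta_{k+1}
    p = -r + beta * p              # p_{k+1} = -r_{k+1} + beta_{k+1} p_k
    alphas.append(alpha)

    # Coefficients of p_{k+1} in the basis {r_0, A r_0, ..., A^{k+1} r_0};
    # columns are normalized to keep the least-squares solve well conditioned.
    basis = np.column_stack([np.linalg.matrix_power(A, i) @ r0
                             for i in range(k + 2)])
    norms = np.linalg.norm(basis, axis=0)
    xi = np.linalg.lstsq(basis / norms, p, rcond=None)[0] / norms
    predicted = (-1) ** (k + 2) * np.prod(alphas)
    assert np.isclose(xi[-1], predicted, rtol=1e-5, atol=0.0)
```

The assertion holds for every $k$, consistent with the exponent $(-1)^{k+2}$ rather than $(-1)^{k+1}$.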