Consider a method in which the first iteration is a steepest-descent step with exact line search, and subsequent iterations have the form $$x_{i+1} = x_i - \alpha_i f'(x_i) - \beta_i(x_i - x_{i-1}),$$ where $\alpha_i$ and $\beta_i$ are obtained by two-dimensional minimization (that is, they minimize the value of $f$ over the corresponding plane).
Show that if $f$ is a strictly convex quadratic function, then this method is identical to the conjugate gradient method.
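The claimed equivalence can be checked numerically before attempting the proof. The sketch below (function names `two_dim_min_method` and `cg` are my own, not from the problem) runs both iterations on a quadratic $f(x) = \tfrac12 x^\top A x - b^\top x$ with $A$ positive definite, where the gradient is $f'(x) = Ax - b$. For such $f$, the two-dimensional minimization over $\alpha, \beta$ reduces to a $2\times 2$ linear system obtained from the stationarity conditions.

```python
import numpy as np

def two_dim_min_method(A, b, x0, n_iter):
    """First step: steepest descent with exact line search.
    Later steps: x_{i+1} = x_i - a*g_i - c*(x_i - x_{i-1}),
    with (a, c) minimizing f over that two-dimensional plane."""
    xs = [x0]
    g = A @ x0 - b
    alpha = (g @ g) / (g @ (A @ g))     # exact line search for quadratic f
    x = x0 - alpha * g
    xs.append(x)
    for _ in range(n_iter - 1):
        g = A @ x - b
        d = x - xs[-2]
        # f(x - a*g - c*d) is quadratic in (a, c); setting its partial
        # derivatives to zero gives a 2x2 linear system for the minimizer.
        M = np.array([[g @ (A @ g), g @ (A @ d)],
                      [d @ (A @ g), d @ (A @ d)]])
        rhs = np.array([g @ g, d @ g])
        a, c = np.linalg.solve(M, rhs)
        x = x - a * g - c * d
        xs.append(x)
    return np.array(xs)

def cg(A, b, x0, n_iter):
    """Standard linear conjugate gradient iteration."""
    xs = [x0]
    x = x0
    r = b - A @ x
    p = r.copy()
    for _ in range(n_iter):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)
        x = x + alpha * p
        r_new = r - alpha * Ap
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
        xs.append(x)
    return np.array(xs)

rng = np.random.default_rng(0)
n = 6
B = rng.standard_normal((n, n))
A = B @ B.T + n * np.eye(n)             # positive definite Hessian
b = rng.standard_normal(n)
x0 = np.zeros(n)
print(np.allclose(two_dim_min_method(A, b, x0, 4), cg(A, b, x0, 4)))
```

The iterates agree because the CG point $x_{i+1}$ lies in the plane $x_i + \operatorname{span}\{f'(x_i),\, x_i - x_{i-1}\}$ and minimizes $f$ over the larger Krylov subspace containing that plane.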