I am working my way through this introduction to the conjugate gradient method, and I've reached a point where I could use some help understanding the notation.
In Section 7.3, on the optimality of the error term, an example is given with two search directions $d_0$ and $d_1$, linearly independent vectors that span the subspace $D_2 = \text{Span}(d_0, d_1)$. Since there are exactly two independent vectors, $D_2$ is a plane, as shown in Figure 26.
Next, there is an initial error vector $e_0$, drawn (also in Figure 26) as an arrow that starts at the origin, which lies above $D_2$, and ends at a point in $D_2$.
The author then says the new error vector $e_i$ is chosen from $e_0 + D_2$, and this is where I am confused. What does it mean to add a vector to a subspace? Is this just $\text{Span}(e_0, d_0, d_1)$? Or does it mean that the set of vectors we can choose from consists of any vector in $D_2$, together with any vector starting at the origin and ending in $D_2$?
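To make my guess concrete, here is how I would write it in set-builder notation; the coefficients $\alpha$ and $\beta$ are my own notation, not the author's:

```latex
e_0 + D_2 \stackrel{?}{=} \{\, e_0 + \alpha d_0 + \beta d_1 \;:\; \alpha, \beta \in \mathbb{R} \,\}
```

If that reading is right, $e_0 + D_2$ would be a plane parallel to $D_2$ passing through the tip of $e_0$, rather than the three-dimensional span $\text{Span}(e_0, d_0, d_1)$. Is that the correct interpretation?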