I am going through *The Elements of Statistical Learning* and am currently working through the chapter on using splines in regression. I have a question about deriving the basis functions for a cubic spline. Imagine that we are trying to regress some data with 2 knots at $\epsilon_1$ and $\epsilon_2$. The book says that the appropriate set of basis functions is the following:
$$h_1(X) = 1$$ $$h_2(X) = X$$ $$h_3(X) = X^2$$ $$h_4(X) = X^3$$ $$h_5(X) = (X - \epsilon_1)_+^3$$ $$h_6(X) = (X - \epsilon_2)_+^3$$
where $(\cdot)_+ = \max(\cdot, 0)$ denotes the positive part,
from which we then form the least squares estimator
$$\hat{f}(X) = \sum_{j=1}^6 \beta_jh_j(X)$$
Why do we need to add $h_5(X)$ and $h_6(X)$? If we require that the first and second derivatives be continuous at the knots, isn't it sufficient to use only $h_1(X)$ through $h_4(X)$? I have been trying to understand why these terms are added to the regression, as they just seem like extras. In this example we do not impose natural or clamped boundary conditions, so I thought that might have something to do with it.
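For concreteness, here is a small sketch of how I understand the fit would be set up with this basis. The knots, data, and function names here are my own illustrative choices, not from the book:

```python
import numpy as np

# Illustrative knots and synthetic data (not from the book)
eps1, eps2 = 0.3, 0.7
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0.0, 1.0, 100))
y = np.sin(2 * np.pi * X) + rng.normal(0.0, 0.1, 100)

def basis(x):
    # Truncated power basis for a cubic spline with knots eps1, eps2.
    # Each knot term uses the positive part max(., 0), so it is
    # identically zero to the left of its knot and a cubic to the right.
    return np.column_stack([
        np.ones_like(x),            # h1
        x,                          # h2
        x**2,                       # h3
        x**3,                       # h4
        np.maximum(x - eps1, 0)**3, # h5
        np.maximum(x - eps2, 0)**3, # h6
    ])

# Ordinary least squares on the six basis columns
beta, *_ = np.linalg.lstsq(basis(X), y, rcond=None)

def fhat(x):
    return basis(np.atleast_1d(x)) @ beta
```

Numerically, the fitted curve and its first two derivatives match across each knot, while the third derivative jumps there, which is what I would expect of a cubic spline.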