Suppose I'm doing polynomial regression of degree $m$
$$p(x, \mathbf{w}) = w_0 + w_1x + \dotsb + w_mx^m$$
given training data $(x_1, t_1), \dotsc, (x_N, t_N)$. Suppose I'm using the loss function
$$L(\mathbf{w}) = \frac12 \sum_{j=1}^N \big( p(x_j, \mathbf{w}) - t_j \big)^2$$
The Hessian of the loss function is $A^\top A$, where $A$ is the $N \times (m+1)$ Vandermonde matrix with entries $A_{ij} = x_i^{\,j-1}$. What can be said about the eigenvalues of the positive semidefinite matrix $A^\top A$ (positive definite whenever at least $m+1$ of the $x_i$ are distinct) as we vary which $x_i$ we choose? Specifically, what about its condition number?
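For concreteness, here is a minimal numerical sketch of the question (assuming NumPy; the point sets and degree are illustrative choices, not part of the original problem). It builds the Vandermonde matrix $A$ and reports the condition number of the Hessian $A^\top A$, which equals $\kappa(A)^2$:

```python
import numpy as np

def hessian_cond(x, m):
    """Condition number of the Hessian A^T A for polynomial degree m,
    where A is the N x (m+1) Vandermonde matrix A[i, j] = x_i**j."""
    A = np.vander(x, N=m + 1, increasing=True)
    return np.linalg.cond(A.T @ A)

N = 20
x_equi = np.linspace(-1.0, 1.0, N)                      # equispaced points
x_cheb = np.cos(np.pi * (2 * np.arange(N) + 1) / (2 * N))  # Chebyshev points

for m in (3, 5, 10):
    print(m, hessian_cond(x_equi, m), hessian_cond(x_cheb, m))
```

In this sketch the condition number grows rapidly with the degree $m$ for any fixed point set, and comparing point sets (e.g. equispaced vs. Chebyshev) shows how the choice of the $x_i$ moves it.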