Definitions / assumptions etc.
Let $(t_i)_{i \in \mathbb{N}}$ be a bounded sequence of positive, pairwise distinct real numbers, let $m \in \mathbb{N}$ be fixed and let $d := m+1$. If it helps, it may additionally be assumed that the sequence is dense in a compact interval.
Define the sequences $$ v_k := \begin{pmatrix}1 & t_k & \cdots & t_k^m\end{pmatrix}^T \in \mathbb{R}^d, \qquad G_k := \sum_{i=1}^k v_iv_i^T \in \mathbb{R}^{d \times d}, $$ so that in particular $G_{k+1} = G_k + v_{k+1}v_{k+1}^T$. Note that $G_k$ is invertible as soon as $k \ge d$: any $d$ of the pairwise distinct $t_i$ yield an invertible $d \times d$ Vandermonde matrix (a nonzero polynomial of degree $m$ has at most $m$ roots), so $V_k$ has full column rank. (For reference: $G_k$ is the Gram matrix $V_k^T V_k$ associated to the $k \times d$ Vandermonde matrix $V_k$ on $t_1, \dots, t_k$.)
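A quick numerical sanity check of this setup (a sketch assuming numpy; the helper name `gram` and the chosen parameters are mine):

```python
import numpy as np

rng = np.random.default_rng(0)
m = 4
d = m + 1
# pairwise distinct positive samples from a compact interval, as assumed
t = rng.uniform(0.1, 1.0, size=200)

def gram(ts, m):
    """G_k = sum_i v_i v_i^T for v_i = (1, t_i, ..., t_i^m)^T."""
    V = np.vander(ts, N=m + 1, increasing=True)   # k x d Vandermonde matrix
    return V.T @ V

# G_k is singular for k < d, and invertible once k >= d (distinct nodes)
print(np.linalg.matrix_rank(gram(t[:d - 1], m)))  # d - 1 = 4
print(np.linalg.matrix_rank(gram(t[:d], m)))      # d = 5
```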
Goal
I want to show that $$\lim_{k \to \infty} \det G_k = \infty$$ as a partial result towards a (uniform) convergence proof for linear least-squares regression with polynomial basis functions. Ideally I'd like to prove that $\det G_k \in \Theta(k^d)$ (see below for why I believe this is true), but that would really be a bonus.
What I already have
The determinant sequence is strictly increasing
Showing that $\det G_{k+1} > \det G_k$ is simple: it follows immediately from the matrix determinant lemma, $\det G_{k+1} = (1+v_{k+1}^T G_k^{-1} v_{k+1}) \det G_k$, because $G_k$ is symmetric positive definite and hence $v_{k+1}^T G_k^{-1} v_{k+1} > 0$.
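The lemma is easy to verify numerically (a minimal sketch assuming numpy; the parameters $k=10$, $d=4$ are an arbitrary choice of mine):

```python
import numpy as np

rng = np.random.default_rng(1)
m, d = 3, 4
t = rng.uniform(0.1, 2.0, size=50)
V = np.vander(t, N=d, increasing=True)

Gk = V[:10].T @ V[:10]           # G_10
v = V[10]                        # v_11
# matrix determinant lemma: det(G + vv^T) = (1 + v^T G^{-1} v) det G
lhs = np.linalg.det(Gk + np.outer(v, v))
rhs = (1.0 + v @ np.linalg.solve(Gk, v)) * np.linalg.det(Gk)
print(np.isclose(lhs, rhs))      # True
assert lhs > np.linalg.det(Gk)   # strict growth, since G_k is SPD
```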
All eigenvalues are positive and have multiplicity 1
Since $G_k$ is unchanged by sorting the $t_i$, we may take the nodes to be increasing; then $V_k$ is totally positive, so $G_k = V_k^T V_k$ is a product of two totally positive matrices and hence totally positive, and its eigenvalues are positive with multiplicity 1.
Trace diverges
It's easy to see that $\operatorname{Trace}(G_k) = \sum_{r=0}^m \sum_{i=1}^k t_i^{2r}$, which diverges since the $r=0$ term alone is $k$ and all other terms are positive. Since the trace is the sum of the $d$ eigenvalues, the largest eigenvalue is at least $\operatorname{Trace}(G_k)/d$ and therefore diverges. This might lead to an inductive argument in $d$, if there were a way to relate the eigenvalues of the matrices between different values of $d$ or something like that.
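A sketch checking the trace formula and the divergence of the largest eigenvalue (assuming numpy; the sampling interval and sizes are mine):

```python
import numpy as np

rng = np.random.default_rng(2)
m, d = 2, 3
t = rng.uniform(0.5, 1.5, size=1000)

for k in (10, 100, 1000):
    V = np.vander(t[:k], N=d, increasing=True)
    G = V.T @ V
    # Trace(G_k) = sum_{r=0}^m sum_{i=1}^k t_i^{2r}
    trace_formula = sum((t[:k] ** (2 * r)).sum() for r in range(d))
    assert np.isclose(np.trace(G), trace_formula)
    # the largest eigenvalue is at least Trace/d, hence divergent
    print(k, np.linalg.eigvalsh(G)[-1] >= np.trace(G) / d - 1e-9)
```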
Rank-one modifications and Sherman-Morrison-Woodbury
I've tried looking into a variety of results about eigenvalue changes under rank-one modifications, and while there are some results out there (in particular for symmetric positive definite matrices), I didn't manage to find anything that really helps here. The Sherman-Morrison-Woodbury formula might be useful in some way, but I'm not sure. I've tried expanding $v_{k+1}$ in an orthonormal basis of eigenvectors of $G_k$ and applying Sherman-Morrison-Woodbury, which simplified somewhat nicely but ultimately wasn't really helpful.
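Concretely (a sketch; the notation $c_j, u_j, \lambda_j$ is mine), writing $v_{k+1} = \sum_{j=1}^d c_j u_j$ in an orthonormal eigenbasis $u_j$ of $G_k$ with eigenvalues $\lambda_j > 0$, the determinant lemma and the Sherman-Morrison formula read $$ \det G_{k+1} = \Big(1 + \sum_{j=1}^d \frac{c_j^2}{\lambda_j}\Big) \det G_k, \qquad G_{k+1}^{-1} = G_k^{-1} - \frac{G_k^{-1} v_{k+1} v_{k+1}^T G_k^{-1}}{1 + v_{k+1}^T G_k^{-1} v_{k+1}}, $$ using $v_{k+1}^T G_k^{-1} v_{k+1} = \sum_j c_j^2/\lambda_j$.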
Empirical evidence suggesting linear growth rate of eigenvalues
I've run a bunch of numerical simulations (sampling up to 500k $t_i$ from multiple real intervals using different probability distributions and computing the resulting determinants, etc., including negative values in some cases just to see what would happen) to try to get a grip on the problem. They suggest that the goal is true and, moreover, that each eigenvalue separately is strictly monotonically increasing and unbounded. All of the eigenvalues appear to be asymptotically linear in $k$, and $\det G_k \in \Theta(k^d)$ (the two observations are consistent, since the determinant is the product of the eigenvalues). However, I haven't managed to prove either of these claims.
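A minimal version of such a simulation (a sketch assuming numpy; I use $[-1,1]$ rather than a large interval purely to keep the moment matrix well conditioned). In the i.i.d. sampling setting of these experiments the linear growth has a natural explanation: $G_k/k \to \mathbb{E}[v v^T]$ by the law of large numbers, so each eigenvalue of $G_k$ is asymptotically $k$ times an eigenvalue of that (positive definite) moment matrix:

```python
import numpy as np

rng = np.random.default_rng(3)
m, d = 4, 5
# i.i.d. uniform samples; [-1, 1] keeps the moment matrix well conditioned
t = rng.uniform(-1.0, 1.0, size=50_000)

ratios = {}
for k in (500, 5_000, 50_000):
    V = np.vander(t[:k], N=d, increasing=True)
    eig = np.linalg.eigvalsh(V.T @ V)    # ascending eigenvalues of G_k
    ratios[k] = eig / k                  # should stabilise if growth is linear

# eig / k approaches the spectrum of the moment matrix E[v v^T]
for k, r in ratios.items():
    print(k, r)
```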
Here's an image showing the means of the ordered eigenvalues for the case $d=5$, where the $t_i$ are uniformly sampled from $[-1000, 1000]$, across 10 realizations and with $k$ ranging from $5$ to $50{,}000$. The indicated "Model" in brown is a simple linear function of $k$.
I feel like this should be a fairly simple result to prove but I'm kind of struggling with it. I'd appreciate any pointers towards a solution.
