We have a real symmetric positive definite matrix $M_0 \in \Bbb R^{d \times d}$ (positive definiteness ensures that $M_t^{-1}$ below exists) and a finite set $K \subset \Bbb R^d$ of $|K|$ vectors that spans $\Bbb R^d$. At each round $t$, we choose $$k_t = \arg\max_{k \in K} k^T M_t^{-1}k,$$ which, by the matrix determinant lemma $\det(M_t + kk^T) = \det(M_t)\,(1 + k^T M_t^{-1} k)$, is equivalent to choosing $$k_t = \arg\max_{k \in K} \det\left(M_t + kk^T\right),$$ and then update $$M_{t+1} = M_t + k_tk_t^T.$$ My question: if we keep repeating this procedure as $t$ grows, does the minimum eigenvalue of the resulting matrix $$M_t = M_0 + \sum_{\tau=1}^{t-1}k_{\tau}k_{\tau}^{T}$$ grow linearly in $t$? That is, do there exist constants $c>0$ and $t_0>0$ such that $\lambda_{\min}(M_t) \geq ct$ for all $t\geq t_0$?
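For intuition, here is a small numerical sketch of the greedy rule in Python/NumPy. The particular choices of $d$, the horizon $T$, $M_0 = I$, and a random finite $K$ of unit vectors are all my own illustrative assumptions, not part of the question:

```python
import numpy as np

rng = np.random.default_rng(0)
d, T = 4, 200
# Illustrative action set K: 20 random unit vectors, which span R^d a.s.
K = rng.standard_normal((20, d))
K /= np.linalg.norm(K, axis=1, keepdims=True)

M = np.eye(d)  # M_0 = I: positive definite, so M^{-1} exists
min_eigs = []
for t in range(T):
    Minv = np.linalg.inv(M)
    # Greedy choice: k_t = argmax_{k in K} k^T M^{-1} k
    # (equivalent to maximizing det(M + kk^T) by the determinant lemma)
    scores = np.einsum('ij,jk,ik->i', K, Minv, K)
    k = K[np.argmax(scores)]
    M = M + np.outer(k, k)
    # eigvalsh returns eigenvalues in ascending order
    min_eigs.append(np.linalg.eigvalsh(M)[0])

# Track the empirical slope lambda_min(M_T) / T
print(min_eigs[-1] / T)
```

In this toy run $\lambda_{\min}(M_t)$ does appear to grow roughly linearly, but of course a simulation is not a proof, and the slope depends on the geometry of $K$.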
Further, suppose the matrix is random: at every time $t$ we also add a random rank-one term $x_tx^T_t$, where $x_t$ is a random $d$-dimensional vector with $\|x_t\|_2=1$, so that $$M_t=M_0+\sum_{\tau=1}^{t-1}x_{\tau}x_{\tau}^T+\sum_{\tau=1}^{t-1}k_{\tau}k_{\tau}^{T},$$ while $k_t$ is still chosen as above at each time $t$. Can we still conclude that, with high probability, $\lambda_{\min}(M_t) \geq ct$ for some constant $c>0$ and all $t\geq t_0$, for some $t_0>0$? Alternatively, can we at least show that the minimum eigenvalue of the expectation of $k_tk_t^T$ is positive, i.e., $\lambda_{\min}(E[k_tk_t^T])>0$?
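The randomized variant can be sketched the same way. Again, $d$, $T$, $M_0 = I$, the finite set $K$, and drawing each $x_t$ uniformly from the unit sphere are illustrative assumptions of mine (the question does not fix a distribution for $x_t$):

```python
import numpy as np

rng = np.random.default_rng(1)
d, T = 4, 200
# Illustrative finite action set K of random unit vectors
K = rng.standard_normal((20, d))
K /= np.linalg.norm(K, axis=1, keepdims=True)

M = np.eye(d)  # M_0 = I
min_eigs = []
for t in range(T):
    Minv = np.linalg.inv(M)
    # Greedy k_t as before
    k = K[np.argmax(np.einsum('ij,jk,ik->i', K, Minv, K))]
    # Random unit vector x_t (here: uniform on the sphere, an assumed model)
    x = rng.standard_normal(d)
    x /= np.linalg.norm(x)
    M = M + np.outer(k, k) + np.outer(x, x)
    min_eigs.append(np.linalg.eigvalsh(M)[0])

print(min_eigs[-1] / T)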