Define the following function:
$$\Delta(p,y) = \sum_{i = 1}^n p_i y^T M_i^T M_i y - \frac{1}{n} \sum_{j = 1}^n y^T M_j^T M_j y,$$ which can equivalently be written as $$\Delta(p,y) = y^T \left(\sum_{i = 1}^n p_i M_i^T M_i - \frac{1}{n} \sum_{j = 1}^n M_j^T M_j\right) y,$$
where $p$ lives on the $n$-simplex ($p_i > 0$, $\sum_i p_i = 1$). Each matrix $M_i^T M_i$ is a symmetric positive semidefinite $d \times d$ matrix, and therefore has non-negative eigenvalues. The vector $y \in \mathbb{R}^d$ is real-valued, and we may assume $\| y \|_2 \leq \Lambda$. Is it then possible to compute, approximate, or bound
$$\max_{y} \min_{p} \Delta(p,y)$$
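For concreteness, here is a small numerical sketch of $\Delta$ (a NumPy illustration with random matrices standing in for the $M_i$; the dimensions are arbitrary). One thing it makes visible immediately: at the uniform distribution $\bar p = (1/n, \ldots, 1/n)$ the two sums cancel, so $\Delta(\bar p, y) = 0$ for every $y$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 4, 3
Ms = rng.standard_normal((n, d, d))   # stand-ins for the M_i
Gs = [M.T @ M for M in Ms]            # Gram matrices M_i^T M_i (symmetric PSD)
Gbar = sum(Gs) / n                    # (1/n) * sum_j M_j^T M_j

def delta(p, y):
    # Delta(p, y) = y^T ( sum_i p_i M_i^T M_i - (1/n) sum_j M_j^T M_j ) y
    A = sum(pi * G for pi, G in zip(p, Gs)) - Gbar
    return y @ A @ y

# At the uniform distribution the matrix in parentheses vanishes,
# so Delta(p_uniform, y) = 0 for any y (up to floating-point error).
p_uniform = np.full(n, 1.0 / n)
y = rng.standard_normal(d)
print(delta(p_uniform, y))
```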
I was thinking of applying a minimax theorem (such as this one); however, the maximization with respect to $y$ is generally not concave, since the difference of the matrices may be indefinite, so it seems I cannot apply the theorem (the minimization in $p$ is convex, though).
I'm also particularly interested in the conditions under which $\max_{y} \min_{p} \Delta(p,y) < 0$ or $\max_{y} \min_{p} \Delta(p,y) > 0$. Any other conditions under which I could compute this quantity (would mixed strategies help?) would also be very helpful.
Any help would be greatly appreciated!
Since $\bar p=\big(\frac1n,\frac1n,\ldots, \frac1n\big)$ belongs to the probability simplex, we have, for any $y$, the estimate $$ \min_{p}\Delta(p,y)\le\Delta(\bar p,y)=0, $$ which leads directly to $$ \max_y\min_{p}\Delta(p,y)\le 0. $$ Now $y=0$ makes $\min_p\Delta(p,y)$ equal to zero (since $\Delta(p,0)=0$ for every $p$), which is therefore the maximum, hence $$ \max_y\min_{p}\Delta(p,y)=0. $$
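A quick numerical sanity check of this argument (a NumPy sketch with random Gram matrices): since $\Delta(p,y)$ is linear in $p$, the inner minimum over the simplex is attained at a vertex, i.e. it equals $\min_i y^T M_i^T M_i y - \frac1n \sum_j y^T M_j^T M_j y$, which is $\min - \text{mean} \le 0$ for every $y$, with equality at $y = 0$.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 5, 4
Ms = rng.standard_normal((n, d, d))   # stand-ins for the M_i
Gs = [M.T @ M for M in Ms]            # Gram matrices M_i^T M_i

def inner_min(y):
    # Delta(p, y) is linear in p, so the minimum over the simplex
    # sits at a vertex: min_i y^T G_i y  minus the mean over j.
    quads = np.array([y @ G @ y for G in Gs])
    return quads.min() - quads.mean()

# min <= mean, so the inner minimum is <= 0 for every y ...
ys = rng.standard_normal((1000, d))
print(max(inner_min(y) for y in ys))   # never positive

# ... and y = 0 attains exactly 0, so max_y min_p Delta = 0.
print(inner_min(np.zeros(d)))
```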