We have a collection of $a$ arbitrary-dimensional vectors $\{\vec{u_m}\}$, each with magnitude $\|\vec{u_m}\| = \sqrt{u_m}$. We are interested in maximizing the sum of the squares of the dot products of every vector with every other vector, i.e. \begin{equation} \sum_{m=1}^{a} \sum_{n=1}^{a} (\vec{u_m}\cdot \vec{u_n})^2 = \sum_{m=1}^{a} \sum_{n=1}^{a} u_m u_n \cos^2 \phi_{m,n} \end{equation}
under the condition that for every $m$: \begin{equation} \sum_{n=1}^{a} \cos^2 \phi_{m,n} \leq s \end{equation} where $s$ is an integer between $1$ and $a$.
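(To make the formulation concrete, here is a small sketch of how I compute the objective and the per-vector constraint sums from a set of vectors; the helper name and the example magnitudes are my own, not part of the problem statement.)

```python
import numpy as np

def objective_and_constraints(V):
    """For the rows of V taken as the vectors u_m, return the objective
    sum_{m,n} (u_m . u_n)^2 and, for each m, the constraint sum
    sum_n cos^2 phi_{m,n}.  (Illustrative helper, not part of the problem.)"""
    G = V @ V.T                              # Gram matrix of dot products
    norms2 = np.diag(G)                      # u_m = ||u_m||^2
    cos2 = G**2 / np.outer(norms2, norms2)   # cos^2 phi_{m,n}
    return np.sum(G**2), cos2.sum(axis=1)
```

For instance, two orthogonal vectors with squared magnitudes $2$ and $3$ give an objective of $2^2 + 3^2 = 13$ with both constraint sums equal to $1$.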
As written, this problem seeks to maximize the dot-product-squared sum for any $s$, given the $a$ vector magnitudes $\{\sqrt{u_m}\}$, with the $\cos^2$ sum as a constraint.
Naturally, when $s=a$, all vectors may be parallel and the maximum is $(\sum_m u_m)^2$. When $s=1$, all vectors must be mutually orthogonal and the only attainable value is $\sum_m u_m^2$.
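These two endpoint values are easy to check numerically; a minimal sketch (the magnitudes here are arbitrary examples of my own choosing):

```python
import numpy as np

u = np.array([4.0, 3.0, 2.0, 1.0])   # example squared magnitudes u_m

# s = a: all vectors parallel along a single axis
V_par = np.sqrt(u)[:, None]          # shape (a, 1)
obj_par = np.sum((V_par @ V_par.T)**2)
assert np.isclose(obj_par, u.sum()**2)     # (sum_m u_m)^2 = 100

# s = 1: mutually orthogonal vectors, one coordinate axis each
V_orth = np.diag(np.sqrt(u))
obj_orth = np.sum((V_orth @ V_orth.T)**2)
assert np.isclose(obj_orth, np.sum(u**2))  # sum_m u_m^2 = 30
```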
The vector formulation of this problem naturally enforces a number of relations among the angles, e.g. $\cos^2 \phi_{m,n} = \cos^2 \phi_{n,m}$, or the triangle-like inequality $|\phi_{m,n}| + |\phi_{l,n}| \geq |\phi_{m,l}|$. At a superficial level, the sum can be no greater than $(\sum_{m=1}^{a} u_m)\cdot(\sum_{n=1}^{s} u_n)$ (assuming $u_1 \geq u_2 \geq \dots \geq u_a$), but this far overestimates the maximum in most cases.
I've never seen a problem with this kind of formulation and would appreciate any insight into the problem itself, or pointers to problems that treat vectors in a similar way. This is an optimization problem, but a nagging part of me thinks there is a closed-form solution. I have proposed one for a closely related problem, and it hasn't failed yet, but I can't prove it to be the solution for either problem.
My proposal is \begin{equation} \sum_{g=1}^{a/s}\Big(\sum_{m=s(g-1)+1}^{gs} u_m\Big)^2 \end{equation} where the $u_m$ are sorted in decreasing order and, for simplicity, $s$ is assumed to divide $a$.
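For what it's worth, the proposed value is attained (and the constraint satisfied with equality) by a block construction: sort the magnitudes, make each consecutive group of $s$ vectors parallel, and make distinct groups mutually orthogonal. A quick sketch, assuming $s$ divides $a$ (function names are mine):

```python
import numpy as np

def proposed_value(u, s):
    # Conjectured maximum: sum over groups of s consecutive sorted
    # magnitudes, each group's total squared
    u = np.sort(u)[::-1]
    return sum(u[g*s:(g+1)*s].sum()**2 for g in range(len(u) // s))

def block_vectors(u, s):
    # Vectors within each block of s are parallel (share one axis);
    # distinct blocks are mutually orthogonal
    u = np.sort(u)[::-1]
    a = len(u)
    V = np.zeros((a, a // s))
    for m in range(a):
        V[m, m // s] = np.sqrt(u[m])
    return V

u = np.array([4.0, 3.0, 2.0, 1.0])
V = block_vectors(u, 2)
G = V @ V.T
cos2 = G**2 / np.outer(np.diag(G), np.diag(G))
assert np.all(cos2.sum(axis=1) <= 2 + 1e-9)            # constraint holds for s=2
assert np.isclose(np.sum(G**2), proposed_value(u, 2))  # attains (4+3)^2 + (2+1)^2 = 58
```

Each vector sees $\cos^2 = 1$ with the $s$ vectors in its own block (itself included) and $0$ with all others, so every constraint sum is exactly $s$.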