Let $\bf{A}$ denote a symmetric, positive semidefinite $N\times N$ matrix with ones on the diagonal. Further, let $w_i\geq 0$, $i=1,\dots,N$, be weights such that $\sum_{i=1}^N w_i=2$. Lastly, let $$ f({\bf{A}},{\bf{w}}) = \big|(N {\bf{w}} - {\bf{u}})^T{\bf{A}}{\bf{u}}\big|, $$ where ${\bf{w}}=(w_1,\dots,w_N)'$ and ${\bf{u}}=(1,\dots,1)'$. Note that ${\bf{A}}{\bf{u}}$ is the vector of row sums of $\bf{A}$, so I'm trying to maximize the absolute value of a weighted sum of the row sums of $\bf{A}$.
I'm interested in the following two problems (maintaining the restrictions on $\bf{A}$ and $\bf{w}$).
- Given some $\bf{A}$, $$ \arg\max_{\bf{w}}f({\bf{A}},{\bf{w}}). $$
- Now let $\bf{A}$ also be free to vary. Then, plugging in the maximizer ${\bf{w}}({\bf{A}})$ from the first problem, I suppose we can write $$ \arg\max_{\bf{A}}f({\bf{A}},{\bf{w}}({\bf{A}})). $$
Regarding 1, after some more inspection I'm quite convinced that my initial guess was right: the optimal weights are either $0$ or of the form $2/L$ (shared among $L$ "equally good" rows). However, it's not yet clear how to assign these weights; the assignment seems to depend on more than just the ordering of the (absolute) row sums.
As for 2, my conjecture is that the maximal value of $f$ is $N^2$, attained at $\bf{A}={\bf{u}\bf{u}}'$ and any $\bf{w}$ satisfying the restrictions. However, I couldn't find a way to show this formally. I tried induction, relating the problem to the eigenvalues of $\bf{A}$, etc.
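The attainment part is easy to check: at ${\bf{A}}={\bf{u}}{\bf{u}}'$ we have ${\bf{A}}{\bf{u}}=N{\bf{u}}$ and $(N{\bf{w}}-{\bf{u}})^T{\bf{u}}=2N-N=N$, so $f=N^2$ for every admissible $\bf{w}$. A quick numerical sanity check of this (a Python/NumPy sketch of my own; it verifies attainment only, not maximality):

```python
import numpy as np

def f(A, w):
    """f(A, w) = |(N*w - u)' A u|, with u the all-ones vector."""
    N = A.shape[0]
    u = np.ones(N)
    return abs((N * w - u) @ A @ u)

rng = np.random.default_rng(0)
N = 6
A = np.ones((N, N))  # A = u u': symmetric, PSD, ones on the diagonal

for _ in range(100):
    w = rng.random(N)
    w *= 2.0 / w.sum()  # admissible weights: w_i >= 0, sum_i w_i = 2
    assert np.isclose(f(A, w), N**2)  # f = N^2 regardless of w
```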
Regarding the first part, define $v = Au$. Since $f$ is an absolute value, maximizing it over $w$ amounts to maximizing two linear functions: \begin{align} \max_w~(Nw-u)^Tv \quad\text{and}\quad \max_w~(u-Nw)^Tv. \end{align} The feasible set $\{w : w_i\geq 0,\ \sum_i w_i = 2\}$ is a scaled simplex, and a linear function attains its maximum over a simplex at a vertex. Hence the first problem is solved by setting $w_i = 2$ for a single index $i$ at which $v_i$ is maximal and the remaining weights to zero (or any convex combination of such points when there are ties, which is exactly where the $2/L$ weights come from). Likewise, the second is solved by putting $w_i = 2$ on an index at which $v_i$ is minimal. Therefore \begin{align} \max_w~f(A,w) = \max\Big(2N\max_i v_i-\sum_i v_i,\ \sum_i v_i -2N\min_i v_i\Big). \end{align}
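For what it's worth, this closed form is easy to verify numerically. The sketch below (Python/NumPy, construction and names my own) builds a random correlation matrix as an admissible $A$, checks that no randomly sampled admissible $w$ beats the formula, and checks that a vertex $w = 2e_i$ attains it:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 5

# Random symmetric PSD matrix with unit diagonal (a correlation matrix)
X = rng.standard_normal((N, 20))
A = X @ X.T
d = np.sqrt(np.diag(A))
A /= np.outer(d, d)

u = np.ones(N)
v = A @ u  # vector of row sums of A

closed_form = max(2 * N * v.max() - v.sum(), v.sum() - 2 * N * v.min())

# No randomly sampled admissible w exceeds the closed form ...
for _ in range(5000):
    w = rng.random(N)
    w *= 2.0 / w.sum()
    assert abs((N * w - u) @ v) <= closed_form + 1e-9

# ... and it is attained at some vertex w = 2*e_i of the feasible set
vertex_best = max(abs((2 * N * np.eye(N)[i] - u) @ v) for i in range(N))
assert np.isclose(vertex_best, closed_form)
```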
As for the second part, I could not prove it in general, but it is not difficult in the case where the matrix $A$ has rank one. Writing $A = \lambda aa^T$ with $\|a\|=1$, the unit-diagonal condition $\lambda a_i^2 = 1$ forces all $a_i^2$ to be equal, so $a_i = \pm \frac{1}{\sqrt{N}}$ and $\lambda = N$. With this, it is easy to prove that the maximum happens when $a_i = \frac{1}{\sqrt{N}}$ for all $i$. Unfortunately, characterizing the eigenvectors is not easy once the rank exceeds one.