My professor claimed that $$k\sum_{i=1}^k v_i v_i^T-\Big(\sum_{i=1}^k v_i\Big)\Big(\sum_{i=1}^k v_i^T\Big)\succeq 0,$$ holds for any family of vectors $\{v_1,\dots,v_k\}$, and can be shown using the Cauchy Schwarz inequality on the quadratic form.
I'm unsure whether it's necessary to assume that $k$ is a positive integer and that the $v_i$ are vectors of ones and zeros satisfying $\sum_{i=1}^k v_i=\vec{1}$. I don't think we need to assume this, given the claim that it holds for any family of vectors.
In trying to prove that the above is positive semidefinite, I get the quadratic form $$\begin{align} k\sum_{i=1}^k x^T v_i v_i^T x-x^T \Big(\sum_{i=1}^k v_i\Big)\Big(\sum_{i=1}^k v_i^T \Big)x &= k\sum_{i=1}^k x^T v_i v_i^T x-\Big|\Big\langle \sum_{i=1}^k v_i, x\Big\rangle\Big|^2\\ &\geq k\sum_{i=1}^k x^T v_i v_i^T x-\|x\|^2 \bigg\|\sum_{i=1}^k v_i\bigg\|^2\\ &= k\sum_{i=1}^k x^T v_i v_i^T x- n\,x^Tx\\ &= x^T\Big(k\sum_{i=1}^k v_i v_i^T-n\,\mathbb{I}\Big)x,\\ \end{align}$$ where $n\geq k$ and I used the assumption $\sum_{i=1}^k v_i=\vec{1}$, so that $\big\|\sum_{i=1}^k v_i\big\|^2=n$. I do not think the matrix in parentheses is positive semidefinite, since its diagonal entries can be negative. Can someone help me prove my professor's claim?
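As a sanity check, the claim can be tested numerically for random vectors. A minimal NumPy sketch (the sizes $k=5$, $n=7$ and the Gaussian entries are arbitrary illustrative choices, not part of the original statement):

```python
import numpy as np

rng = np.random.default_rng(0)
k, n = 5, 7                       # arbitrary illustrative sizes
V = rng.standard_normal((k, n))   # row i is the (column) vector v_i

# M = k * sum_i v_i v_i^T - (sum_i v_i)(sum_i v_i)^T
s = V.sum(axis=0)
M = k * (V.T @ V) - np.outer(s, s)

# M is symmetric, so eigvalsh applies; if the claim holds,
# every eigenvalue is nonnegative up to floating-point error
eigs = np.linalg.eigvalsh(M)
print(eigs.min())
```

Running this with various seeds and sizes, the smallest eigenvalue stays at or above zero (up to floating-point error), consistent with the claim holding for arbitrary real vectors.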
I guess that the vectors are row vectors in $\mathbb{R}^n$. If the vectors are column vectors, take the transpose of the difference and then follow the proof below to show that this transpose is positive semidefinite. Since $A$ is positive semidefinite if and only if $A^{T}$ is, the desired result then follows in the case of column vectors too. In the case of row vectors, the inequality is equivalent to proving that
$$k\sum_{i=1}^k||u_i||^2\geq \left|\left|\sum_{i=1}^ku_i\right|\right|^2.\ \ \ (1)$$
Indeed, note that for the difference in your question, the product $x(\text{this difference})x^{T}$ is equal to
$$||x||^2\left(k\sum_{i=1}^k||u_i||^2-\left|\left|\sum_{i=1}^ku_i\right|\right|^2\right).$$
Examining $(1)$ coordinate by coordinate, we see that it suffices to show that
$$k\sum_{i=1}^ka_i^2\geq \left(\sum_{i=1}^ka_i\right)^2\ \ \ (2)$$
for any $k$-tuple $(a_1,\ldots,a_k)$ of real numbers. Indeed, then we will have that
$$k\sum_{i=1}^ku_{ij}^2\geq \left(\sum_{i=1}^ku_{ij}\right)^2,\ \forall j=1,\cdots,n$$
and $(1)$ will follow by summing over all $j$. Now, $(2)$ can be proved easily by applying Cauchy–Schwarz to the vectors $(1,\ldots,1)$ and $(a_1,\ldots,a_k)$. Another proof of $(2)$ can be obtained by applying Jensen's inequality to the convex function $x\mapsto x^2$.
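Both steps above are also easy to verify numerically. A small NumPy sketch (random Gaussian rows; the sizes $k=6$, $n=4$ are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
k, n = 6, 4                       # arbitrary illustrative sizes
U = rng.standard_normal((k, n))   # row i is u_i

# (2) in each coordinate j: k * sum_i u_ij^2 >= (sum_i u_ij)^2
per_coord = k * (U**2).sum(axis=0) - U.sum(axis=0)**2
print((per_coord >= 0).all())

# Summing over j gives (1): k * sum_i ||u_i||^2 >= ||sum_i u_i||^2
lhs = k * (np.linalg.norm(U, axis=1)**2).sum()
rhs = np.linalg.norm(U.sum(axis=0))**2
print(lhs >= rhs)
```

Equality in $(2)$ holds only when all the $a_i$ are equal, so for generic random data the margin is strictly positive.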