I am trying to prove that:
$K = 2$:
$$ \sum_{i < j} p_i \cdot p_j \leq {N \choose 2}\left(\frac{1}{N} \sum_{i} p_i\right)^2$$
$K = 3$:
$$ \sum_{i < j < m} p_i \cdot p_j \cdot p_m \leq {N \choose 3}\left(\frac{1}{N} \sum_{i} p_i\right)^3$$
where $0 \leq p_i \leq 1$ for $i = 1, \dots, N$.
I simulated these inequalities and they appear to hold even for larger $K$.
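For reference, the kind of simulation I ran can be sketched as follows (the helper names `elem_sym` and `holds` are mine, not standard):

```python
import itertools
import math
import random

def elem_sym(p, k):
    """Sum of products over all k-element subsets of p."""
    return sum(math.prod(c) for c in itertools.combinations(p, k))

def holds(p, k, tol=1e-12):
    """Check sum_{i1 < ... < ik} p_i1 * ... * p_ik <= C(N, k) * (mean(p))^k."""
    n = len(p)
    mean = sum(p) / n
    return elem_sym(p, k) <= math.comb(n, k) * mean**k + tol

random.seed(0)
for _ in range(1000):
    n = random.randint(2, 8)
    p = [random.random() for _ in range(n)]
    assert all(holds(p, k) for k in range(2, n + 1))
```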
I thought that proving concavity of the functions $\sum_{i < j} p_i \cdot p_j$ and $\sum_{i < j < m} p_i \cdot p_j \cdot p_m$ would suffice; I computed the Hessian but don't see a way to prove it is negative semi-definite.
I checked and the function is not concave.
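Indeed, for $K = 2$ the Hessian of $\sum_{i < j} x_i x_j$ is the constant matrix with ones off the diagonal and zeros on it, whose eigenvalues are $N-1$ (once) and $-1$ (with multiplicity $N-1$), so it is indefinite. A quick numerical confirmation of this, as a sketch using NumPy:

```python
import numpy as np

# Hessian of g(x) = sum_{i<j} x_i x_j: H[i][j] = 1 for i != j, 0 on the diagonal.
# It does not depend on x, so one eigenvalue computation settles concavity.
n = 5
H = np.ones((n, n)) - np.eye(n)
eigvals = np.linalg.eigvalsh(H)  # sorted ascending
# Mixed signs => H is indefinite => g is neither convex nor concave for n >= 2.
print(eigvals.min(), eigvals.max())
```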
Let $f_0(x_1,\dots,x_N) = 1$ and: $$f_k(x_1,\dots,x_N) = \sum_{i=1}^N x_i\cdot f_{k-1}(x_1,\dots,\hat{x}_i,\dots,x_N) $$
where $\hat{x}_i$ means that $x_i$ does not appear.
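If I read this recursion right, it sums over ordered selections of distinct indices, so $f_k = k!\,e_k$ where $e_k$ is the $k$-th elementary symmetric polynomial; the factor $k!$ is harmless for the maximization. A small check (function names `f` and `e` are mine):

```python
import itertools
import math

def f(k, xs):
    """The recursion from the post: f_0 = 1, f_k = sum_i x_i * f_{k-1}(xs without x_i)."""
    if k == 0:
        return 1.0
    return sum(x * f(k - 1, xs[:i] + xs[i + 1:]) for i, x in enumerate(xs))

def e(k, xs):
    """k-th elementary symmetric polynomial."""
    return sum(math.prod(c) for c in itertools.combinations(xs, k))

xs = [0.3, 0.7, 0.2, 0.9]
for k in range(len(xs) + 1):
    assert abs(f(k, xs) - math.factorial(k) * e(k, xs)) < 1e-9
```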
The claim is proved if we show that, over vectors $\vec{x}$ with $\sum_i x_i = C$, $f_k(x_1,\dots,x_N)$ is maximized at the symmetric point $x_1 = \dots = x_N = \alpha$, where $\alpha = C/N$.
We use Lagrange multipliers:
Let $$L(\vec{x},\lambda) = f_k(\vec{x}) + \lambda\cdot(\sum_i x_i - C)$$
$\frac{\partial L}{\partial x_i} = k\cdot f_{k-1}(x_1,\dots,\hat{x}_i,\dots,x_N) + \lambda$
Thus at a critical point $f_{k-1}(x_1,\dots,\hat{x}_i,\dots,x_N) = C'$, the same constant for every $i$. Since $$f_{k-1}(\dots,\hat{x}_i,\dots) - f_{k-1}(\dots,\hat{x}_j,\dots) = (k-1)(x_j - x_i)\cdot f_{k-2}(\dots,\hat{x}_i,\dots,\hat{x}_j,\dots)$$ and the last factor is positive for positive $x_i$'s, it follows that $x_i = \alpha$ for all $i$.
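The Lagrange argument only locates interior critical points, but a random search over the constraint set is at least consistent with the symmetric point being the maximum. A sketch, sampling nonnegative vectors rescaled to sum $C$ (values and names are my choices):

```python
import itertools
import math
import random

def e(k, xs):
    """k-th elementary symmetric polynomial (equals f_k up to the k! factor)."""
    return sum(math.prod(c) for c in itertools.combinations(xs, k))

random.seed(1)
n, k, C = 6, 3, 2.0
symmetric_value = e(k, [C / n] * n)  # value at x_i = alpha = C/n
for _ in range(2000):
    x = [random.random() for _ in range(n)]
    s = sum(x)
    x = [C * xi / s for xi in x]  # rescale so that sum(x) = C
    assert e(k, x) <= symmetric_value + 1e-12
```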