Let $Y = X_1 + ... + X_n$, where the $X_i$ are independent Bernoulli random variables taking values in $\{0, 1\}$, with $P(X_i = 1) = p_i$.
We can prove that: $$E(Y) = \sum_{i=1}^n p_i \text{ and } \text{Var}(Y) = \sum_{i=1}^n p_i(1-p_i)$$
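As a quick sanity check (not part of the proof), the two closed-form expressions can be verified numerically by enumerating all $2^n$ outcomes; the probabilities below are arbitrary example values:

```python
import itertools

# Arbitrary example success probabilities for n = 4 Bernoulli variables.
p = [0.1, 0.3, 0.6, 0.8]
n = len(p)

# Enumerate all 2^n outcomes of (X_1, ..., X_n); by independence, each
# outcome's probability is the product of the per-coordinate probabilities.
e_y = 0.0   # accumulates E(Y)
e_y2 = 0.0  # accumulates E(Y^2)
for xs in itertools.product([0, 1], repeat=n):
    prob = 1.0
    for x_i, p_i in zip(xs, p):
        prob *= p_i if x_i == 1 else 1 - p_i
    y = sum(xs)
    e_y += prob * y
    e_y2 += prob * y * y

var_y = e_y2 - e_y ** 2

# The exact moments match the closed-form sums.
assert abs(e_y - sum(p)) < 1e-12
assert abs(var_y - sum(p_i * (1 - p_i) for p_i in p)) < 1e-12
```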
Keeping $E(Y)$ fixed, I am trying to prove that $\text{Var}(Y)$ is maximized when $p_1 = ... = p_n$.
Write $$\text{Var}(Y) = \sum_{i=1}^n p_i(1-p_i) = \sum_{i=1}^n p_i - \sum_{i=1}^n p_i^2 = E(Y) - \sum_{i=1}^n p_i^2.$$ So we want to minimize $\sum_{i=1}^n p_i^2$ without changing $\sum_{i=1}^n p_i$. I'm not sure where to go from here.
I would appreciate a small hint to point me in the right direction. Thank you!
By the Cauchy–Schwarz inequality, $$\left(\sum_{i=1}^n a_ib_i\right)^2 \leq \sum_{i=1}^n a_i^2 \sum_{i=1}^n b_i^2.$$ Choosing $a_i = p_i$ and $b_i = 1$ gives $$\left(\sum_{i=1}^n p_i\right)^2 \leq n\sum_{i=1}^n p_i^2 \Rightarrow \frac{E(Y)^2}{n} \leq \sum_{i=1}^n p_i^2.$$ If we can exhibit a choice of $p_1, ..., p_n$ that attains equality, then that choice minimizes $\sum_{i=1}^n p_i^2$. Suppose $p_1 = ... = p_n$; then $$E(Y) = \sum_{i=1}^n p_i \Rightarrow p_1 = ... = p_n = \frac{E(Y)}{n},$$ and $\sum_{i=1}^n p_i^2 = \sum_{i=1}^n \left(\frac{E(Y)}{n}\right)^2 = \frac{E(Y)^2}{n}$, so the lower bound is attained. Hence the equal choice minimizes $\sum_{i=1}^n p_i^2$ subject to fixed $\sum_{i=1}^n p_i$, and therefore maximizes $\text{Var}(Y) = E(Y) - \sum_{i=1}^n p_i^2$.
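The conclusion is easy to check numerically as well. The sketch below (with hypothetical probability vectors, all sharing the same sum so that $E(Y)$ is fixed at 2 with $n = 4$) confirms that the equal vector gives the largest variance:

```python
# Candidate probability vectors, each with sum 2.0 (fixed E(Y), n = 4).
candidates = [
    [0.5, 0.5, 0.5, 0.5],   # equal probabilities E(Y)/n
    [0.2, 0.4, 0.6, 0.8],
    [0.1, 0.9, 0.3, 0.7],
    [1.0, 1.0, 0.0, 0.0],   # extreme case: deterministic, zero variance
]

def variance(p):
    """Var(Y) = sum of p_i * (1 - p_i) for independent Bernoulli summands."""
    return sum(p_i * (1 - p_i) for p_i in p)

for p in candidates:
    assert abs(sum(p) - 2.0) < 1e-12  # E(Y) is the same for every candidate

variances = [variance(p) for p in candidates]

# The equal vector attains the maximum variance E(Y) - E(Y)^2 / n = 2 - 1 = 1.
assert variances[0] >= max(variances)
```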