It is known that the entropy $$H=-\sum_{i=1}^{n}p_i \log_2(p_i)$$ is maximized when $p_i=1/n$.
However, this is under the assumption that $\sum_{i=1}^{n}p_i=1$. Does this still hold if the probabilities sum to some $0<P<1$? That is, if $$\sum_{i=1}^{n}p_i=P.$$ This situation arises because my goal is to optimize a Wordle solver, and for that I need an upper bound on the entropy when part of the probability mass has already been accounted for.
No, but you can reduce it to the usual case. Set $q_i = {p_i \over P}$, so that $\sum q_i = 1$. Then $$H = -\sum p_i \log_2(p_i) = -\sum P q_i \log_2(P q_i) = -P \sum q_i \log_2(q_i) - P \log_2(P).$$ Since $-P\log_2(P)$ is a constant, $H$ is maximized exactly when the entropy of the $q_i$ is, i.e. when $q_i = {p_i \over P} = {1\over n}$. Hence the maximum is attained at $p_i = P/n$, with value $$H_{\max} = P\log_2(n) - P\log_2(P) = P\log_2\!\left(\frac{n}{P}\right).$$
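A quick numerical sanity check of this bound (a Python sketch; the values of `n` and `P` are arbitrary examples, and the random allocations are just probes, not a proof):

```python
import math
import random

def entropy(ps):
    """Shannon entropy (base 2) of a sub-probability vector (sum may be < 1)."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

n, P = 5, 0.3

# The uniform allocation p_i = P/n should attain the bound P * log2(n/P).
uniform = [P / n] * n
bound = P * math.log2(n / P)
assert abs(entropy(uniform) - bound) < 1e-12

# Any other allocation summing to P should stay at or below the bound.
random.seed(0)
for _ in range(1000):
    weights = [random.random() for _ in range(n)]
    total = sum(weights)
    ps = [P * w / total for w in weights]  # rescale so sum(ps) == P
    assert entropy(ps) <= bound + 1e-12
```

Within floating-point tolerance, the uniform allocation matches $P\log_2(n/P)$ and every random allocation falls below it.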