Prove that for a given $t \in (0,1)$ and $x \in (0,t)$ the function
$$ f(x) = x\log(x) + (t-x)\log(t-x) $$
attains its minimum at $x = \frac{t}{2}$, so that the minimum value is $f(\frac{t}{2}) = t\log(\frac{t}{2})$.
The reason I am asking is that I want to understand why entropy is maximized when the probability distribution is uniform.
The negative of the entropy is $\sum_{i=1}^n p_i\log(p_i)$.
For a more complete answer check: Why is Entropy maximised when the probability distribution is uniform?
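As a quick numerical illustration of that claim (a sketch using the natural logarithm and hypothetical example distributions), the uniform distribution gives a strictly larger entropy than a skewed one on the same number of outcomes:

```python
import math

def entropy(p):
    # H(p) = -sum_i p_i * log(p_i), with the convention 0 * log 0 = 0
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

uniform = [0.25] * 4          # uniform distribution on 4 outcomes
skewed = [0.7, 0.1, 0.1, 0.1] # a non-uniform distribution on 4 outcomes

print(entropy(uniform))  # equals log(4), the maximum for 4 outcomes
print(entropy(skewed))   # strictly smaller
```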
$x=\frac{t}{2}$ is the only critical point of $f$ in $(0;t)$:
$$ f'=\log(x)+1-\log(t-x)-1=\log\frac{x}{t-x} $$
so $f'=0$ means $x=t-x$.
The second derivative $f''=\frac1x+\frac1{t-x}>0$ on $(0;t)$, so $f$ is strictly convex there and $x=\frac{t}{2}$ is its unique (global) minimum.
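The argument above can be checked numerically: a minimal sketch (with an arbitrarily chosen $t = 0.8$) that evaluates $f$ on a grid of interior points and confirms the minimizer sits at $t/2$ with value $t\log(t/2)$:

```python
import math

def f(x, t):
    # f(x) = x log(x) + (t - x) log(t - x) on the open interval (0, t)
    return x * math.log(x) + (t - x) * math.log(t - x)

t = 0.8
# Sample f at 999 evenly spaced interior points and pick the minimizer.
xs = [t * k / 1000 for k in range(1, 1000)]
x_min = min(xs, key=lambda x: f(x, t))

print(x_min)        # close to t/2 = 0.4
print(f(t / 2, t))  # matches t * log(t/2)
```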