I would like to come up with a measure of how diverse the population is. I have 3 groups of people: White, Black, Chinese. Let $\pi_i$ denote the proportions of the total population e.g. $$\pi_\text{white}=\frac{\text{white}}{\text{white}+\text{black}+\text{chinese}}$$
I want a measure of how diverse a population is. This means that if $$\pi_\text{white}=\pi_\text{black}=\pi_\text{chinese}=\frac{1}{3}$$ we set $d=1$ (maximal diversity).
However, if $$\pi_\text{white}=1,\quad\pi_\text{black}=\pi_\text{chinese}=0$$ the diversity is $d=0$.
Intermediate cases can be interpolated smoothly in any reasonable way.
Edit: I'm thinking of a function $f$ that peaks at $f(1/3)$ and satisfies $f(0)=f(1)=0$. Then I can define the diversity measure as $\sum_i f(\pi_i)$. What would such a function be? Something like $f(x)=-x(x-1)$, but with the maximum in a different place. I'd also like to generalise this to $n$ categories.
Sounds like you want (information-theoretic) entropy:
$$H = -\sum_i \pi_i \log(\pi_i)$$
That has a maximum when all of the $\pi_i$'s are equal, and approaches zero as one of the $\pi_i$s heads to $1$ and the others head towards $0$. The maximum is $\log(n)$, so you'll have to divide by that to get the normalization you want.
(Notes: Since you'll be normalizing it doesn't matter what base you use for your $\log$s, though it's traditionally log base 2 in info. theory. Also, if one of the $\pi_i$ equals $0$ it contributes $0$ to the sum, which is consistent with $\lim_{x \to 0^+} x\log x = 0$.)
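As a concrete illustration, here is a minimal Python sketch of the normalized entropy measure described above (the function name `diversity` and its structure are my own choices, not from any standard library):

```python
import math

def diversity(proportions, base=2):
    """Normalized Shannon entropy of a list of proportions.

    Returns 1 when all groups are equally represented and 0 when a
    single group makes up the entire population.
    """
    n = len(proportions)
    if n < 2:
        return 0.0  # with a single category there is nothing to diversify
    # Terms with p == 0 contribute 0, matching lim_{x->0+} x log x = 0.
    h = -sum(p * math.log(p, base) for p in proportions if p > 0)
    # Divide by the maximum possible entropy, log(n), to normalize to [0, 1].
    return h / math.log(n, base)

print(diversity([1/3, 1/3, 1/3]))   # ≈ 1.0 (maximal diversity)
print(diversity([1.0, 0.0, 0.0]))   # 0.0 (no diversity)
print(diversity([0.5, 0.25, 0.25])) # strictly between 0 and 1
```

Because the result is normalized by $\log(n)$, the choice of `base` cancels out, as noted above.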