Can the sum of squared probabilities, i.e. $\sum_c p_c^2$, be considered to be a measure of uncertainty? If so, does it have a mathematical name or theory?
The form is similar to Shannon's entropy, $-\sum_c p_c \log(p_c)$. It satisfies $\frac{1}{C} \leq \sum_c p_c^2 \leq 1$, partly according to: Is there some general lower bound for sum of squared probabilities?, which shows that the lower bound is attained at $\vec{p}=\frac{1}{C}(1,\cdots,1)$. $\sum_c p_c^2$ never exceeds 1, since $\sum_c p_c=1$ and $p_c^2 \leq p_c$ ($\because 0 \leq p_c \leq 1$). The upper bound is attained at a point mass, $\vec{p}=(0,\cdots,0,1,0,\cdots,0)$. Taken together, it seems that $\sum_c p_c^2$ can be considered a measure of uncertainty (strictly, of concentration: it is largest when the outcome is certain and smallest for the uniform distribution).
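As a quick numeric sanity check of these bounds (a sketch using NumPy, sampling random probability vectors from a Dirichlet distribution):

```python
import numpy as np

rng = np.random.default_rng(0)
C = 5

# Uniform distribution attains the lower bound 1/C.
u = np.full(C, 1.0 / C)
assert np.isclose(np.sum(u**2), 1.0 / C)

# A point mass attains the upper bound 1.
e = np.zeros(C)
e[2] = 1.0
assert np.isclose(np.sum(e**2), 1.0)

# Random distributions on C outcomes stay within [1/C, 1].
for _ in range(1000):
    p = rng.dirichlet(np.ones(C))
    s = np.sum(p**2)
    assert 1.0 / C - 1e-12 <= s <= 1.0 + 1e-12
```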
I would appreciate it if anyone could provide relevant names or theories. Best,
It is commonly referred to as the "collision probability," and, by taking the negative logarithm, you get the collision entropy (which is the Rényi entropy of order 2).
To see why this is called the collision probability (which is the squared $\ell_2$ norm of the probability mass function), note that if you have two i.i.d. random variables $X,Y$ with probability mass function $p$, then $$ \mathbb{P}\{X=Y\} = \sum_{c} \mathbb{P}\{X=c, Y=c\} = \sum_{c} \mathbb{P}\{X=c\}\mathbb{P}\{Y=c\} = \sum_{c} p_c^2 = \|p\|_2^2 $$
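The identity $\mathbb{P}\{X=Y\} = \|p\|_2^2$ is easy to check by simulation (a sketch with NumPy, using an arbitrary example distribution $p=(0.5, 0.3, 0.2)$):

```python
import numpy as np

rng = np.random.default_rng(42)
p = np.array([0.5, 0.3, 0.2])

# Exact collision probability: ||p||_2^2 = 0.25 + 0.09 + 0.04 = 0.38
exact = np.sum(p**2)

# Monte Carlo estimate of P{X = Y} for i.i.d. X, Y ~ p
n = 200_000
x = rng.choice(len(p), size=n, p=p)
y = rng.choice(len(p), size=n, p=p)
estimate = np.mean(x == y)

# The empirical collision frequency should be close to ||p||_2^2.
assert abs(estimate - exact) < 0.01
```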
As you mentioned, it is minimized by the uniform distribution on the domain, since, if $p$ has a domain of size $n$ and we denote by $u$ the uniform distribution on this domain, then $$ 0 \leq \|p-u\|_2^2 = \sum_c (p_c-u_c)^2 = \sum_c p_c^2 - \frac{2}{n}\sum_{c} p_c + \frac{1}{n} = \sum_c p_c^2 - \frac{2}{n} + \frac{1}{n} = \|p\|_2^2 - \frac{1}{n} $$ and so $\|p\|_2^2 \geq \frac{1}{n} = \|u\|_2^2$. (Note that as a corollary, the gap $\|p\|_2^2 - \frac{1}{n}$ is exactly the squared $\ell_2$ distance from $p$ to the uniform distribution over the domain.)
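The identity $\|p-u\|_2^2 = \|p\|_2^2 - \frac{1}{n}$ can likewise be verified numerically (a sketch, again drawing random $p$ from a Dirichlet distribution):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
u = np.full(n, 1.0 / n)  # uniform distribution on a domain of size n

for _ in range(100):
    p = rng.dirichlet(np.ones(n))
    lhs = np.sum((p - u)**2)          # squared l2 distance to uniform
    rhs = np.sum(p**2) - 1.0 / n      # collision probability minus 1/n
    assert np.isclose(lhs, rhs)
```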