I'm looking for a measure that can give me an estimate of the number of significant values in a sparse vector, without using a threshold.
For example, a measure $S$ might give output like
$S([0, 0, 0, 1, 0, 1, 0, 0, 1]) = 3$ (exactly 3 spikes of the same size)
$S([0, 0, 0, 1, 0, 1, 0, 0, 2]) = 2.5$ (somewhere between 1 and 3)
$S([0, 0, 0, 1, 4, 4, 4, 4, 5]) = 5.1$ (most likely 5, with a slight possibility of 6)
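Not a proposal for $S$ itself, but just to make the target behaviour concrete: below is a small Python/NumPy harness (the `check` helper and the nonzero-count baseline are my own hypothetical additions) that encodes these vectors and desired outputs, so any candidate measure can be compared against them.

```python
import numpy as np

# The example vectors and the desired outputs of S, taken from the text above.
examples = [
    (np.array([0, 0, 0, 1, 0, 1, 0, 0, 1], dtype=float), 3.0),
    (np.array([0, 0, 0, 1, 0, 1, 0, 0, 2], dtype=float), 2.5),
    (np.array([0, 0, 0, 1, 4, 4, 4, 4, 5], dtype=float), 5.1),
]

def check(candidate):
    """Print how a candidate measure compares to the desired values."""
    for v, target in examples:
        print(f"S({v.tolist()}) = {candidate(v):.2f}  (wanted roughly {target})")

# Baseline: the plain nonzero count, i.e. the thresholded quantity I want
# to approximate without a threshold.
check(lambda v: float(np.count_nonzero(v)))
```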
Basically, I have an effectively sparse signal consisting of one or more large spikes and lots of small spikes. The spike amplitudes are real-valued. The number of spikes often describes one aspect of the physical shape of the object the signal represents, and I need a measure that approximates this number for use as a feature in classification.
As a simple example to illustrate the concept, consider the shapes below. Imagine each is described by one spike per vertex, with amplitude $|180 - \theta|$, where $\theta$ is the interior angle at that vertex. The square on the left would give a measure of 4, the triangle 3, and the middle two somewhere in between.
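To make that encoding concrete, here is a minimal Python/NumPy sketch; the 60-60-120-120 quadrilateral is just a made-up stand-in for the intermediate shapes.

```python
import numpy as np

def spike_amplitudes(interior_angles_deg):
    """One spike per vertex, with amplitude |180 - theta| in degrees."""
    theta = np.asarray(interior_angles_deg, dtype=float)
    return np.abs(180.0 - theta)

square   = spike_amplitudes([90, 90, 90, 90])    # [90, 90, 90, 90]   -> measure ~4
triangle = spike_amplitudes([60, 60, 60])        # [120, 120, 120]    -> measure ~3
# A hypothetical intermediate shape: a triangle with one corner truncated.
between  = spike_amplitudes([60, 60, 120, 120])  # [120, 120, 60, 60] -> between 3 and 4
```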
I don't want to use a threshold to get the number of spikes: I would like a continuous measure that also hints at the distribution of spike amplitudes, which I think would help with classification.
Of course, one dimension is not enough to convey this information fully, so the measure could output a few more values.
The vectors are typically 360 or 720 points long and are processed by a computer program.
Before I go off trying to create my own measure, is there one that already exists for this purpose?

Suggestion, not an answer.
If you are always thinking about the interior angles of a (convex) polygon with $n$ vertices, then those angles sum to $(n-2)\pi$.
Then I think the sum (or the average) of the absolute deviations of the angles from the average $(n-2)\pi/n$ should somehow measure the distance from an integer number of spikes, which will be exactly $n$ when all the angles are equal.
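A minimal sketch of this suggestion in Python/NumPy, applied to the polygon examples from the question (the irregular 60-60-120-120 quadrilateral is a made-up illustration):

```python
import numpy as np

def angle_deviation(interior_angles_rad):
    """
    Mean absolute deviation of a convex polygon's interior angles from the
    regular-polygon value (n - 2) * pi / n.  Zero means all n angles are
    equal, i.e. a 'clean' count of n spikes; larger values mean the shape
    is further from an integer spike count.
    """
    theta = np.asarray(interior_angles_rad, dtype=float)
    n = theta.size
    regular = (n - 2) * np.pi / n   # average interior angle
    return np.mean(np.abs(theta - regular))

print(angle_deviation(np.deg2rad([90, 90, 90, 90])))    # square:    0.0 -> exactly 4 spikes
print(angle_deviation(np.deg2rad([60, 60, 60])))        # triangle:  0.0 -> exactly 3 spikes
print(angle_deviation(np.deg2rad([60, 60, 120, 120])))  # irregular quad: ~0.52 rad
```

Paired with $n$ itself, the deviation then says how "cleanly" the shape realizes that spike count.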