Let's say I have a database of customers and how much each one spent, so something like:
Jim 100
Bob 300
Tim 14
Sarah 43
Rob 945
Jake 3,000
etc
What I want to do is then assign each customer a value from 0.00 to 5.00 (so to two decimal places) based on how their amount compares to the min and the max.
I think this has something to do with percentiles, but what is the formula to do this?
I guess we would start by separating the data into five quintiles. How do I determine the boundaries of each quintile?
And then let's say a customer falls into the quintile from two to three. How do I get the exact two-decimal-place value, like 2.13 or 2.67, for their position within that quintile?
If the only thing you are interested in is where the customer falls between the min and the max, compute $$5 \cdot \frac{\text{customer} - \min}{\max - \min}$$ to get a number between $0$ and $5$ for each customer. You can then round it to any number of decimal places you like.
For example, if you only have the data you listed, then $\min = 14, \max = 3000$ and so for Bob you would get $$ m_\text{Bob} = 5 * \frac{300 - 14}{3000 - 14} = \frac{1430}{2986} \approx 0.48. $$
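The min-max scaling above can be sketched in a few lines, using the spend figures listed in the question:

```python
# Min-max scaling of each customer's spend to a 0-5 score,
# rounded to two decimal places.
spend = {"Jim": 100, "Bob": 300, "Tim": 14, "Sarah": 43, "Rob": 945, "Jake": 3000}

lo, hi = min(spend.values()), max(spend.values())
scores = {name: round(5 * (amount - lo) / (hi - lo), 2)
          for name, amount in spend.items()}

print(scores)  # Tim (the min) gets 0.0, Jake (the max) gets 5.0, Bob gets 0.48
```

Note that because $3000$ is so far above the rest of the data, everyone except Jake ends up below $2$.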
This, however, takes no account of where the number is with respect to the other customers. If you would like to compute that, then you can compute the actual percentile for each number and multiply by $5$, rounding to $2$ decimal places, if you like.
To compute the percentiles from your data set, first sort it. Since you have $n = 6$ data points, divide the $100\%$ range into $6$ equal boxes, so each box spans $100/n\%$ (here $100/6 \approx 16.67\%$), and assign each box a percentile using its maximum (or minimum, or middle) member. Then multiply that percentile, taken as a fraction of $1$, by $5$ to get your label.
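A minimal sketch of this rank-based version, again assuming the data from the question, and using the maximum member of each box (so the $i$-th customer in sorted order, $1$-based, gets percentile $i/n$ and label $5i/n$):

```python
# Percentile (rank-based) labels: each customer's label depends only on
# their position among the sorted spends, not on the raw dollar amounts.
spend = {"Jim": 100, "Bob": 300, "Tim": 14, "Sarah": 43, "Rob": 945, "Jake": 3000}

n = len(spend)
ranked = sorted(spend, key=spend.get)  # names, lowest spend first

# i-th customer (1-based rank) gets percentile i/n, scaled to the 0-5 range.
labels = {name: round(5 * (i + 1) / n, 2) for i, name in enumerate(ranked)}

print(labels)  # Tim 0.83, Sarah 1.67, Jim 2.5, Bob 3.33, Rob 4.17, Jake 5.0
```

Unlike the min-max version, this spreads the six customers evenly across the $0$ to $5$ range, so Jake's outlier spend no longer squashes everyone else toward $0$.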