Returning a decimal number between 0 and 1 showing how large each number in a set is


I hope the title isn't too confusing; I'll just describe what I want simply:

I have a set of random numbers, and for each one I want to return a decimal number from 0 to 1 showing how big it is relative to the largest.

Imagine this (it is just an example; the results are not accurate!):

This is the set (array): 3580, 532, 5, 1002, 1800

The result should be something like: 1, 0.3, 0.05, 0.3, 0.42 (again, just an example!)

There is 1 answer below

On BEST ANSWER

Test set: $\{941, 421, 204, 1482, 1142, 1468, 790 \}$

Three possibilities:

  1. Divide by the largest number - in the test set, $1482$. The result set will always contain a value of exactly $1$. This gives values (to 3 d.p.):

    $\{ 0.635, 0.284, 0.138, 1.000, 0.771, 0.991, 0.533 \}$

  2. Divide by the sum of the set - in this case, $6448$. Then we get:

    $\{0.146, 0.065, 0.032, 0.230, 0.177, 0.228, 0.123 \}$

  3. Subtract the smallest number ($204$) then divide by the resulting largest ($1278$). The results will always have a $0$ and a $1$. The test set gives:

    $\{0.577, 0.170, 0.000, 1.000, 0.734, 0.989, 0.459\}$
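The three options above can be sketched in Python (the function names are my own, not standard terminology):

```python
def max_scale(values):
    """Option 1: divide by the largest value; the maximum always maps to 1."""
    m = max(values)
    return [round(v / m, 3) for v in values]

def sum_scale(values):
    """Option 2: divide by the sum of the set; the results sum to (roughly) 1."""
    s = sum(values)
    return [round(v / s, 3) for v in values]

def min_max_scale(values):
    """Option 3: subtract the smallest value, then divide by the resulting
    largest; the results always contain a 0 and a 1.
    (Assumes the values are not all equal, otherwise this divides by zero.)"""
    lo, hi = min(values), max(values)
    return [round((v - lo) / (hi - lo), 3) for v in values]

data = [941, 421, 204, 1482, 1142, 1468, 790]
print(max_scale(data))      # option 1
print(sum_scale(data))      # option 2
print(min_max_scale(data))  # option 3
```

Running this on the test set reproduces the three result sets above.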

You might use different options in different circumstances. It can also be appropriate to use maximum, minimum or total values that don't come from the data itself - for example, known reference limits, or the total of some wider "universe" that is known separately.