Does a truly random sequence in the range of x..y average to the average of x and y?


I recently started wondering whether the average of a truly random sequence of numbers between x and y has to converge to the average of x and y themselves.

This little JavaScript function seems to support that (it uses Math.random(), which is pseudorandom but a good enough source of randomness for this experiment). The larger precision is, the closer the result gets to 0.5:

function averageOfRandom(precision) { // returns the average of `precision` random numbers between 0 and 1
  const sequence = [];
  for (let i = 0; i < precision; i++) { // build the sequence
    sequence[i] = Math.random();
  }

  // the initial value 0 avoids a TypeError when the sequence is empty
  const totalSum = sequence.reduce((a, b) => a + b, 0);

  return totalSum / sequence.length; // average
}
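To see the effect concretely, here is a compact standalone version of the same function (redefined so this snippet runs on its own) driven with increasing sample sizes; the exact numbers printed vary per run, since Math.random() is pseudorandom, but they tighten around 0.5 as n grows:

```javascript
// Compact restatement of averageOfRandom, self-contained for this demo.
const averageOfRandom = n =>
  Array.from({ length: n }, Math.random).reduce((a, b) => a + b, 0) / n;

// Larger sample sizes pull the average closer to 0.5.
for (const n of [10, 1000, 100000]) {
  console.log(n, averageOfRandom(n));
}
```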

Then, is a random sequence the only kind of sequence that can have such a trait without knowing its own length? Maybe a sequence simply alternating x and y would also work?

Best answer:

The notion of a "truly random sequence" is not a standard rigorous concept.

However, if you take a sequence of independent random variables, each uniformly distributed on $[x,y]$, then the sample mean does converge to the midpoint $\frac{x+y}{2}$. This is a special case of what is known as the "law of large numbers".
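A minimal sketch of this in JavaScript, for an arbitrary interval rather than just [0, 1]; `uniform` and `sampleMean` are helper names introduced here for illustration:

```javascript
// Draw a uniform random value on [x, y).
function uniform(x, y) {
  return x + (y - x) * Math.random();
}

// Law of large numbers in action: the mean of n independent uniform
// draws on [x, y) approaches the midpoint (x + y) / 2 as n grows.
function sampleMean(x, y, n) {
  let sum = 0;
  for (let i = 0; i < n; i++) sum += uniform(x, y);
  return sum / n;
}
```

For example, `sampleMean(2, 10, 1000000)` should land very close to 6 on essentially every run.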

Your guess is correct: many other sequences share this limiting behavior. In particular, a deterministic sequence that simply alternates between x and y has running averages converging to the same midpoint (and hitting it exactly at every even length).
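The alternating case from the question can be checked directly; `alternatingMean` is a helper name introduced here for illustration:

```javascript
// Running average of the deterministic sequence x, y, x, y, ...
// It equals (x + y) / 2 exactly whenever n is even, and converges
// to that midpoint as n grows for odd n as well.
function alternatingMean(x, y, n) {
  let sum = 0;
  for (let i = 0; i < n; i++) sum += i % 2 === 0 ? x : y;
  return sum / n;
}
```

So no randomness is required for this trait; the law of large numbers just guarantees it for the random case.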