How do I calculate the error from assuming the expected value?


I am currently trying to work out the probability of winning a game. This probability is proportional to another quantity, but I don't have the true value of that quantity. Instead, it is the sum of values drawn from a uniform distribution over a known number of samples, and I am estimating it by multiplying the expected value of the distribution by the number of samples. How do I calculate the error introduced by using the expected value in place of the actual sum? For example, if the quantity is sampled only once and the expected value is 5.5, the true value could be as low as 1 if that is what the uniform distribution happens to produce.

In essence: $$\text{estimated value} = \text{sample size} \times \text{expected value}$$

However, how do I calculate the error of the true value when I substitute the expected value? I assume it is related to the sample size in a way similar to the standard error.
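Your intuition is right: for a sum of $n$ independent draws, the standard deviation of the sum grows like $\sqrt{n}\,\sigma$, where $\sigma$ is the standard deviation of a single draw. A minimal sketch below checks this by simulation; the bounds $[1, 10]$ (which give mean 5.5) and the sample size $n$ are assumptions for illustration, not values from the question.

```python
import numpy as np

# Sketch, assuming draws come from a continuous uniform distribution on
# [1, 10] (mean 5.5); the bounds and sample size n are illustrative only.
rng = np.random.default_rng(0)
n = 16            # number of draws summed per game (assumed)
a, b = 1.0, 10.0  # assumed uniform bounds

# Analytic: the sum of n iid draws has mean n * E[X] and standard
# deviation sqrt(n) * sigma, where sigma = (b - a) / sqrt(12) for a
# continuous uniform distribution.
mean_sum = n * (a + b) / 2
sd_sum = np.sqrt(n) * (b - a) / np.sqrt(12)

# Monte Carlo check: simulate many games and measure the spread of the sum.
sums = rng.uniform(a, b, size=(100_000, n)).sum(axis=1)
print(f"analytic:  mean={mean_sum:.2f}, sd={sd_sum:.2f}")
print(f"empirical: mean={sums.mean():.2f}, sd={sums.std():.2f}")
```

So the absolute error of the estimate grows like $\sqrt{n}$, while the *relative* error shrinks like $1/\sqrt{n}$, which is why the approximation is poor for a single sample but improves as more samples are summed.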