I started a discussion with some friends trying to understand something: if I roll one 10-sided die (1d10) and one 4-sided die (1d4) 20 times each, and repeat that whole experiment 1000 times, why does the d4 reach its average sum (50) or more, more often than the d10 reaches its average sum (110)?
I also wrote some Java code to try to understand this; take a look:
import java.util.Random;

double totalDfour = 0;     // runs where the d4 sum reached its average
double plusDfour = 0;      // sum of the 20 d4 rolls in the current run
double averageDfour = 50;  // expected sum of 20d4: 20 * 2.5
int dfour = 0;
double totalDten = 0;      // runs where the d10 sum reached its average
double plusDten = 0;       // sum of the 20 d10 rolls in the current run
double averageDten = 110;  // expected sum of 20d10: 20 * 5.5
int dten = 0;
Random generator = new Random();
for (int y = 1; y <= 1000; y++) {
    plusDten = 0;
    plusDfour = 0;
    for (int x = 1; x <= 20; x++) {
        dfour = generator.nextInt(4) + 1;   // 1..4
        Log.d("DADO", " d4: " + dfour);
        plusDfour = plusDfour + dfour;
        dten = generator.nextInt(10) + 1;   // 1..10
        Log.d("DADO", " d10: " + dten);
        plusDten = plusDten + dten;
    }
    if (plusDten >= averageDten) {
        totalDten++;
    }
    if (plusDfour >= averageDfour) {
        totalDfour++;
    }
}
Log.d("DADO", "Total d10= " + totalDten + " Total d4= " + totalDfour);
And totalDfour is always bigger than totalDten. I would like to understand why, since I expected both dice to have the same probability of scoring their average sum or more.
This is your problem right here: a rookie programming mistake.
Algorithmic random number generators are not truly random; they are algorithms that generate a fixed but random-looking sequence of numbers. Since you are not seeding your generator, you are using the same not-really-random sequence every time you run the code.
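To see what "fixed sequence" means, here is a small standalone sketch (the class name `SeedDemo` is just for illustration): two `Random` instances constructed with the same seed emit exactly the same values, in the same order.

```java
import java.util.Random;

public class SeedDemo {
    public static void main(String[] args) {
        // Same seed -> identical "random" sequences.
        Random a = new Random(42);
        Random b = new Random(42);
        for (int i = 0; i < 5; i++) {
            int fromA = a.nextInt(10) + 1; // 1..10
            int fromB = b.nextInt(10) + 1; // 1..10
            System.out.println(fromA + " == " + fromB);
            assert fromA == fromB : "sequences diverged";
        }
    }
}
```

Run it twice and you get the same five pairs both times; the output only changes if the seed changes.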
The classic way to get an approximately distinct sequence is to seed the generator with execution time.
Try using:
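The snippet was cut off here; based on the sentence above, it was presumably something along these lines (the class wrapper and variable names are mine, added so the fragment runs on its own):

```java
import java.util.Random;

public class TimeSeededDice {
    public static void main(String[] args) {
        // Seeding with the current time means each execution of the
        // program starts the pseudo-random sequence at a different point.
        Random generator = new Random(System.currentTimeMillis());
        int dten = generator.nextInt(10) + 1; // 1..10
        System.out.println("d10 rolled: " + dten);
    }
}
```

In the original code this would replace the unseeded `Random generator = new Random();` line.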