Statistics of dice


To keep up my (very basic) Python skills, and just generally as a hobby, I wrote a program that takes a dice input and rolls it a bunch of times. As a case study, I took how a friend of mine decided to kill off a DnD character he had gotten bored of: an underlying heart condition dealing 2d100 (the sum of two 100-sided dice) damage.

So I set myself the assignment of figuring out his chance of surviving that roll. His character had 28 HP, so he would survive any roll of 27 or less. Rather than working it out with whatever math I once learned, I figured I would just adapt the program to do the analysis.
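My program itself isn't included here, but a minimal sketch of that kind of simulation could look like this (the function names and the `seed` parameter are my own, not taken from the original code):

```python
import random

def roll_2d100(rng):
    """Roll two 100-sided dice and return the total damage."""
    return rng.randint(1, 100) + rng.randint(1, 100)

def survival_chance(hp=28, trials=100_000, seed=None):
    """Estimate the chance a character with `hp` HP survives 2d100 damage.

    The character survives any total strictly below `hp`
    (i.e. 27 or less for 28 HP).
    """
    rng = random.Random(seed)
    survived = sum(1 for _ in range(trials) if roll_2d100(rng) < hp)
    return survived / trials

if __name__ == "__main__":
    # Roughly 0.035; enumerating all 100 * 100 outcomes gives
    # exactly 351 / 10000 = 3.51% survival.
    print(survival_chance())
```

With 100,000 trials the estimate usually lands within a fraction of a percentage point of the exact value.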

Now I have just finished the program and, for fun, I thought I'd have it graph the results: [image: Result of 100,000 rolls of 2d100]
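The graphing code isn't shown either; assuming matplotlib for the plot, the tally-and-graph step might look roughly like this (the `tally_rolls` helper is a hypothetical name, not my original code):

```python
import random
from collections import Counter

def tally_rolls(trials=100_000, seed=None):
    """Tally how often each 2d100 total (2..200) appears over many rolls."""
    rng = random.Random(seed)
    return Counter(rng.randint(1, 100) + rng.randint(1, 100)
                   for _ in range(trials))

if __name__ == "__main__":
    counts = tally_rolls()
    import matplotlib.pyplot as plt  # assumed plotting library
    totals = sorted(counts)
    plt.bar(totals, [counts[t] for t in totals])
    plt.xlabel("2d100 total")
    plt.ylabel("frequency")
    plt.title("Result of 100000 rolls of 2d100")
    plt.show()
```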

Now, with what math skills I do have, I had expected it to look more like a normal distribution, and I don't really see that in the graph, so I'm a little confused. I know that computers aren't great at doing random things, so maybe that's part of it, but it's more likely my lacking statistics.

Any explanation appreciated!