So, I've been playing with the formula for the (zero-mean) normal distribution for yucks and giggles:
$$\frac {1}{\sqrt{2 \pi\sigma^2}}e^{-\frac {x^2}{2\sigma^2}}$$
And I decided to look at 10 binary trials (coin tosses) and, you know, list the relative chances of getting no heads, one head, two heads, etc.: {1, 10, 45, 120, 210, 252, 210, 120, 45, 10, 1}, the appropriate row of Pascal's triangle.
I put these as points on a graph, indexing them on x ∈ {-5, ..., 5}. Because this gave me a crazy-tall graph, I divided everything by 252 so the tallest point would max out at 1.
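In case it helps, here's a quick sketch of those points in Python (the names are mine, just for illustration):

```python
import math

# Row 10 of Pascal's triangle: number of ways to get k heads in 10 tosses
row = [math.comb(10, k) for k in range(11)]

# Index the points on x in {-5, ..., 5}, and scale by the middle
# coefficient (252) so the tallest point is exactly 1
xs = list(range(-5, 6))
ys = [c / 252 for c in row]

print(row)      # [1, 10, 45, 120, 210, 252, 210, 120, 45, 10, 1]
print(max(ys))  # 1.0
```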
(Desmos graph: https://www.desmos.com/calculator/xye7h54upl)
I then figured the variance at about 0.26.
Then I wanted to fit the normal-distribution curve to the dots representing the binary trials. Playing around, I realized that I could create a perfect match if I multiplied by the right amount and let x and y take on the values from the coin-toss distribution.
Recasting the equation gave me:
$$m=ye^{\frac {x^2}{2\sigma^2}}\sqrt{2\pi\sigma^2}$$
where m is the factor by which I need to multiply the whole expression to get a perfect fit. Which I did! I think. When m ≈ 4.1, the normal distribution fits the data as perfectly as need be (using the most precise values available to me on the graph).
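To see how stable m actually is, here's a sketch that evaluates the recast equation at every data point, using σ² = 0.26 × 10 = 2.6 (more on that factor of 10 in a second):

```python
import math

row = [math.comb(10, k) for k in range(11)]
xs = list(range(-5, 6))
ys = [c / 252 for c in row]

sigma2 = 2.6  # my measured 0.26, scaled up by 10

# m = y * exp(x^2 / (2 sigma^2)) * sqrt(2 pi sigma^2), per data point
ms = [y * math.exp(x * x / (2 * sigma2)) * math.sqrt(2 * math.pi * sigma2)
      for x, y in zip(xs, ys)]
for x, m in zip(xs, ms):
    print(x, round(m, 3))
```

For x from -3 to 3, m hovers between about 4.0 and 4.16; it only drifts in the far tails, where the points are too small to see on the graph anyway.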
BUT, to get it to work, I had to multiply the variance by 10. I don't know why.
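One sanity check I can do in code: treat the raw counts as a probability distribution over my x values and compute its variance directly. I'm not sure this is the same quantity I estimated off the graph, but for what it's worth:

```python
import math

row = [math.comb(10, k) for k in range(11)]
xs = list(range(-5, 6))

# Weighted variance of x, with the counts as weights; the mean is 0 by symmetry
total = sum(row)  # 1024 equally likely outcomes in all
var = sum(x * x * c for x, c in zip(xs, row)) / total
print(var)  # 2.5
```

That comes out to exactly 2.5, which is right around 0.26 × 10.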
Also, what did I just do? What does (approx) 4.1 mean? WHAT HAVE I DONE?