I am not a mathematician and I would love it if you could explain some things to me, please.
I have some data, a list of values. The minimal value is -3.04 and the maximum value is 2.75. When I plotted the data, it looked like a Gaussian curve; the mean is around 0.0001 and the standard deviation is about 1.
The problem is I don't know how to calculate the area under the curve from -infinity up to a given value.
For example: say I get a new input, -3.7. How could I calculate what percentage of the area lies between -infinity and -3.7 for my data? I would appreciate a simple explanation, or perhaps you could just point me to what I should study. Thank you.
p.s. the values are continuous
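In case it helps to see what I mean, here is roughly the calculation I think I'm after, as a sketch (assuming my data really is Gaussian with the mean and standard deviation I estimated; the formula with `math.erf` is my guess at how the area is computed):

```python
import math

# Assumption: the data is approximately normal, with mean and
# standard deviation estimated from my sample.
mean = 0.0001
std = 1.0

def normal_cdf(x, mean, std):
    """Area under the Gaussian curve from -infinity up to x."""
    return 0.5 * (1.0 + math.erf((x - mean) / (std * math.sqrt(2.0))))

p = normal_cdf(-3.7, mean, std)
print(p)  # a very small fraction, since -3.7 is far out in the left tail
```

Is this the right idea, or is there a better way when I only have the raw data?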