Assume a task T has a minimum completion time of 8 days, a maximum of 14 days, and a most likely time (mode) of 10 days. Under PERT's beta-distribution approximation, the standard deviation is 1.00 ($\frac{Max-Min}{6}$) and the mean is 10.3333 ($\frac{Min+4\times MostLikely+Max}{6}$).
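To make the arithmetic concrete, here is a minimal Python sketch of the two PERT formulas above applied to T's three-point estimate (variable names are my own):

```python
# Three-point estimate for task T (from the question)
t_min, t_mode, t_max = 8, 10, 14

# PERT mean and standard deviation
pert_mean = (t_min + 4 * t_mode + t_max) / 6   # (8 + 40 + 14) / 6 = 10.3333
pert_sd = (t_max - t_min) / 6                  # (14 - 8) / 6 = 1.00

print(round(pert_mean, 4), round(pert_sd, 2))  # -> 10.3333 1.0
```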
I want to model this with basic Excel functions and then randomly generate many simulated actual T completion times.
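As a sketch of the simulation step, one common PERT parameterization (an assumption on my part, not something stated above) maps the three-point estimate to a Beta(α, β) distribution with α = 1 + 4(mode − min)/(max − min) and β = 1 + 4(max − mode)/(max − min), then scales each draw onto [min, max]:

```python
import random

random.seed(42)  # reproducible draws for this sketch

t_min, t_mode, t_max = 8, 10, 14

# Assumed PERT-beta shape parameters (one common convention)
alpha = 1 + 4 * (t_mode - t_min) / (t_max - t_min)   # 7/3
beta = 1 + 4 * (t_max - t_mode) / (t_max - t_min)    # 11/3

# Generate many simulated actual completion times for T:
# scale a Beta(alpha, beta) draw from [0, 1] onto [min, max]
samples = [t_min + random.betavariate(alpha, beta) * (t_max - t_min)
           for _ in range(10_000)]

print(min(samples), max(samples), sum(samples) / len(samples))
```

Every simulated time lands between 8 and 14 days by construction, and the sample mean should sit near the PERT mean of 10.3333. In Excel the same idea would use RAND() fed through an inverse-beta lookup, but the principle is identical.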
Let's say in a single iteration, I randomly generate an actual task completion time of 11 days. Do I simply convert this into a probability? How?
Can I assume a normal distribution and then use Excel's NORM.DIST function?
I'm having trouble framing the question the right way. For example, suppose in another iteration the actual completion time was 14 days (the estimated maximum possible time for T). If I use NORM.DIST(x=14, mean=10, standard_dev=10.33, cumulative=TRUE), I get 0.6507, which I don't understand: shouldn't the probability that a completion time is 14 days or fewer be 100%? In other words, I can't work out the intuition for framing the question properly.
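For reference, NORM.DIST(x, mean, sd, TRUE) is the cumulative normal distribution Φ((x − mean)/sd), and the 0.6507 above can be reproduced with the arguments exactly as I passed them. A standard-library Python sketch (my own helper function, using the erf identity for the normal CDF):

```python
import math

def norm_cdf(x, mean, sd):
    """Normal CDF, equivalent to Excel's NORM.DIST(x, mean, sd, TRUE)."""
    return 0.5 * (1 + math.erf((x - mean) / (sd * math.sqrt(2))))

# The call from the question: x=14, mean=10, standard_dev=10.33
print(round(norm_cdf(14, 10, 10.33), 4))  # -> 0.6507
```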