I lack the terminology to ask this question "properly", so to illustrate what I'm bouncing around in my mind, let's take a story example:
John wrote a script which guesses passwords. He has a list of 21,600 passwords, one of which is the password used for a login on a particular endpoint. If his computer can test one password per second, the script will take an average of 3 hours to break in (with a maximum possible running time of 6 hours).
After the script has run for two hours, he observes that it has not yet found the correct password and is still running.
I'm wondering two things:
- Is the statement "It will take the script an average of 3 hours to break in" correct? Is the average completion time in this scenario simply half of the maximum running time?
- When John checks on the script after it has been running for 2 hours, will there be an average of 1 hour remaining, or (given that it has not yet completed) an average of 2 hours remaining?*
* 2 hours being half of the script's remaining maximum running time at the point at which John checks it (i.e. half of the remaining 4 hours)
The password is equally likely to be found after $1$ second, $2$ seconds, $3$ seconds, and so on up to $21600$ seconds. So if we let $n=21600$, the mean time is $\frac{1}{n}(1+2+\cdots+n)=\frac{n+1}{2}$ seconds, which is $3$ hours for all practical purposes, but not exactly $3$ hours.
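A quick sanity check of that mean (this is just a sketch re-deriving the number, not part of the original answer):

```python
# Mean completion time when the password is uniformly distributed
# over positions 1..n and one password is tested per second.
n = 21600  # number of candidate passwords

mean_seconds = sum(range(1, n + 1)) / n  # brute-force average
assert mean_seconds == (n + 1) / 2       # matches the closed form

print(mean_seconds)         # 10800.5 seconds
print(mean_seconds / 3600)  # just over 3 hours
```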
The same sort of minor correction applies to checking after $2$ hours. Given that the password has not been found, there are $(21600)(4/6)$ passwords to go, that is, $14400$. The mean additional time is $(14401)/2$ seconds, for all practical purposes $2$ hours.
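The conditional calculation can be checked the same way, by averaging only over the positions that are still possible after $2$ hours (again just an illustrative sketch):

```python
# Conditional mean remaining time, given no success in the first 2 hours.
n = 21600
elapsed = 7200  # 2 hours in seconds

# The password must be at one of positions elapsed+1 .. n,
# each still equally likely (14400 candidates remain).
remaining_positions = range(elapsed + 1, n + 1)
cond_mean_seconds = sum(k - elapsed for k in remaining_positions) / len(remaining_positions)

print(cond_mean_seconds)  # 7200.5 seconds, i.e. just over 2 hours
```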
The unconditional average time remaining after $2$ hours does not have a clear meaning. But it is reasonable to say the remaining time is $0$ hours with probability about $2/6$, and $2$ hours with probability about $4/6$, giving an expectation of about $8/6$ hours.
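That $8/6$ hours figure can also be checked directly, by averaging the remaining time over all $21600$ equally likely positions (zero remaining time whenever the password was found in the first $2$ hours); this is a sketch, not from the original answer:

```python
# Unconditional expected remaining time after 2 hours, averaged
# over all equally likely password positions 1..n.
n = 21600
elapsed = 7200  # 2 hours in seconds

# Remaining time is 0 if the password was at position <= elapsed,
# otherwise position - elapsed.
uncond_mean_seconds = sum(max(k - elapsed, 0) for k in range(1, n + 1)) / n
uncond_mean_hours = uncond_mean_seconds / 3600

print(uncond_mean_hours)  # about 8/6 = 1.333... hours
```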