Mean of overcooking time


This question came up this week when I had to put my rice in the microwave for a third time.

Suppose the perfect cooking time for a meal is given by a random variable $X$ with values in seconds. Now suppose a quick check allows to determine if the food is :

  1. Uncooked,
  2. Perfectly cooked,
  3. Overcooked.

What is the expected overcooking time in seconds if one uses the following technique:

Start by cooking for $T$ seconds, then

a) Check food state.

b) If the food is perfectly cooked or overcooked, stop.

c) If the food is uncooked, double the last $T$ used and repeat from a).

An answer could also suggest a better technique, or optimize the choice of $T$.

EDIT: As suggested below, let us assume that $X \sim N(\mu, \sigma^2)$.
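The procedure above can be estimated numerically. A minimal Monte Carlo sketch, assuming that "double the last $T$ used" means the *increments* double, so checks happen at cumulative times $T, 3T, 7T, \dots, (2^k - 1)T$, and that draws of $X$ are truncated at $0$ (the function names are mine, not from the question):

```python
import random

def overcook_time(x, T):
    """Overcooking time for true cooking time x under the doubling
    strategy: cook T, check, cook 2T, check, cook 4T, ... so checks
    occur at cumulative times T, 3T, 7T, ..., (2^k - 1) T."""
    total, step = 0.0, T
    while total < x:
        total += step
        step *= 2  # double the last increment used
    return total - x  # seconds past the perfect cooking time

def mean_overcook(mu, sigma, T, n=100_000, seed=0):
    """Monte Carlo estimate of E[overcooking time] for X ~ N(mu, sigma^2),
    truncated at 0 since cooking times are nonnegative."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = max(rng.gauss(mu, sigma), 0.0)
        total += overcook_time(x, T)
    return total / n
```

For example, `overcook_time(5.0, 2.0)` checks at times 2 and 6, so the food is overcooked by 1 second. Sweeping `mean_overcook` over a grid of `T` values would then give a numerical answer to the optimization part of the question.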