I'd like to find a way to detect a significant drop/decrease in a signal. Below is an actual example of what I'd like to accomplish, with the arrow denoting the change that I'd like to detect (only the red curve).
The data is fairly straightforward...the x-values are integers starting from zero and increasing by 1 at each data point. The y-values are also integers. I know that the dip I'd like to detect always occurs after the minimum value (denoted by the small circle). However, I'm not sure of the best way to find this drop.
What's the best methodology or algorithm for a situation like this?

I think it is consistent with the context (heat in an oven) to use a logarithmic regression: $T = k\ln(t) + C$.
You can do the following: as in the picture, take for example five temperature readings $(t_0,T_0),(t_1,T_1),(t_2,T_2),(t_3,T_3),(t_4,T_4)$, taken at regular intervals. Then fit a logarithmic regression to those data. From the picture I read off approximate temperatures and times; since there is no scale reference, I used the grid for units.
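For reference, substituting $x = \ln t$ turns the logarithmic fit into an ordinary linear least-squares regression, so the coefficients are the standard ones:

$$k = \frac{\sum_i (x_i - \bar{x})(T_i - \bar{T})}{\sum_i (x_i - \bar{x})^2}, \qquad C = \bar{T} - k\,\bar{x}, \qquad \text{where } x_i = \ln t_i.$$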
If the next measurement $(t_5,T_5)$ does not agree with the regression's estimate, you possibly have a drop. If it does agree, discard the initial measurement $(t_0,T_0)$, take the new point $(t_5,T_5)$ together with $(t_1,T_1),(t_2,T_2),(t_3,T_3),(t_4,T_4)$, and fit the regression again... and so on.
As you can see, the fit is very good (OpenOffice Calc reports a regression coefficient of 1).
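The sliding-window procedure above can be sketched in Python with NumPy. This is only a sketch under assumptions: the function name `detect_drop`, the window size, and the tolerance `tol` are all hypothetical choices you would tune to your data's noise level, and it assumes $t > 0$ (shift your x-values by 1 if they start at zero, since $\ln 0$ is undefined).

```python
import numpy as np

def detect_drop(t, T, window=5, tol=2.0):
    """Slide a window of `window` points along the signal, fit T = k*ln(t) + C
    to the window by least squares, and flag the first point that falls more
    than `tol` (in T's units) below the fit's prediction.

    Returns the index of the first detected drop, or None if there is none.
    Assumes all t > 0. `window` and `tol` are illustrative defaults.
    """
    t = np.asarray(t, dtype=float)
    T = np.asarray(T, dtype=float)
    for i in range(window, len(t)):
        # Fit k and C on the previous `window` points (linear fit in ln t).
        x = np.log(t[i - window:i])
        k, C = np.polyfit(x, T[i - window:i], 1)
        predicted = k * np.log(t[i]) + C
        # A reading well below the fitted trend counts as a drop.
        if T[i] < predicted - tol:
            return i
    return None

# Synthetic example: logarithmic warm-up, then a sudden drop at index 40.
t = np.arange(1, 61)
T = 20.0 * np.log(t) + 100.0
T[40:] -= 30.0            # inject the drop
drop_index = detect_drop(t, T)
```

One design note: because the window slides forward and the oldest point is discarded each step, the fit tracks slow changes in the curve, so only an abrupt departure from the recent trend (like the drop marked by the arrow) triggers the detection.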