I'm trying to write a Python module for error propagation, just for the sake of learning.
Let: $$A=a \pm \delta a$$ $$B=b \pm \delta b$$ As explained here, the error propagation for a product $$y=f(A, B) = A \times B$$ is $$\frac{\delta y}{|y|}= \sqrt{ \left(\frac{\delta a}{a}\right)^2 + \left(\frac{\delta b}{b}\right)^2}$$
I understand this part. But what if one of the values is 0? For example, let the temperature measurements be: $$T_1=1 \pm 0.5$$ $$T_2=0 \pm 0.7$$
When I multiply these two values, I naturally get a division by zero.
How can I solve this problem?
When one of the central values is zero, you can't use relative errors. Multiply the relative form through by $y = ab$ to get the absolute form, which involves no division:
$$\delta y=\sqrt{b^2(\delta a)^2+a^2(\delta b)^2}.$$
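A minimal sketch in Python of how this could look in your module, using the absolute form so zero-valued measurements are safe (the `Measurement` class name and API are my own invention, not a standard library):

```python
import math

class Measurement:
    """A measured value with an absolute uncertainty: value ± error."""
    def __init__(self, value, error):
        self.value = value
        self.error = abs(error)

    def __mul__(self, other):
        # Absolute form: delta_y = sqrt(b^2 (da)^2 + a^2 (db)^2).
        # No division by the central values, so zeros cause no problem.
        value = self.value * other.value
        error = math.sqrt((other.value * self.error) ** 2
                          + (self.value * other.error) ** 2)
        return Measurement(value, error)

    def __repr__(self):
        return f"{self.value} ± {self.error}"

T1 = Measurement(1, 0.5)
T2 = Measurement(0, 0.7)
print(T1 * T2)  # 0 ± 0.7
```

For $T_1 T_2$ this gives $\delta y = \sqrt{0^2 \cdot 0.5^2 + 1^2 \cdot 0.7^2} = 0.7$, as expected.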