We measure the voltage across a resistor and find that the samples are normally distributed with mean $E[X]$ and variance $\sigma_X^2$. We also measure the current through the resistor and find that those samples are normally distributed with mean $E[Y]$ and variance $\sigma_Y^2$. The two measurements are independent of each other, so the random variables $X$ and $Y$ are independent.
What are the mean and variance of the random variable $Z$ that represents the resistance? Since $V = IR$,
$$Z = \frac{X}{Y}$$
This question is inspired by a problem from the GATE 2019 Instrumentation paper. Specifically, I would like to find $Var(Z)$ when $E[X] = 1$, $E[Y] = 10^{-3}$, $Var(X) = (0.12)^2$, and $Var(Y) = (0.05 \times 10^{-3})^2$.
My attempts
From my googling, I found that there is no formula expressing $E[\frac{1}{Y}]$ or $\operatorname{Var}(\frac{1}{Y})$ in terms of $E[Y]$ and $\operatorname{Var}(Y)$, so my initial attempt, treating $Z$ as the product $X \cdot \frac{1}{Y}$ and using the identity below, failed.
$$\operatorname{Var}(XY)=E[X^2Y^2]-E[XY]^2=\operatorname{Var}(X)\operatorname{Var}(Y)+\operatorname{Var}(X)E[Y]^2+\operatorname{Var}(Y)E[X]^2$$
The pdf of $Z$ is not given to us, so I can't manually integrate and find the expectation.
Since this question appeared in a prestigious national exam, I am confident a solution exists. Furthermore, the question was worth only 1 mark, so I expect the solution to be elegant.
Can we think of the standard deviations as percentage errors and simply add them?


What you're expected to do here is use the delta method, applying the approximation $\operatorname{Var} f(X_1,\dots,X_n)\approx\sum_i\left(\frac{\partial f}{\partial X_i}\right)^2\operatorname{Var}(X_i)$, with the partial derivatives evaluated at the means. For $f(X,Y)=X/Y$ this gives $\sigma_Z^2\approx\frac{\sigma_X^2}{E[Y]^2}+\frac{E[X]^2\,\sigma_Y^2}{E[Y]^4}$.
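Plugging in the given values,
$$\sigma_Z^2 \approx \frac{(0.12)^2}{(10^{-3})^2} + \frac{(1)^2\,(0.05\times10^{-3})^2}{(10^{-3})^4} = 14400 + 2500 = 16900\ \Omega^2,$$
i.e. $\sigma_Z \approx 130\ \Omega$.

If you want to sanity-check the approximation numerically, here is a minimal Monte Carlo sketch (assuming NumPy; the variable names are mine, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000_000

# Voltage X ~ N(1 V, 0.12^2) and current Y ~ N(1e-3 A, (0.05e-3)^2), independent
x = rng.normal(loc=1.0, scale=0.12, size=n)
y = rng.normal(loc=1e-3, scale=0.05e-3, size=n)

z = x / y  # resistance samples

print("sample Var(Z):", z.var())
print("delta method :", 0.12**2 / (1e-3)**2 + 1.0**2 * (0.05e-3)**2 / (1e-3)**4)
```

The sample variance should come out around $1.7\times10^4$, in agreement with the first-order delta-method value of $1.69\times10^4$. Strictly speaking, the ratio of two normals has no finite variance (since $Y$ can get arbitrarily close to zero), but with $E[Y]$ twenty standard deviations away from zero this has no practical effect on the simulation.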