Say I have a random variable $Z$ that is itself a function of two independent random variables $g(X,Y)$, where
$$Z = g(X,Y) = \frac{X}{Y}$$
I know that the propagation of uncertainty gives a good approximation of $Var[g(X,Y)]$ using Taylor expansions around the means $\mu_X,\mu_Y$, such that, when $X, Y$ are independent,
$$Var[Z] \approx \Bigg(\frac{\partial g(x,y)}{\partial x}\Bigg)^2 \sigma_X^2 + \Bigg(\frac{\partial g(x,y)}{\partial y}\Bigg)^2 \sigma_Y^2 = \frac{\sigma_X^2}{\mu_Y^2} + \frac{\sigma_Y^2 \mu_X^2}{\mu_Y^4}$$
where the partial derivatives are evaluated at the means $(\mu_X, \mu_Y)$.
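As a quick sanity check of the first-order formula, here is a small Monte Carlo simulation. The choice of normal distributions and the particular means and standard deviations are arbitrary (the approximation only uses means and variances); $\mu_Y$ is kept well away from zero so the ratio is well behaved.

```python
import numpy as np

# Arbitrary example parameters; mu_y chosen far from zero.
rng = np.random.default_rng(0)
mu_x, sigma_x = 10.0, 0.5
mu_y, sigma_y = 5.0, 0.3
n = 1_000_000

x = rng.normal(mu_x, sigma_x, n)
y = rng.normal(mu_y, sigma_y, n)

# Empirical variance of Z = X/Y vs. the first-order propagation formula.
empirical = np.var(x / y)
approx = sigma_x**2 / mu_y**2 + sigma_y**2 * mu_x**2 / mu_y**4

print(empirical, approx)  # the two agree closely for these parameters
```

With these parameters the relative error of the approximation is small, since the coefficients of variation $\sigma_X/\mu_X$ and $\sigma_Y/\mu_Y$ are small; the approximation degrades as $\sigma_Y/\mu_Y$ grows.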
Suppose now that $Z$ is a function of two series of independent random variables, $g(X_1,\dots,X_k,Y_1,\dots,Y_k)$, such that
$$Z = g(X_1,\dots,X_k,Y_1,\dots,Y_k) = \frac{X_1}{Y_1} + \dotsb + \frac{X_k}{Y_k}$$
If the ratios are mutually independent, I am wondering whether the variance of $Z$ can be calculated by summing the propagation-of-uncertainty result for each fraction, i.e.,
$$Var[Z] = \sum_{i=1}^k Var\Bigg[\frac{X_i}{Y_i}\Bigg] \approx \sum_{i=1}^k \Bigg[\Bigg(\frac{\partial g(x_i,y_i)}{\partial x_i}\Bigg)^2 \sigma_{X_i}^2 + \Bigg(\frac{\partial g(x_i,y_i)}{\partial y_i}\Bigg)^2 \sigma_{Y_i}^2 \Bigg]$$
where, for the $j^{\text{th}}$ term, I take the partial derivatives with respect to $x_j$ and $y_j$ exactly as in the basic case above and ignore the other $i \neq j$ terms? Or do I need to compute each partial derivative of the full $g$ with respect to the complete series $\{x_i\},\{y_i\}$?
I cross-posted this question to the Cross Validated Stack Exchange and got a satisfying answer. However, I'll leave the question marked "unanswered" here in case anyone wants to propose an alternative.