A quick question: is it possible to say, analogously to the single-variable case, that a multivariable function is "asymptotically equivalent" to a second multivariable function? For example, consider the function of $n_1, n_2 \in \Bbb R$ given by
$$\operatorname{Var}(\hat{\mu}) = \frac{\sigma^2(n_1 + 2n_2)}{(n_1 + n_2)^2},$$
where $\sigma^2$ is a constant.
Can we say that $\operatorname{Var}(\hat{\mu}) \sim \frac{1}{n_1 + n_2}$ and then conclude that $\operatorname{Var}(\hat{\mu}) \to 0$ as $n_1 \to \infty$ and $n_2 \to \infty$?
Edit: Wolfram alpha concludes that the limit $$\lim_{(x,y) \to (\infty, \infty)}\frac{x + 2y}{(x + y)^2}$$
does not exist. Am I wrong to think of $\operatorname{Var}(\hat{\mu})$ as a function of two variables?
The two previous answers are perfectly right; I would just like to emphasize the statistical perspective. Note that when you are talking about the variance of an estimator, $n_1$ and $n_2$ are sample sizes, hence they are (strictly) positive integers, i.e., $n_1, n_2 \in \mathbb N$. As such, when you take the limit, you cannot consider every possible route in $\mathbb{R}^2$ (as Wolfram does). Restricted to $n_1, n_2 \ge 1$ we have $0 \le \frac{n_1 + 2n_2}{(n_1 + n_2)^2} \le \frac{2}{n_1 + n_2}$, so the variance does tend to $0$. The extreme cases for this variance are $n_1 \gg n_2$ and $n_2 \gg n_1$, and neither changes the asymptotics.
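As a quick numeric sanity check (not a proof), one can evaluate the variance along a few routes where both sample sizes grow; the value $\sigma^2 = 1$ below is a hypothetical placeholder for the constant:

```python
# Numeric sanity check, not a proof: evaluate Var(mu_hat) along routes
# where both sample sizes n1 and n2 grow without bound.
# sigma2 = 1.0 is a hypothetical placeholder for the constant sigma^2.

def var_mu_hat(n1, n2, sigma2=1.0):
    """Var(mu_hat) = sigma^2 * (n1 + 2*n2) / (n1 + n2)^2."""
    return sigma2 * (n1 + 2 * n2) / (n1 + n2) ** 2

# Three integer routes: balanced, n2 dominating, n1 dominating.
for n in (10, 100, 1_000, 10_000):
    print(n, var_mu_hat(n, n), var_mu_hat(n, 2 * n), var_mu_hat(n * n, n))
```

All three columns shrink toward $0$ as $n$ grows, consistent with the bound $2\sigma^2/(n_1+n_2)$ above.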
Furthermore, you can conclude that your estimator $\hat{\mu}_{n_1,n_2}$ converges in probability to some constant $\mu$, i.e., it is a consistent estimator of whatever you are estimating (assuming that it is unbiased or that the bias vanishes asymptotically). This stems from the fact that if $\lim_{n\to\infty}\operatorname{MSE}(\hat{\mu}_n)=0$, then $\hat{\mu}_n \xrightarrow{p}\mu$.
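For completeness, that last implication is the usual Markov-inequality argument: for any $\varepsilon > 0$,

$$P\left(|\hat{\mu}_n - \mu| \ge \varepsilon\right) \le \frac{E\left[(\hat{\mu}_n - \mu)^2\right]}{\varepsilon^2} = \frac{\operatorname{MSE}(\hat{\mu}_n)}{\varepsilon^2} \xrightarrow[n \to \infty]{} 0,$$

which is exactly the definition of $\hat{\mu}_n \xrightarrow{p} \mu$.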