Let $X,Y$ be two random variables with parametric probability densities $p_X(x, \sigma_x)$ and $p_Y(y, \sigma_y)$ respectively, where the parameters $\sigma_x, \sigma_y \in \mathbb{R}$ satisfy an equation of the type $f(\sigma_x) = \sigma_y$ for some function $f$. In the following, for the sake of simplicity, let us assume that $\sigma_x = 2 \sigma_y$.
Suppose that, given a single observation $X_1, Y_1$ of each random variable, one wishes to obtain the best possible estimators $\hat{\sigma}_x, \hat{\sigma}_y$ for $\sigma_x, \sigma_y$ (best in the least-variance sense).
Without knowing that the true parameters satisfy the equation $\sigma_x = 2 \sigma_y$, I would take as estimators the standard maximum likelihood estimators $$ \hat{\sigma}_x = \operatorname*{arg\,max}_\sigma p_X (X_1, \sigma), \quad \hat{\sigma}_y = \operatorname*{arg\,max}_\sigma p_Y (Y_1, \sigma). $$ (The variance will be high since we have only one observation, but that is another matter.)
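To make this concrete, here is a small sketch of the unconstrained MLE under the (purely illustrative) assumption that both densities are zero-mean Gaussians, $p_X(x, \sigma) = \mathcal{N}(0, \sigma^2)$; in that case the single-observation MLE has the closed form $\hat{\sigma}_x = |X_1|$, which a numerical maximization reproduces:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Assumption for illustration only: p_X(x, s) is a zero-mean Gaussian N(0, s^2).
def neg_log_lik(s, x1):
    """Negative log-likelihood of one observation x1 under N(0, s^2), up to a constant."""
    return np.log(s) + x1**2 / (2 * s**2)

X1 = 1.7  # a single (arbitrary) observation

# Numerical maximum likelihood over s > 0 ...
res = minimize_scalar(neg_log_lik, bounds=(1e-6, 100.0), args=(X1,), method="bounded")

# ... matches the closed form: for one zero-mean Gaussian observation,
# the MLE is sigma_hat = |X1|.
print(res.x, abs(X1))
```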
If one also knew that the equation $\sigma_x = 2\sigma_y$ holds, it seems reasonable to expect that one could improve the estimators by taking this additional information into account, in the sense that I would expect to find two estimators $\tilde{\sigma}_x, \tilde{\sigma}_y$ such that $$\operatorname{Var}[\tilde{\sigma}_x] < \operatorname{Var}[\hat{\sigma}_x], \quad \operatorname{Var}[\tilde{\sigma}_y] < \operatorname{Var}[\hat{\sigma}_y].$$
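A natural candidate (again under the illustrative zero-mean Gaussian assumption) is the *joint* MLE restricted to the constraint: maximize $p_X(X_1, 2\sigma)\,p_Y(Y_1, \sigma)$ over a single $\sigma$, then set $\tilde{\sigma}_y = \sigma$, $\tilde{\sigma}_x = 2\sigma$. In the Gaussian case this has the closed form $\tilde{\sigma}_y^2 = (X_1^2/4 + Y_1^2)/2$. A Monte Carlo sketch suggests the constrained estimators do have smaller variance:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma_y = 1.0          # true parameter (chosen arbitrarily for the demo)
sigma_x = 2 * sigma_y  # the known constraint

n_trials = 100_000
X1 = rng.normal(0.0, sigma_x, n_trials)  # one observation of X per trial
Y1 = rng.normal(0.0, sigma_y, n_trials)  # one observation of Y per trial

# Unconstrained MLEs from a single zero-mean Gaussian observation: |X1|, |Y1|.
hat_x, hat_y = np.abs(X1), np.abs(Y1)

# Constrained joint MLE: maximize p_X(X1, 2s) * p_Y(Y1, s) over s, which for
# zero-mean Gaussians gives the closed form s^2 = (X1^2/4 + Y1^2) / 2.
tilde_y = np.sqrt((X1**2 / 4 + Y1**2) / 2)
tilde_x = 2 * tilde_y

print("Var[hat_y]  =", np.var(hat_y), " Var[tilde_y] =", np.var(tilde_y))
print("Var[hat_x]  =", np.var(hat_x), " Var[tilde_x] =", np.var(tilde_x))
```

In this Gaussian sketch the constrained variances come out smaller, consistent with the inequalities above, but I do not know whether this construction is optimal, or standard, in general.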
How would one go about doing this? If this is a standard problem in statistics, references are very welcome.
NB: I am aware that maximum likelihood estimators are not always the most efficient estimators; they are cited here only as an example to motivate the question.