Derived parameter instead of parameter estimation


In a statistical model $$ (\mathcal X,\mathcal F, (P_{\theta})_{\theta\in\Theta}) $$

we call $$ \varphi(\theta)\in\mathbb R^d $$

a derived parameter (a literal translation from German; source: Methoden der Statistik, p. 18). In this definition, the estimator does not estimate the parameter $\theta$ itself but a function of it, $\varphi(\theta)$, and an estimator $\hat{\varphi}$ maps not into the parameter space $\Theta$ but into the image $\varphi(\Theta)$. If we take $\varphi\equiv \operatorname{id}_{\mathbb R^d}$, then the derived parameter and the parameter coincide.

The authors claim that this is a very convenient way to get rid of computational problems.

For example, suppose we have $$ (X_i)_{i=1}^n \text{ an i.i.d. sample},\quad X_1\sim N(\mu, \sigma^2), $$

so $(\mu, \sigma^2) \in \mathbb R \times \mathbb R^+ =: \Theta$.

Now if we are only interested in $\mu$, we take the function $$\varphi(\theta)=\varphi(\mu, \sigma^2)=\mu,$$ and we no longer have to bother with the variance: we would simply take the arithmetic mean as an estimator for $\varphi(\theta)$ (it would not be an estimator for $\theta$, since it maps into $\mathbb R$ rather than into $\Theta\subseteq\mathbb R^2$).
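To make the example concrete, here is a minimal simulation sketch (variable names and the chosen values of $\mu$ and $\sigma$ are illustrative, not from the source): we draw an i.i.d. $N(\mu,\sigma^2)$ sample and estimate the derived parameter $\varphi(\mu,\sigma^2)=\mu$ with the arithmetic mean, ignoring $\sigma^2$ entirely.

```python
import random
import statistics

random.seed(0)

# True parameter theta = (mu, sigma^2) in Theta = R x R^+
mu, sigma = 2.0, 1.5

# i.i.d. sample X_1, ..., X_n ~ N(mu, sigma^2)
sample = [random.gauss(mu, sigma) for _ in range(10_000)]

# Estimator of the derived parameter phi(theta) = mu:
# it maps into phi(Theta) = R, not into Theta itself,
# and never needs to touch sigma^2.
mu_hat = statistics.mean(sample)
print(mu_hat)
```

With a sample of this size, `mu_hat` lands close to the true $\mu$; the point is only that the estimator targets $\varphi(\theta)$, not the full pair $(\mu,\sigma^2)$.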

Question: Is defining a derived parameter like this common practice? I have never encountered it before and could not find any further sources.

In the example above, if we were only interested in the mean, we could instead adapt the statistical model itself to achieve mathematical rigor.