Equivalent estimators and transformation of joint density


I am currently reading a paper about acceptance sampling (Sampling Plans for Inspection by Variables, G. Lieberman & G. Resnikoff), and I have come across some claims that I am not sure how to show or argue. I will first give some context before describing where my problems arise.

$\textbf{Setting the stage}$

Let $x_1,x_2,\ldots,x_n$ be an i.i.d. sample with $x_i\sim N(\mu,\sigma^2)$. We wish to estimate

$$p=1-\int_{L}^{U}\frac{1}{\sqrt{2\pi}\sigma}\exp\left(-\frac{1}{2\sigma^2}(z-\mu)^2\right)\ dz,$$

where the mean $\mu$ is $\textbf{unknown}$, and the variance $\sigma^2$ is $\textbf{known}$. $L$ and $U$ are simply some given lower and upper limits.

The authors proceed to consider the unbiased estimate $\tilde{p}'=a/n$, where $a$ is the number of defectives (observations with $x\not\in [L,U]$) and $n$ is the sample size.

They then mention that $\hat{p}=\mathbb{E}[\tilde{p}'\ |\ T]$ is the unique minimum variance unbiased estimate of $p$, where $T$ is the sufficient statistic for $\mu$ (here $T=\bar{x}$, since $\sigma^2$ is known); they reference an article that proves this.

$\textbf{The first problem}$

The authors then claim that $\hat{p}$ is equivalent to $\mathbb{E}[\tilde{p}\ |\ T]$, where $\tilde{p}$ is defined as follows: Let $y$ denote any observation in the sample, for instance $x_1$ (this will be considered throughout). Then

$$\tilde{p}(y)=\begin{cases}0,& L\leq y\leq U\\ 1,& y\not\in [L,U]\end{cases}.$$

I am not exactly sure how they are equivalent. What does it mean for two estimators to be equivalent? I have tried calculating the expectations of $\tilde{p}'$ and $\tilde{p}$, and both equal $p$, but I don't think that alone is a sufficient argument.

$\textbf{The second problem}$

Later in the paper, the authors consider the joint density of $y$ and $\bar{x}'$

$$g(y,\bar{x}')=\frac{\sqrt{n-1}}{2\pi\sigma^2}\exp\left(-\frac{1}{2\sigma^2}\left[(y-\mu)^2+(n-1)(\bar{x}'-\mu)^2\right]\right),$$

where $\bar{x}'=\frac{1}{n-1}\sum_{i=2}^{n}x_i$ (I believe they use $\bar{x}'$ rather than $\bar{x}$ so that $y$ and $\bar{x}'$ are independent, which $y$ and $\bar{x}$ are not). I have also managed to derive this.
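As a sanity check (my own, not from the paper), the independence of $y$ and $\bar{x}'$ can be verified numerically: the joint density $g(y,\bar{x}')$ above should factor exactly into the product of the $N(\mu,\sigma^2)$ marginal of $y$ and the $N(\mu,\sigma^2/(n-1))$ marginal of $\bar{x}'$. A minimal sketch with arbitrarily chosen test values:

```python
# Spot-check (not from the paper): g(y, x_bar') equals the product of the
# marginal densities of y ~ N(mu, sigma^2) and x_bar' ~ N(mu, sigma^2/(n-1)),
# which is exactly the independence of y and x_bar'.
import math

def normal_pdf(x, mu, var):
    """Density of N(mu, var) at x."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def g_joint(y, xbar_p, mu, sigma, n):
    """Joint density of (y, x_bar') as given in the paper."""
    c = math.sqrt(n - 1) / (2 * math.pi * sigma ** 2)
    q = (y - mu) ** 2 + (n - 1) * (xbar_p - mu) ** 2
    return c * math.exp(-q / (2 * sigma ** 2))

# Arbitrary test values (my own choice, purely illustrative).
y, xbar_p, mu, sigma, n = 0.7, -0.4, 0.1, 1.3, 6
product = normal_pdf(y, mu, sigma ** 2) * normal_pdf(xbar_p, mu, sigma ** 2 / (n - 1))
assert abs(g_joint(y, xbar_p, mu, sigma, n) - product) < 1e-12
```

The constants also match algebraically: $\frac{1}{\sqrt{2\pi}\sigma}\cdot\frac{\sqrt{n-1}}{\sqrt{2\pi}\sigma}=\frac{\sqrt{n-1}}{2\pi\sigma^2}$, the normalizing constant of $g(y,\bar{x}')$.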

They now introduce the transformation (which I can see is valid)

$$\bar{x}=\frac{(n-1)\bar{x}'}{n}+\frac{y}{n}$$

from which they arrive at the joint density for $y$ and $\bar{x}$ using the above transformation

$$g(y,\bar{x})=\frac{n}{2\pi\sigma^2\sqrt{n-1}}\exp\left(-\frac{n}{2\sigma^2}\left[(\bar{x}-\mu)^2+\frac{(y-\bar{x})^2}{n-1}\right]\right).$$

I have tried for a while to apply the transformation, but I cannot obtain this expression. My approach has simply been to rewrite the exponent (the square brackets) inside the exponential using the transformation above. The closest I have gotten is to transform the exponent into the following:

$$(y-\mu)^2+(n-1)(\bar{x}'-\mu)^2\rightarrow (y-\mu)^2+\left(\frac{n\bar{x}-y}{\sqrt{n-1}}-\sqrt{n-1}\mu\right)^2,$$

but after this I have gotten stuck. Any hints are highly appreciated.

If you need additional information, please let me know.

Thanks in advance!

$\textbf{Edit:}$

I now have been able to achieve

$$\frac{\sqrt{n-1}}{2\pi\sigma^2}\exp\left(-\frac{n}{2\sigma^2}\left[(\bar{x}-\mu)^2+\frac{(y-\bar{x})^2}{n-1}\right]\right).$$

I had then forgotten that when changing variables of a probability density function one must include the Jacobian factor: $f_y(y)=f_x(x)\left|\frac{\partial x}{\partial y}\right|$, which in my case yields

\begin{align*} g(y,\bar{x})&=g(y,\bar{x}')\left|\frac{\partial \bar{x}'}{\partial \bar{x}}\right| \\ &=\frac{\sqrt{n-1}}{2\pi\sigma^2}\exp\left(-\frac{1}{2\sigma^2}\left[(y-\mu)^2+(n-1)(\bar{x}'-\mu)^2\right]\right)\frac{n}{n-1} \\ &= \frac{n}{2\pi\sigma^2\sqrt{n-1}}\exp\left(-\frac{n}{2\sigma^2}\left[(\bar{x}-\mu)^2+\frac{(y-\bar{x})^2}{n-1}\right]\right). \end{align*}
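A quick numerical spot-check (my own, not from the paper) confirms that the two density expressions agree once the Jacobian factor $n/(n-1)$ is included. The test values below are arbitrary:

```python
# Spot-check (not from the paper): with x_bar = ((n-1)*x_bar' + y)/n, the
# transformed density g(y, x_bar) equals g(y, x_bar') times the Jacobian
# factor n/(n-1).
import math

def g_prime(y, xbar_p, mu, sigma, n):
    """Joint density of (y, x_bar') from the paper."""
    c = math.sqrt(n - 1) / (2 * math.pi * sigma ** 2)
    q = (y - mu) ** 2 + (n - 1) * (xbar_p - mu) ** 2
    return c * math.exp(-q / (2 * sigma ** 2))

def g(y, xbar, mu, sigma, n):
    """Transformed joint density of (y, x_bar)."""
    c = n / (2 * math.pi * sigma ** 2 * math.sqrt(n - 1))
    q = (xbar - mu) ** 2 + (y - xbar) ** 2 / (n - 1)
    return c * math.exp(-n * q / (2 * sigma ** 2))

# Arbitrary test values (my own choice, purely illustrative).
y, xbar_p, mu, sigma, n = 0.7, -0.4, 0.1, 1.3, 6
xbar = ((n - 1) * xbar_p + y) / n
lhs = g(y, xbar, mu, sigma, n)
rhs = g_prime(y, xbar_p, mu, sigma, n) * n / (n - 1)
assert abs(lhs - rhs) < 1e-12
```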

$\textbf{Answer}$

For the first problem, I think they are simply saying that the two estimators are equal.

Bear in mind, this is not saying that $\tilde p' = \tilde p$, but that their conditional expectations with respect to the sufficient statistic $T = \dfrac 1n \sum_{i=1}^n x_i$ coincide.

Actually, showing that their expectations are both $p$ is enough: it shows that they are unbiased estimators. By the Lehmann–Scheffé theorem, the conditional expectation of an unbiased estimator with respect to a complete sufficient statistic is the minimum variance unbiased estimator, which is unique. So the two conditional expectations must be the same. (More directly: by symmetry, $\mathbb{E}[\mathbf{1}(x_i\not\in[L,U])\mid T]$ is the same for every $i$, and averaging these $n$ identical quantities gives $\mathbb{E}[\tilde p'\mid T]=\mathbb{E}[\tilde p\mid T]$.)
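To illustrate (a sketch of my own, not from the paper or the answer above): with $\sigma^2$ known and $T=\bar x$, one has $x_1\mid\bar x\sim N(\bar x,\sigma^2(1-1/n))$, so $\mathbb{E}[\tilde p\mid\bar x]=1-\left[\Phi\!\left(\frac{U-\bar x}{s}\right)-\Phi\!\left(\frac{L-\bar x}{s}\right)\right]$ with $s=\sigma\sqrt{1-1/n}$. A small Monte Carlo check confirms that both $a/n$ and this Rao–Blackwellized estimator are (nearly) unbiased for $p$, and that the latter has the smaller variance:

```python
# Hypothetical sanity check, not from the paper.  With sigma known, T = x_bar
# and x_1 | x_bar ~ N(x_bar, sigma^2 * (1 - 1/n)), so E[p~ | x_bar] has the
# closed form 1 - [Phi((U - x_bar)/s) - Phi((L - x_bar)/s)], s = sigma*sqrt(1 - 1/n).
# We check by Monte Carlo that a/n and this Rao-Blackwellized estimator are
# both (nearly) unbiased and that the latter has smaller variance.
import math
import random

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def simulate(mu=0.3, sigma=1.0, L=-1.0, U=1.0, n=10, reps=20000, seed=1):
    random.seed(seed)
    p_true = 1.0 - (phi((U - mu) / sigma) - phi((L - mu) / sigma))
    s = sigma * math.sqrt(1.0 - 1.0 / n)  # sd of x_1 given x_bar
    naive, rb = [], []
    for _ in range(reps):
        xs = [random.gauss(mu, sigma) for _ in range(n)]
        xbar = sum(xs) / n
        naive.append(sum(1 for x in xs if not (L <= x <= U)) / n)  # a/n
        rb.append(1.0 - (phi((U - xbar) / s) - phi((L - xbar) / s)))

    def mean(v):
        return sum(v) / len(v)

    def var(v):
        m = mean(v)
        return sum((x - m) ** 2 for x in v) / len(v)

    return p_true, mean(naive), mean(rb), var(naive), var(rb)

p_true, m_naive, m_rb, v_naive, v_rb = simulate()
```

The variance reduction is exactly what Rao–Blackwell guarantees: conditioning an unbiased estimator on a sufficient statistic never increases variance.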