Determining sufficient statistic for single random variable


I just need a hint to tackle this problem:

So, we are given a random variable X with pdf: $f_X(x;\theta) = \frac{1}{2\theta}$ for $-\theta < x < \theta$, and zero otherwise.

Then, we are asked whether $|X|$ is a sufficient statistic for $\theta$.

My thoughts: First, I'm confused, since for the MLE, for example, we usually deal with $n$ samples of $X$; however, I'm convincing myself that we may have $n=1$, so it's not a big deal — we can still estimate $\theta$. What follows, then, is finding the MLE for this particular likelihood. Since this is a uniform distribution $\mathcal{U}(-\theta, \theta)$, I know the maximizing value comes from looking at the support of the pdf, not from the usual derivative-based method.

Therefore, considering that we want to maximize $\frac{1}{2\theta}$, we can choose the estimate of $\theta$ to be some value of $X$. To maximize that ratio, we might choose the smallest possible value of $X$, which is $-\theta$, but that gives a negative number. The next smallest non-negative value would be zero, but that blows up $\frac{1}{2\theta}$.

So I'm kind of stuck and confused from here... any help?

Thanks!


While sufficiency has implications for MLEs, it is best to think about sufficiency directly, without worrying about the MLE. The simplest approach is to see whether you can decompose your sampling density to meet the requirements of the Fisher–Neyman factorization theorem. From your stated distribution you have the density function:

$$f_\theta(x) = \frac{\mathbb{I}(-\theta < x < \theta)}{2 \theta} = \frac{\mathbb{I}(|x| < \theta)}{2 \theta} = g_\theta(|x|).$$
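Spelled out in the factorization theorem's standard form $f_\theta(x) = g_\theta(T(x))\,h(x)$, the pieces here are

$$T(x) = |x|, \qquad g_\theta(t) = \frac{\mathbb{I}(t < \theta)}{2\theta}, \qquad h(x) = 1,$$

so the entire dependence on $\theta$ passes through $T(x) = |x|$.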

This establishes the requirements of the theorem, which proves sufficiency. The implication for the MLE is that it will be a function of $x$ only through $|x|$. Indeed, since the likelihood $\frac{1}{2\theta}\mathbb{I}(\theta > |x|)$ is zero for $\theta \le |x|$ and strictly decreasing in $\theta$ on $(|x|, \infty)$, the MLE is $\hat\theta = |x|$.
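As an informal sanity check (not part of the proof), here is a small simulation of what sufficiency means operationally: conditional on the value of $|X|$, the leftover randomness — the sign of $X$ — is a fair coin flip no matter what $\theta$ is, so it carries no further information about $\theta$. (The function name and the bin $(0.2, 0.3)$ are arbitrary choices for illustration.)

```python
import random

def frac_positive_given_absx(theta, lo, hi, n=200_000, seed=1):
    """Simulate X ~ Uniform(-theta, theta) and, among the draws with
    lo < |X| < hi, return the fraction that are positive.  If |X| is
    sufficient for theta, this conditional frequency should be ~1/2
    regardless of theta."""
    rng = random.Random(seed)
    pos = tot = 0
    for _ in range(n):
        x = rng.uniform(-theta, theta)
        if lo < abs(x) < hi:
            tot += 1
            pos += x > 0
    return pos / tot

# Conditional on |X| landing in (0.2, 0.3), the sign is a fair coin flip
# for every theta tried -- all fractions come out near 0.5.
for theta in (0.5, 1.0, 4.0):
    print(theta, round(frac_positive_given_absx(theta, 0.2, 0.3), 2))
```

Changing `theta` rescales how often $|X|$ falls in the bin, but never tilts the conditional sign frequency away from $1/2$ — exactly what the factorization with $h(x)=1$ predicts.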