How to prove that the sum of a sample is a complete statistic for the gamma distribution?


For a random sample $x_1, x_2, \ldots, x_n$ from the Gamma distribution $\varGamma(1,\theta)$, how can one prove that $\sum_{i=1}^n x_i$ is a complete statistic?

What I have done so far: the sum of the sample follows $\varGamma(n,\theta)$. Let $t=\sum_{i=1}^n x_i$; then completeness requires that, for all $\theta > 0$,

$$\operatorname E(g(t))=\int_0^\infty g(t)\frac{\theta^n}{\varGamma(n)}t^{n-1}e^{-\theta t} \, dt = 0$$

The problem is that I don't know how to show that this expectation being $0$ for every $\theta$ implies $P(g(t)=0)=1$.

Could anyone help to prove it?
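For intuition about why the expectation must vanish for *every* $\theta$, here is a small sympy sketch (the choices $n=3$ and $g(t)=t-c$ are mine, purely illustrative): a nonzero $g$ can have zero expectation at one particular $\theta$ without contradicting completeness.

```python
import sympy as sp

t, theta, c = sp.symbols('t theta c', positive=True)
n = 3  # illustrative sample size

# Density of T = sum of n Gamma(1, theta) variables, i.e. Gamma(n, theta)
pdf = theta**n / sp.gamma(n) * t**(n - 1) * sp.exp(-theta * t)

# A nonzero candidate g; its expectation under Gamma(n, theta)
g = t - c
Eg = sp.simplify(sp.integrate(g * pdf, (t, 0, sp.oo)))  # n/theta - c

# Eg vanishes only at theta = n/c, not for all theta, so the
# nonzero function g(t) = t - c does not violate completeness.
```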



There are two ways to approach this:

1) Note that the Gamma$(1,\theta)$ family is an exponential family of distributions, and its natural parameter space contains an open set in $\mathbb{R}$. By the well-known theorem on completeness in full-rank exponential families, the sufficient statistic $\sum_i X_i$ is complete.

2) From the theory of Laplace transforms, $\int_0^\infty f(x)e^{-sx} \, dx = 0$ for all $s$ in an interval if and only if $f(x) = 0$ almost everywhere. In your case, it follows that

$$ g(t)t^{n-1} = 0 \Longleftrightarrow g(t) = 0 $$

almost everywhere.
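To spell out approach 1), the Gamma$(1,\theta)$ density can be written in exponential family form (a standard computation, sketched here):

$$ f(x;\theta) = \theta e^{-\theta x} = \exp\bigl(-\theta x + \log\theta\bigr), \qquad x > 0, $$

with natural parameter $\eta = -\theta$ ranging over $(-\infty, 0)$. This is an open interval, so the family is of full rank and the theorem applies to $T=\sum_i X_i$.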


Suppose $$T = \sum_{i=1}^{n}X_i$$ and $$\operatorname E_\theta[g(T)]=0, \quad \forall\, \theta \in (0, \infty).$$ Then $$\operatorname E_\theta[g(T)]=\int_0^\infty g(t)\frac{\theta^n}{\varGamma(n)}t^{n-1}e^{-\theta t} \, dt = 0,$$ and since the factor $\theta^n/\varGamma(n)$ is positive and does not depend on $t$, $$\int_0^\infty g(t)t^{n-1}e^{-\theta t} \, dt = 0, \quad \forall\, \theta \in (0, \infty).$$

The left-hand side is exactly the Laplace transform of $f(t) = g(t)t^{n-1}$ evaluated at $\theta$. By the uniqueness property of the Laplace transform, a function whose transform vanishes on an interval must itself vanish almost everywhere, so

$$ g(t)t^{n-1} = 0 \quad \text{for almost every } t \in (0, \infty).$$

Since $t^{n-1} > 0$ for $t > 0$,

$$ g(t) = 0 \quad \text{for almost every } t \in (0, \infty).$$

Therefore,

$$\Pr[g(T)=0] = 1.$$
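As a numerical aside (the function $\sin t$ is my own illustrative choice), sympy's `laplace_transform` shows the contrapositive of the uniqueness property used above: a nonzero function has a transform that cannot vanish identically.

```python
import sympy as sp

t, s = sp.symbols('t s', positive=True)

# A nonzero function on (0, oo) and its Laplace transform
f = sp.sin(t)
F = sp.laplace_transform(f, t, s, noconds=True)  # 1/(s**2 + 1)

# F is nonzero at every s > 0; by uniqueness, only a function
# that is 0 a.e. can have an identically vanishing transform.
```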