Let $x_1, x_2, \ldots, x_n$ be a random sample from the Gamma distribution $\varGamma(1,\theta)$. How can one prove that $\sum_{i=1}^n x_i$ is a complete statistic?
What I have done so far: the sum of the sample follows $\varGamma(n,\theta)$, so letting $t=\sum_{i=1}^n x_i$, the completeness condition reads
$$\operatorname E_\theta(g(t))=\int_0^\infty g(t)\frac{\theta^n}{\varGamma(n)}t^{n-1}e^{-\theta t} \, dt = 0 \quad \text{for all } \theta>0.$$
The problem is that I don't know how to show that this expectation being $0$ for every $\theta$ implies $P(g(t)=0)=1$.
Could anyone help me prove it?
There are two ways to approach this:
1) Note that the Gamma$(1,\theta)$ family is an exponential family of distributions whose natural parameter space contains an open set in $\mathbb{R}$. By the well-known completeness theorem for full-rank exponential families, the sufficient statistic $\sum_i X_i$ is complete.
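To make the exponential-family argument concrete, one can write the $\varGamma(1,\theta)$ density in canonical form:
$$f(x;\theta) = \theta e^{-\theta x} = \exp\bigl(-\theta x + \log\theta\bigr), \qquad x > 0.$$
The natural parameter $\eta = -\theta$ ranges over the open interval $(-\infty, 0)$ as $\theta$ ranges over $(0,\infty)$, and the sufficient statistic for a sample of size $n$ is $T(x) = \sum_{i=1}^n x_i$. Since the natural parameter space has nonempty interior, the theorem applies and $T$ is complete.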
2) From the theory of Laplace transforms, $\int_0^\infty f(t)e^{-st} \, dt = 0$ for all $s$ in an open interval iff $f(t) = 0$ almost everywhere. In your case, $\operatorname E_\theta(g(t)) = 0$ for all $\theta > 0$ means the Laplace transform of $g(t)t^{n-1}$ vanishes identically, so
$$ g(t)t^{n-1} = 0 \Longleftrightarrow g(t) = 0 $$
almost everywhere, since $t^{n-1} > 0$ on $(0,\infty)$. Hence $P(g(t)=0)=1$.
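As an illustration of the contrapositive, here is a small symbolic check (a sketch using sympy, with a hypothetical sample size $n=3$ and test function $g(t)=t-1$): for this nonzero $g$, the expectation $\operatorname E_\theta(g(T)) = n/\theta - 1$ vanishes at a single $\theta$, not for all $\theta$, so $g$ fails the completeness condition as the theory predicts.

```python
import sympy as sp

t, theta = sp.symbols('t theta', positive=True)
n = 3  # hypothetical sample size for illustration

# Density of T = sum of X_i ~ Gamma(n, theta) (rate parametrization)
density = theta**n / sp.gamma(n) * t**(n - 1) * sp.exp(-theta * t)

# Sanity check: the density integrates to 1
assert sp.integrate(density, (t, 0, sp.oo)) == 1

# A nonzero g: g(t) = t - 1.  Its expectation is n/theta - 1, which is
# zero only at theta = n, not for ALL theta > 0 -- so this g is "caught"
g = t - 1
expectation = sp.simplify(sp.integrate(g * density, (t, 0, sp.oo)))
assert sp.simplify(expectation - (sp.Integer(n) / theta - 1)) == 0
```

Only $g \equiv 0$ (almost everywhere) makes the expectation vanish for every $\theta$, which is exactly the completeness statement.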