I've been working through problems for my statistics class, but I've been stuck on this one for a while now:
This is from Hogg, 7th edition:
Let $Y_1 < Y_2$ be the order statistics of a random sample of size 2 from a distribution of the continuous type with pdf $f(x)$ such that $f(x) > 0$ for $x \geq 0$, and $f(x) = 0$ elsewhere. Show that the independence of $Z_1 = Y_1$ and $Z_2 = Y_2 - Y_1$ characterizes the gamma pdf $f(x)$ with parameters $\alpha = 1$ and $\beta > 0$. That is, show that $Z_1$ and $Z_2$ are independent if and only if $f(x)$ is the pdf of a $\Gamma(1, \beta)$ distribution.
Hint: Use the change-of-variable technique to find the joint pdf of $Z_1$ and $Z_2$ from that of $Y_1$ and $Y_2$. Accept the fact that the functional equation $h(0)h(x + y) \equiv h(x)h(y)$ has the solution $h(x) = c_1 e^{c_2 x}$, where $c_1$ and $c_2$ are constants.
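(Not part of the textbook hint, just a sanity check I ran: the claimed solution $h(x) = c_1 e^{c_2 x}$ really does satisfy $h(0)h(x+y) = h(x)h(y)$; the constants below are arbitrary.)

```python
import math
import random

def h(x, c1=2.5, c2=-0.7):
    # candidate solution of the functional equation h(0)h(x+y) = h(x)h(y);
    # c1, c2 are arbitrary constants chosen for the check
    return c1 * math.exp(c2 * x)

random.seed(0)
for _ in range(1000):
    x, y = random.uniform(0, 10), random.uniform(0, 10)
    # h(0)h(x+y) = c1^2 e^{c2(x+y)} = (c1 e^{c2 x})(c1 e^{c2 y}) = h(x)h(y)
    assert math.isclose(h(0) * h(x + y), h(x) * h(y), rel_tol=1e-12)
print("functional equation holds for h(x) = c1*exp(c2*x)")
```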
I need help understanding the hint. I mean, why is it that $h(0)$ plays a role in this problem?
So far, I'm able to obtain the joint PDF:
$h(z_1, z_2) = 2 f(z_1 + z_2) f(z_1)$, for $z_1 > 0$, $z_2 > 0$.
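As a numerical sanity check of the "if" direction (my addition, not part of the problem): taking $f$ to be the $\Gamma(1,\beta)$ (exponential) pdf, $Z_1$ and $Z_2$ should come out independent, with $Z_1$ exponential of mean $\beta/2$ and $Z_2$ exponential of mean $\beta$ (standard facts about exponential spacings). A quick Monte Carlo sketch with NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
beta = 2.0          # scale parameter of the Gamma(1, beta) = exponential pdf
n = 10**6

# draw samples of size 2 and form the order statistics Y1 < Y2
sample = rng.exponential(scale=beta, size=(n, 2))
y1 = sample.min(axis=1)
y2 = sample.max(axis=1)
z1, z2 = y1, y2 - y1

# exponential case: Z1 ~ Exp(mean beta/2), Z2 ~ Exp(mean beta),
# and Z1, Z2 are independent, so their correlation should be ~0
print(f"E[Z1] ~ {z1.mean():.3f} (theory {beta / 2})")
print(f"E[Z2] ~ {z2.mean():.3f} (theory {beta})")
print(f"corr(Z1, Z2) ~ {np.corrcoef(z1, z2)[0, 1]:.4f} (theory 0)")
```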
Also, given the characterization by the $\Gamma(1, \beta)$ distribution, I think that $f(x) = \frac{1}{\beta} e^{-x/\beta}$, since Hogg uses $\beta$ as a scale parameter.
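(Side note, my own check: Hogg parameterizes the gamma with $\beta$ as a scale, so the $\Gamma(1,\beta)$ density is $\frac{1}{\beta}e^{-x/\beta}$, i.e. rate $1/\beta$. Plugging $\alpha = 1$ into the general gamma density $\frac{1}{\Gamma(\alpha)\beta^\alpha}x^{\alpha-1}e^{-x/\beta}$ confirms this:)

```python
import math

def gamma_pdf(x, alpha, beta):
    # general Gamma(alpha, beta) density, beta = scale (Hogg's parameterization)
    return x**(alpha - 1) * math.exp(-x / beta) / (math.gamma(alpha) * beta**alpha)

beta = 2.0
for x in [0.1, 0.5, 1.0, 3.0, 7.0]:
    # at alpha = 1 the density collapses to (1/beta) * exp(-x/beta)
    assert math.isclose(gamma_pdf(x, 1.0, beta), math.exp(-x / beta) / beta)
print("Gamma(1, beta) is the exponential density with mean beta")
```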
Now, I just cannot understand how the hint may be used.
Assume that $Z_1$ and $Z_2$ are independent, with pdfs $g_1$ and $g_2$ respectively. In your notation, this means that $h(x,y)=g_1(x)g_2(y)$, that is, $$2f(x+y)f(x)=g_1(x)g_2(y)\tag{1}$$ In particular, for $x=0$, $(1)$ reads $$2f(y)f(0)=g_1(0)g_2(y)$$ hence $g_2$ is proportional to $f$; since both are pdfs integrating to $1$, $$f=g_2\tag{2}$$ Coming back to $(1)$, $(2)$ yields $$2f(x+y)f(x)=g_1(x)f(y)$$ In particular, for $x=y=0$, after dividing by $f(0)>0$, $$2f(0)=g_1(0)$$ Using $(1)$ again, but this time for $y=0$, $(2)$ yields $$g_1(x)f(0)=2f(x)^2$$ thus $(1)$ now implies $$2f(x+y)f(x)f(0)=(g_1(x)f(0))g_2(y)=2f(x)^2f(y)$$ Since $f(x)>0$ for $x\geq 0$, dividing both sides by $2f(x)$ gives $$f(x+y)f(0)=f(x)f(y)$$ as desired. This is exactly the functional equation of the hint, with $h=f$, hence $f(x)=c_1e^{c_2x}$. Since $f$ must be a pdf on $[0,\infty)$, one needs $c_2<0$ and $c_1=-c_2$; writing $c_2=-1/\beta$ gives $f(x)=\frac{1}{\beta}e^{-x/\beta}$, the $\Gamma(1,\beta)$ pdf.
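To see the "only if" direction in action numerically (my addition): when $f$ is not exponential, the spacings become dependent. For instance, with $f$ uniform on $(0,1)$, a short calculation with the moments of uniform order statistics gives $\operatorname{corr}(Z_1, Z_2) = -1/2$ rather than $0$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10**6

# sample of size 2 from Uniform(0,1) -- NOT exponential
sample = rng.uniform(size=(n, 2))
y1 = sample.min(axis=1)
z1, z2 = y1, sample.max(axis=1) - y1

# for Uniform(0,1) the exact correlation of (Z1, Z2) is -1/2,
# so Z1 and Z2 are clearly dependent here
corr = np.corrcoef(z1, z2)[0, 1]
print(f"corr(Z1, Z2) ~ {corr:.3f} (theory -0.5)")
```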