Derivation of pmf from convolution


Suppose that a discrete random variable (with finite support) $Y$ is given by $Y = X_1 - X_2$, where $X_1$ and $X_2$ are both discrete random variables with finite support and with the same probability mass function. Is it possible to determine the pmf of $X_1$ from the pmf of $Y$?


BEST ANSWER

I will consider the case when $X_1$ and $X_2$ are independent identically distributed and integer-valued.

The following properties hold:

  1. Since $Y$ is discrete with finite support, its probability generating function is a polynomial in $z$ and $z^{-1}$ (a Laurent polynomial) with non-negative coefficients: $$ Z_Y(z) = \sum_{k=m(Y)}^{n(Y)} \mathbb{P}(Y=k) z^k $$
  2. Let $X$ be the purported solution with $Z_X(z) = \sum_{k=m(X)}^{n(X)} \mathbb{P}(X=k) z^k$.

  3. Since $Y = X_1-X_2$ with $X_1$, $X_2$ i.i.d., $Z_Y(z) = Z_{X}(z) Z_{X}(z^{-1})$, because $Z_{-X_2}(z) = Z_{X_2}(z^{-1})$.

The problem you are posing is whether, given $Z_Y(z)$, one can determine $Z_X(z)$ such that $Z_Y(z) = Z_X(z) Z_X(z^{-1})$. Observe that the solution, if it exists, is not unique, since $Z_{X^\prime}(z) = z^k Z_{X}(z)$ is also a solution for arbitrary $k \in \mathbb{Z}$ (shifting $X$ by a constant leaves $Y$ unchanged).

A necessary condition for a solution to exist is that $Z_Y(z)$ satisfy $Z_{Y}(z) = Z_Y(z^{-1})$, i.e. the pmf of $Y$ must be symmetric about zero. Assuming the given $Z_Y(z)$ has this property, finding $Z_X(z)$ reduces to polynomial factorization.
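As a numerical sanity check of item 3 and the symmetry condition, here is a short sketch (the pmf for $X$ is an arbitrary choice for illustration): the pmf of $Y = X_1 - X_2$ is the convolution of the pmf of $X$ with its reversal, which is symmetric by construction.

```python
import numpy as np

# An illustrative pmf for X on support {0, 1, 2} (arbitrary choice).
p_x = np.array([0.5, 0.3, 0.2])

# pmf of Y = X1 - X2 for i.i.d. X1, X2: convolve p_X with its reversal.
# The resulting support of Y is {-(n-1), ..., n-1} with n = len(p_x).
p_y = np.convolve(p_x, p_x[::-1])

print(p_y)                            # pmf of Y on support {-2, ..., 2}
assert np.allclose(p_y, p_y[::-1])    # symmetry: P(Y = k) = P(Y = -k)
assert np.isclose(p_y.sum(), 1.0)     # a valid pmf
```

Here $\mathbb{P}(Y=0) = 0.5^2 + 0.3^2 + 0.2^2 = 0.38$, the middle entry of `p_y`.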

ANSWER

No. If $X_1$ and $X_2$ are both identically equal to the same constant $c$, then $Y$ is identically equal to zero, so the pmf of $Y$ carries no information about $c$.

ANSWER

The answer below is no longer relevant since the OP now says that $X_1$ and $X_2$ take on strictly positive integer values and are also independent.


If you don't like Byron Schmuland's answer with degenerate random variables, here is an example with Bernoulli random variables showing that the information you have is insufficient to determine the identical marginal distributions of $X_1$ and $X_2$. More information about the joint distribution is needed.

Let $Y$ have pmf $p_Y(-1) = p_Y(+1) = 0.2$, $p_Y(0) = 0.6$; $X_1$ and $X_2$ are then Bernoulli random variables. The two joint distributions below are both consistent with this $Y$. In each case $X_1$ and $X_2$ share the same marginal distribution, but that common marginal differs between the two cases:

$$p(0,1) = p(1,0) = 0.2,\quad p(0,0) = p(1,1) = 0.3 \;\Rightarrow\; X_1, X_2 \sim \text{Bernoulli}(0.5)$$

$$p(0,1) = p(1,0) = 0.2,\quad p(0,0) = 0.4,\ p(1,1) = 0.2 \;\Rightarrow\; X_1, X_2 \sim \text{Bernoulli}(0.4)$$
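The two joint tables above can be checked directly; this sketch (variable names are my own) verifies that both yield the same pmf for $Y$ while their marginals differ.

```python
import numpy as np

# The two joint distributions from the answer, as 2x2 arrays indexed [x1, x2].
joint_a = np.array([[0.3, 0.2],   # case 1: marginals Bernoulli(0.5)
                    [0.2, 0.3]])
joint_b = np.array([[0.4, 0.2],   # case 2: marginals Bernoulli(0.4)
                    [0.2, 0.2]])

def pmf_of_y(joint):
    """Return (P(Y=-1), P(Y=0), P(Y=1)) for Y = X1 - X2."""
    return (joint[0, 1], joint[0, 0] + joint[1, 1], joint[1, 0])

# Same pmf of Y in both cases ...
assert np.allclose(pmf_of_y(joint_a), [0.2, 0.6, 0.2])
assert np.allclose(pmf_of_y(joint_b), [0.2, 0.6, 0.2])
# ... but different marginals for X1 (summing over x2).
assert np.allclose(joint_a.sum(axis=1), [0.5, 0.5])
assert np.allclose(joint_b.sum(axis=1), [0.6, 0.4])
```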

In fact, for the given pmf of $Y$, we can make $X_1, X_2 \sim \text{Bernoulli}(p)$ for any $p \in [0.2, 0.8]$: take $p(0,1) = p(1,0) = 0.2$, $p(1,1) = p - 0.2$, and $p(0,0) = 0.8 - p$; the endpoints of the interval are exactly where these entries remain non-negative.
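The whole family can be checked in one loop; this sketch uses the construction $p(1,1) = p - 0.2$, $p(0,0) = 0.8 - p$ suggested by the interval $[0.2, 0.8]$.

```python
import numpy as np

# For each p in [0.2, 0.8], build a joint with Bernoulli(p) marginals
# that still produces p_Y(-1) = p_Y(1) = 0.2, p_Y(0) = 0.6.
for p in np.linspace(0.2, 0.8, 7):
    joint = np.array([[0.8 - p, 0.2],
                      [0.2,     p - 0.2]])       # indexed [x1, x2]
    assert np.all(joint >= -1e-12)               # a valid joint pmf
    assert np.isclose(joint.sum(), 1.0)
    assert np.isclose(joint.sum(axis=1)[1], p)   # P(X1 = 1) = p
    assert np.isclose(joint.sum(axis=0)[1], p)   # P(X2 = 1) = p
    # pmf of Y = X1 - X2: (P(Y=-1), P(Y=0), P(Y=1))
    y = (joint[0, 1], joint[0, 0] + joint[1, 1], joint[1, 0])
    assert np.allclose(y, [0.2, 0.6, 0.2])
```

Outside $[0.2, 0.8]$ one of the diagonal entries would go negative, so no valid joint exists with those marginals.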