I invented a fun exercise:
Suppose X and Y are discrete independent random variables over $\mathbb{N}$ such that
$$\forall a, b \in \mathbb{N}, ~~\mathbb{P}(X*Y = a*b) = \mathbb{P}(X=a) *\mathbb{P}(Y=b)$$
It is possible to show that $X = Y = 1$ almost surely.
Now, I'm trying to see whether this holds if I remove the independence assumption.
One can show:
- $f(x) := \mathbb{P}(X=x) = \mathbb{P}(Y=x)$
- $f(1) \neq 0$
- $g(x) := \frac{f(x)}{f(1)}$ is completely multiplicative
- $0 < \sum_{i \in \mathbb{N}} g(i) = \prod_{p \text{ prime}} \frac1{1-g(p)}= \frac1{f(1)} < \infty$
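The Euler-product identity in the last bullet can be sanity-checked numerically for one concrete completely multiplicative $g$. The sketch below (the helper names and the choice $g(n) = n^{-s}$ with $s = 2$ are my own, not from the post) compares the direct sum with the truncated product over primes; both should approach $\zeta(2) = \pi^2/6$.

```python
# Sanity check of  sum_{n >= 1} g(n) = prod_{p prime} 1/(1 - g(p))
# for the completely multiplicative choice g(n) = n**(-s), s > 1.

def primes_up_to(n):
    """Sieve of Eratosthenes: all primes <= n."""
    sieve = [True] * (n + 1)
    sieve[0:2] = [False, False]
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p :: p] = [False] * len(sieve[p * p :: p])
    return [p for p, is_prime in enumerate(sieve) if is_prime]

def check_euler_product(s, N=200_000):
    g = lambda n: n ** (-s)
    direct = sum(g(n) for n in range(1, N + 1))   # sum_n g(n), truncated at N
    product = 1.0
    for p in primes_up_to(N):
        product /= 1.0 - g(p)                     # prod_p 1/(1 - g(p))
    return direct, product

direct, product = check_euler_product(s=2.0)
print(direct, product)  # both close to zeta(2) = pi^2/6 ~ 1.6449
```

For $s = 2$ both truncations agree to several decimal places, as the identity predicts for any completely multiplicative $g$ with a convergent sum.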
I didn't find any nice properties of the joint distribution $\mathbb{P}(X=x \land Y=y)$.
Can you find examples other than $X=Y=1$ or prove they don't exist?
EDIT: thank you @Thomas Andrews for pointing out that we need $X,Y \in \mathbb{N}$.
The only such random variables are still identically 1.
Since $X \cdot Y$ takes values in $\mathbb{N}$, the events $\{X \cdot Y = a\}$ for $a \in \mathbb{N}$ partition the sample space, so
\begin{align*}
1 &= \sum_{a \in \mathbb{N}} \mathbb{P}(X\cdot Y = a \cdot 1)\\
&= \sum_{a \in \mathbb{N}} \mathbb{P}(X=a)\,\mathbb{P}(Y=1)\\
&= \mathbb{P}(Y=1).
\end{align*}
Similarly, switching the roles of $X$ and $Y$,
\begin{align*}
1 &= \sum_{a \in \mathbb{N}} \mathbb{P}(X\cdot Y = 1\cdot a)\\
&= \sum_{a \in \mathbb{N}} \mathbb{P}(X=1)\,\mathbb{P}(Y=a)\\
&= \mathbb{P}(X=1).
\end{align*}
This assumes Thomas Andrews's point that $a$ and $b$ must range over $\mathbb{N}$, not $\mathbb{Z}$.
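To see concretely why dependence doesn't rescue the question's candidates: a natural dependent choice meeting all four bullet points is $X = Y$ with the zeta distribution $f(n) = n^{-s}/\zeta(s)$. The quick check below (the names `f` and `lhs` and the choice $s = 2$ are my own illustration, not from the post) shows the displayed condition still fails, consistent with the answer above.

```python
import math

s = 2.0
zeta_s = math.pi ** 2 / 6            # zeta(2)
f = lambda n: n ** (-s) / zeta_s     # common marginal pmf of X and Y

def lhs(a, b):
    """P(X*Y = a*b) when Y = X: the event {X**2 = a*b} needs a*b square."""
    r = math.isqrt(a * b)
    return f(r) if r * r == a * b else 0.0

a, b = 2, 3
print(lhs(a, b))        # 0.0  -- 6 is not a perfect square
print(f(a) * f(b) > 0)  # True -- so the condition fails at (a, b) = (2, 3)
```

Even at pairs where $a \cdot b$ is a square the condition fails, e.g. $a = b = 2$ gives $\mathbb{P}(X \cdot Y = 4) = f(2) \neq f(2)^2$.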