System of integral equations describing probability


During my work on my thesis, I've stumbled upon the following problem:

Let $f_1$ and $f_2$ be arbitrary PDFs with support $\mathbb{R}$. Does there exist a joint bivariate distribution $f_r(x, y)$ such that if $(X, Y) \sim f_r$, the marginal distribution of both $X$ and $Y$ is $f_1$, and the distribution of $X+Y$ is $f_2$?

Writing this problem down as a system of integral equations, I've come up with this:

$$ \begin{cases} f_1(x) = \int_{-\infty}^{\infty} f_r(x, y) dy \\ f_1(y) = \int_{-\infty}^{\infty} f_r(x, y) dx \\ f_2(z) = \int_{-\infty}^{\infty} f_r(t, z-t) dt \end{cases} $$
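As a sanity check of this system, here is a quadrature sketch for the one case where a solution is known in closed form: the independent joint density $f_r(x,y)=f_1(x)f_1(y)$ with $f_1$ standard normal, for which $f_2$ must be the $N(0,2)$ density (the self-convolution of $f_1$). This only illustrates the equations, it is not a solution method; the grid bounds and test points are arbitrary choices.

```python
import numpy as np

def trapz(y, x):
    """Simple trapezoidal rule (avoids version differences around np.trapz)."""
    return np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2

def f1(x):  # standard normal density
    return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

def f2(z):  # N(0, 2) density, the convolution of f1 with itself
    return np.exp(-z**2 / 4) / np.sqrt(4 * np.pi)

t = np.linspace(-10, 10, 2001)

# Marginal equation: f1(x) = ∫ f_r(x, y) dy, with f_r(x, y) = f1(x) f1(y)
x = 1.3
lhs_marginal = trapz(f1(x) * f1(t), t)
assert abs(lhs_marginal - f1(x)) < 1e-6

# Sum equation: f2(z) = ∫ f_r(t, z - t) dt
z = 0.7
lhs_sum = trapz(f1(t) * f1(z - t), t)
assert abs(lhs_sum - f2(z)) < 1e-6
```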

However, I don't have sufficient knowledge of integral equations to make any progress with this system.

If it is not solvable in general, I would be interested in the solution for some example distributions, e.g. normal or Cauchy. A numerical solution would also be sufficient. Please note that we cannot assume the independence of $X$ and $Y$: under independence, $f_r(x, y) = f_1(x) f_1(y)$ is already determined by the marginals, which forces $f_2$ to be the convolution $f_1 * f_1$. If some additional requirements on $f_1, f_2$ need to be assumed, that's allowed.

EDIT: thanks to the comment of @geetha290krm, we must assume that $\mathbb{E}_{f_2}(X) = 2\mathbb{E}_{f_1}(X)$.

Two answers follow.

Accepted answer:

It seems that even the assumption of $f_2$ having twice the mean of $f_1$ is far from sufficient. One can keep playing the game of matching moments: Assume for simplicity $f_1,f_2$ have mean $0$. Then since $$\text{Var}(X+Y)=\text{Var}(X)+\text{Var}(Y)+2\text{Cov}(X,Y)$$ we get the additional relation $$\int x^2 f_2(x)dx=2\int x^2 f_1(x)dx+2\int\int xyf_r(x,y)dxdy.$$ This relation does not necessarily hold under the given assumption. You can add this as an additional assumption, though repeating this argument with higher moments gives more and more technical extra conditions on $f_r$ in relation to $f_1,f_2$.

However, we may look at a few special cases: Suppose $f_1=\phi_{\mu_1,\sigma_1}$ and $f_2=\phi_{\mu_2,\sigma_2}$ are Gaussians, and assume that $\mu_2=2\mu_1$. By the Ansatz $$f_r=\phi\bigg(\begin{pmatrix}m_1 \\ m_2\end{pmatrix},\begin{pmatrix}v_1 & \kappa \\ \kappa & v_2\end{pmatrix}\bigg)$$ we get, since both marginals of $f_r$ must equal $f_1$: $$m_1=m_2=\mu_1, \hspace{1cm} v_1=v_2=\sigma_1^2.$$ Now we see the problem I detailed above: If the variance $\sigma_1^2$ is small, then the variance of the sum cannot be too large. More precisely, let $X,Y$ have law $f_1$ and $X+Y$ have law $f_2$. Then we would necessarily have $$\sigma_2^2=2\sigma_1^2+2\kappa.$$ Altogether this would yield $$\kappa=\frac{\sigma_2^2-2\sigma_1^2}{2}.$$ However, since by the Cauchy-Schwarz inequality we have for any random variables $X,Y$ that $$\bigg|\frac{\sigma_2^2-2\sigma_1^2}{2}\bigg|=|\kappa|=|\text{Cov}(X,Y)|\leq \sqrt{\text{Var}(X)\text{Var}(Y)}=\sigma_1^2,$$ we see that this choice of $\kappa$ cannot always be valid. However, if this inequality holds, then we are able to construct a joint Gaussian law $f_r$ with the desired properties, as detailed above.
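The Gaussian construction above is easy to check numerically: pick $\mu_1, \sigma_1, \sigma_2$ satisfying the Cauchy-Schwarz condition $|\sigma_2^2 - 2\sigma_1^2| \leq 2\sigma_1^2$, set $\kappa = (\sigma_2^2 - 2\sigma_1^2)/2$, and verify the marginals and the sum by Monte Carlo. This is a sketch with arbitrarily chosen parameter values, not part of the original answer.

```python
import numpy as np

rng = np.random.default_rng(0)

mu1, sigma1 = 1.0, 1.0
sigma2 = 1.5                      # satisfies |sigma2^2 - 2*sigma1^2| <= 2*sigma1^2
mu2 = 2 * mu1

kappa = (sigma2**2 - 2 * sigma1**2) / 2
assert abs(kappa) <= sigma1**2    # Cauchy-Schwarz feasibility condition

# Joint Gaussian with both marginals N(mu1, sigma1^2) and covariance kappa
mean = [mu1, mu1]
cov = [[sigma1**2, kappa], [kappa, sigma1**2]]

xy = rng.multivariate_normal(mean, cov, size=200_000)
s = xy.sum(axis=1)

# Empirical moments should match the targets up to Monte Carlo error.
assert abs(xy[:, 0].std() - sigma1) < 0.01   # marginal of X
assert abs(xy[:, 1].std() - sigma1) < 0.01   # marginal of Y
assert abs(s.mean() - mu2) < 0.02            # mean of the sum
assert abs(s.std() - sigma2) < 0.02          # std of the sum
```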

(Note that in this case it was sufficient to enforce a condition on the second moments, since a Gaussian is uniquely determined by its first two moments. In general, this is not the case, hence a condition on only the first two moments is not enough to allow such a construction of $f_r$.)

I believe that one may be able to get similar characterizations for general infinitely divisible distributions. I will edit my answer to include such attempts when I can find the time :)

EDIT:

I have thought about the problem some more and I have found that a very similar approach works for Cauchy distributions: Let $f_1,f_2$ be Cauchy, meaning $$f_1(x)=\frac{1}{\pi\gamma_1\big(1+\gamma_1^{-2}(x-x_0^{(1)})^2\big)},\hspace{0.5cm}f_2(x)=\frac{1}{\pi\gamma_2\big(1+\gamma_2^{-2}(x-x_0^{(2)})^2\big)}$$ with associated characteristic functions $$\phi_1(\theta)=\exp(i\theta x_0^{(1)}-|\theta|\gamma_1), \hspace{1cm}\phi_2(\theta)=\exp(i\theta x_0^{(2)}-|\theta|\gamma_2).$$ Ansatz: Assume that $f_r$ is multivariate Cauchy (see here for a short introduction), meaning its characteristic function reads $$\phi(\theta)=\phi((\theta_1,\theta_2))=\exp\bigg(i\theta^T\mu - \big(\theta^T V\theta\big)^{1/2}\bigg).$$ The necessary condition $\phi((\theta,0))=\phi((0,\theta))=\phi_1(\theta)$ immediately implies $\mu_1=\mu_2=x_0^{(1)}$ as well as $$V_{1,1}=\gamma_1^2, \hspace{1cm} V_{2,2}=\gamma_1^2.$$ Furthermore, the condition $\phi((\theta,\theta))=\phi_2(\theta)$ implies $2x_0^{(1)}=x_0^{(2)}$ as well as (assuming $V$ is symmetric) $$2\gamma_1^2+2V_{1,2}=\gamma_2^2,$$ which yields the choice $$V_{1,2}=V_{2,1}=\frac{\gamma_2^2-2\gamma_1^2}{2}.$$ Again, we need $V$ to be positive semi-definite, which here amounts to $|V_{1,2}|\leq\gamma_1^2$, i.e. $\gamma_2\leq 2\gamma_1$.
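The Cauchy case can be verified directly on the scale matrix: with the characteristic function above, the marginal in direction $a$ is Cauchy with location $a^T\mu$ and scale $\sqrt{a^T V a}$, so it suffices to check that both coordinate directions give scale $\gamma_1$ and the direction $(1,1)$ gives $\gamma_2$. A sketch with arbitrary parameter values chosen so that $V$ is positive semi-definite:

```python
import numpy as np

x0_1, gamma1 = 0.0, 1.0
gamma2 = 1.8                       # needs gamma2 <= 2*gamma1 for psd V
x0_2 = 2 * x0_1

# Off-diagonal entry from the condition 2*gamma1^2 + 2*V12 = gamma2^2
v12 = (gamma2**2 - 2 * gamma1**2) / 2
V = np.array([[gamma1**2, v12], [v12, gamma1**2]])
mu = np.array([x0_1, x0_1])

# Positive semi-definiteness of the scale matrix
assert np.all(np.linalg.eigvalsh(V) >= -1e-12)

def scale(a):
    """Scale of the one-dimensional Cauchy marginal in direction a."""
    return np.sqrt(a @ V @ a)

assert np.isclose(scale(np.array([1.0, 0.0])), gamma1)   # marginal of X
assert np.isclose(scale(np.array([0.0, 1.0])), gamma1)   # marginal of Y
assert np.isclose(scale(np.array([1.0, 1.0])), gamma2)   # distribution of X + Y
```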

Another answer:

It would be more reasonable to consider first the following discrete problem: let $(a_0,\ldots,a_n)$, $(b_0,\ldots,b_m)$, $(c_0,\ldots,c_{n+m})$ be three probability vectors. Does there exist a probability $(p_{ij})$ such that $$\sum_i p_{ij}=b_j,\qquad \sum_j p_{ij}=a_i,\qquad \sum_{i+j=k}p_{ij}=c_k?$$ The set of solutions $p$ is a convex set and it is easy to describe its extreme points, but the real difficulty is to decide when it is non-empty.