Convex combination of random variables


Let $X, Y$ be two independent real random variables. Suppose I define the distance between the distribution (i.e. the cumulative distribution function) of $X$ and a target distribution $F$ as $$\sup_t |F_X(t)-F(t)|$$ and likewise for $Y$ with another target distribution $G$, $$\sup_t |F_Y(t)-G(t)|.$$ Can I then say that $$\forall \alpha \in (0,1)\qquad \sup_t |F_{\alpha X+(1-\alpha )Y}(t)-H_\alpha(t)|\le \max\Big(\sup_t |F_X(t)-F(t)|,\ \sup_t |F_Y(t)-G(t)|\Big),$$ where $H_\alpha(t)$ is defined as the distribution of a r.v. which is the same convex combination of two independent random variables with distributions $F$ and $G$ respectively?
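Before attempting a proof, the inequality can at least be probed numerically. Below is a minimal Monte Carlo sketch for one concrete (and entirely hypothetical) choice of distributions: $X\sim N(0.1,1)$ approximating $F=N(0,1)$, and $Y\sim\mathrm{Exp}(1.1)$ approximating $G=\mathrm{Exp}(1)$. The two right-hand-side distances are computed on a grid, and the left-hand side is estimated with a two-sample Kolmogorov–Smirnov statistic:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 200_000
alpha = 0.3

# Example instances (assumptions for illustration only):
#   X ~ N(0.1, 1)      approximating the target F = N(0, 1)
#   Y ~ Exp(rate 1.1)  approximating the target G = Exp(rate 1)
grid = np.linspace(-10.0, 10.0, 20_001)

# Kolmogorov distances d(X, F) and d(Y, G), evaluated on the grid.
dX = np.max(np.abs(stats.norm.cdf(grid, loc=0.1) - stats.norm.cdf(grid)))
dY = np.max(np.abs(stats.expon.cdf(grid, scale=1 / 1.1) - stats.expon.cdf(grid)))

# Monte Carlo estimate of the left-hand side: sample the convex combination
# alpha*X + (1-alpha)*Y and its target counterpart, then compare empirical CDFs.
Z = alpha * rng.normal(0.1, 1.0, n) + (1 - alpha) * rng.exponential(1 / 1.1, n)
Za = alpha * rng.normal(0.0, 1.0, n) + (1 - alpha) * rng.exponential(1.0, n)
lhs = stats.ks_2samp(Z, Za).statistic

print(f"d(X,F) = {dX:.4f}   d(Y,G) = {dY:.4f}")
print(f"LHS (MC estimate) = {lhs:.4f}   max = {max(dX, dY):.4f}   sum = {dX + dY:.4f}")
```

One run of course proves nothing either way; it only gives a feel for whether the $\max$ bound (rather than the sum $\sup_t |F_X-F| + \sup_t |F_Y-G|$, which follows from the triangle inequality) is plausible for a given example.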

Intuitively, the answer seems to be yes: if the distribution of $X$ approximates $F$ and the distribution of $Y$ approximates $G$, any convex combination of the two should have a distribution close to $H_\alpha$. However, the proof is not as straightforward as one might expect.

First, I tried to pass to the densities $f_X, f_Y$ (assuming they exist), so that the density of the convex combination becomes the convolution $$f_{\alpha X+(1-\alpha)Y}(t)=\frac{1}{\alpha(1-\alpha)}\Big[f_X(\cdot/\alpha)*f_Y\big(\cdot/(1-\alpha)\big)\Big](t).$$ I then tried to recover something about the distribution function from this, but I got buried in the computation without obtaining anything useful. Does anyone have a better idea for how to solve it?
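For what it's worth, here is one way to sidestep densities entirely (a sketch, assuming only independence and $\alpha\in(0,1)$): conditioning on $Y$ writes both CDFs as averages,

```latex
F_{\alpha X+(1-\alpha)Y}(t)
  = \mathbb{P}\big(\alpha X \le t-(1-\alpha)Y\big)
  = \int_{\mathbb{R}} F_X\!\left(\frac{t-(1-\alpha)y}{\alpha}\right)\,\mathrm{d}F_Y(y),
\qquad
H_\alpha(t) = \int_{\mathbb{R}} F\!\left(\frac{t-(1-\alpha)y}{\alpha}\right)\,\mathrm{d}G(y).
```

In this form the quantity to bound is a supremum over differences of such averages, which may be more tractable than manipulating the convolution of densities.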