Total variation distance for sums of random variables


Consider two random variables $X_n \equiv X_n(\omega)$ and $Y_n \equiv Y_n(\omega)$ defined on the same probability space $(\Omega, \Sigma, P)$. Assume that $Y_n$ converges to zero in probability, i.e. $P(|Y_n|>\epsilon)=o(1)$ for all $\epsilon>0$. Denote $$ P_n(B):=P(Y_n +X_n \in B), \qquad Q_n(B):=P(X_n \in B), $$ for all Borel subsets of the real line. Can we conclude that the total variation distance between $P_n$ and $Q_n$ converges to zero, i.e. that $$ \sup_B|P_n(B)-Q_n(B)| \to 0 $$ as $n\to \infty$?


I think you mean $Q_n(B)=\mathbb P(X_n\in B)$; otherwise I don't see how you would expect this to be true.

In any case, I have a counterexample that works for either interpretation: simply take $Y_n=\frac1n$ and $X_n=1$. Then $Y_n\to 0$ surely, but with $B=\{1\}$ we get $Q_n(B)=1$ while $P_n(B)=P\!\left(1+\tfrac1n=1\right)=0$, so $\sup_B|P_n(B)-Q_n(B)|=1$ for every $n$.
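The counterexample can be sketched numerically. Here $P_n$ is the point mass at $1+\frac1n$ and $Q_n$ is the point mass at $1$; for two point masses the total variation distance is $0$ if they sit at the same point and $1$ otherwise. The helper name `tv_point_masses` is just for illustration.

```python
# Counterexample: Y_n = 1/n (deterministic), X_n = 1 (deterministic).
# P_n is the point mass at 1 + 1/n, Q_n is the point mass at 1.
# Taking B = {1} gives Q_n(B) = 1 and P_n(B) = 0, so the TV distance is 1.

def tv_point_masses(a, b):
    """Total variation distance between the point masses at a and at b."""
    return 0.0 if a == b else 1.0

for n in (1, 10, 100, 10**6):
    # The distance stays 1 for every n, even though 1/n -> 0.
    print(n, tv_point_masses(1 + 1 / n, 1))
```

So convergence of $Y_n$ to zero in probability (here even sure convergence) does not force $P_n$ and $Q_n$ together in total variation; total variation is blind to how close the supports are.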