For two discrete random variables $X,Y$, it is known that their covariance is equal to $\text{Cov}(X,Y) = E(XY) - E(X)E(Y)$. Define a "variant" of the total variation distance in the following manner:
$$ \delta(X,Y) = \sup_{\alpha, \beta} \{ | P(X = \alpha \text{ and } Y = \beta) - P(X = \alpha)P(Y = \beta)| \}, $$
where $\alpha$ ranges over the possible values of $X$ and $\beta$ over the possible values of $Y$.
Assume it is known that $X,Y \in [0,1]$. Is it true that $|\text{Cov}(X,Y)| \leq \delta(X,Y)$?
I tried to prove it "step by step". Clearly for independent random variables, both measures are $0$ so the condition holds. In addition,
$$ |\text{Cov}(X,Y)| = | \sum_{\alpha, \beta} \alpha \beta P(X= \alpha, Y = \beta) - \sum_{\alpha}\alpha P(X=\alpha) \sum_{\beta} \beta P(Y=\beta) |$$
And since $P(X= \alpha, Y = \beta) \leq \sup_{\alpha ', \beta '} P(\alpha ', \beta ') $, I thought to plug it into the sum and get the bound - but I cannot get rid of the $\alpha, \beta$. I thought that maybe, because we know $\alpha, \beta \leq 1$, it is possible to bound it, but I didn't find how. Any help will be appreciated.
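Before attempting a proof, one can probe the conjecture numerically on small discrete distributions. A minimal brute-force sketch (the helper name `delta_and_cov` and the dict encoding of the joint distribution are my own, not from the question):

```python
def delta_and_cov(joint):
    """joint: dict mapping (a, b) -> P(X=a, Y=b), for finitely many values in [0, 1]."""
    xs = {a for a, _ in joint}
    ys = {b for _, b in joint}
    # marginals of X and Y
    px = {a: sum(p for (a2, _), p in joint.items() if a2 == a) for a in xs}
    py = {b: sum(p for (_, b2), p in joint.items() if b2 == b) for b in ys}
    # delta: sup over all (a, b) of |P(X=a, Y=b) - P(X=a) P(Y=b)|
    delta = max(abs(joint.get((a, b), 0.0) - px[a] * py[b])
                for a in xs for b in ys)
    # Cov(X, Y) = E[XY] - E[X] E[Y]
    exy = sum(a * b * p for (a, b), p in joint.items())
    ex = sum(a * p for a, p in px.items())
    ey = sum(b * p for b, p in py.items())
    return delta, exy - ex * ey

# perfectly correlated fair coin: X = Y uniform on {0, 1}
print(delta_and_cov({(0, 0): 0.5, (1, 1): 0.5}))  # (0.25, 0.25): the inequality is tight here
```

On this two-point example both quantities equal $0.25$, so the conjectured inequality holds with equality; trying larger supports is what eventually reveals trouble.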
Relation between total variation and covariance
Asked 2026-04-03 by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail); 444 views.

Answer (score 2):
Take $X_n=Y_n$ with $P(X_n=\frac{k}{n})=\frac1{n}$ for $k=1,\dots,n$.
Then $\delta(X_n,Y_n)=\frac1{n}-\frac1{n^2}$ (the supremum is attained at the diagonal pairs $\alpha=\beta$; off the diagonal the difference is only $\frac1{n^2}$), so $\delta(X_n,Y_n)\to 0$ as $n$ grows.
$\text{Cov}(X_n,Y_n)=\text{Cov}(X_n,X_n)=\text{Var}(X_n)$, which approaches the positive variance $\frac1{12}$ of the standard uniform distribution as $n$ grows.
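The limit can be made explicit: with mass $\frac1n$ at each point $\frac kn$, a direct computation gives
$$ \text{Var}(X_n) = \frac1n\sum_{k=1}^n \frac{k^2}{n^2} - \left(\frac1n\sum_{k=1}^n \frac{k}{n}\right)^2 = \frac{(n+1)(2n+1)}{6n^2} - \frac{(n+1)^2}{4n^2} = \frac{n^2-1}{12n^2} \xrightarrow[n\to\infty]{} \frac1{12}. $$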
So for $n$ large enough we have $\delta(X_n,Y_n)<\text{Cov}(X_n,Y_n)$, and the proposed inequality fails.
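The counterexample can be checked in exact arithmetic; a short sketch using Python's `fractions` module (the function name `counterexample` is my own):

```python
from fractions import Fraction

def counterexample(n):
    """The answer's example: X_n = Y_n, uniform on {1/n, 2/n, ..., 1}."""
    vals = [Fraction(k, n) for k in range(1, n + 1)]
    p = Fraction(1, n)
    # delta: sup over (a, b) of |P(X=a, Y=b) - P(X=a) P(Y=b)|;
    # the joint mass is p on the diagonal a == b and 0 elsewhere
    delta = max(abs((p if a == b else Fraction(0)) - p * p)
                for a in vals for b in vals)
    # Cov(X_n, Y_n) = Var(X_n), since X_n = Y_n
    mean = sum(v * p for v in vals)
    cov = sum(v * v * p for v in vals) - mean * mean
    return delta, cov

d, c = counterexample(100)
print(float(d), float(c))  # delta = 1/100 - 1/100^2 ≈ 0.0099 < cov ≈ 0.0833
```

For $n=100$ this gives $\delta = \frac{99}{10000}$ versus $\text{Cov} = \frac{9999}{120000}$, confirming that the inequality already fails well before the limit.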