Joint distribution between a uniform random variable and a function which is "almost" independent from it


Motivation

Let $f(\cdot)$ be a (possibly randomized) function, such that for any random variable $X$ (taking values from a finite set $D$), $X$ and $f(X)$ are statistically independent. Let $U, U_1, U_2, \ldots$ be uniform and independent random variables over $D$.

Example: $f(X) = X \oplus U$. That is, $f$ picks a fresh uniform value $U$ from $D$ and XORs it with its input (assuming XOR is well defined on $D$).
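As a sanity check of this example, here is a small Python sketch (my own illustration, not part of the question; it takes $D = \{0,\ldots,7\}$ so that bitwise XOR is well defined) that computes the joint distribution of $(X, f(X))$ exactly and confirms independence for an arbitrary, non-uniform input distribution:

```python
# D = {0, ..., 7}: closed under bitwise XOR.
D = list(range(8))

# An arbitrary (deliberately non-uniform) input distribution p on D.
weights = {x: x + 1 for x in D}
total = sum(weights.values())
p = {x: weights[x] / total for x in D}

# Exact joint distribution of (X, f(X)) with f(X) = X XOR U, U uniform on D.
joint = {(x, y): 0.0 for x in D for y in D}
for x in D:
    for u in D:
        joint[(x, x ^ u)] += p[x] / len(D)

# Marginals of X and f(X).
px = {x: sum(joint[(x, y)] for y in D) for x in D}
py = {y: sum(joint[(x, y)] for x in D) for y in D}

# The joint factors into the product of the marginals, and f(X) is
# uniform, no matter what p is -- so X and f(X) are independent.
for x in D:
    for y in D:
        assert abs(joint[(x, y)] - px[x] * py[y]) < 1e-12
assert all(abs(py[y] - 1 / len(D)) < 1e-12 for y in D)
```

The key point the sketch verifies: for every $x$ there is exactly one $u$ with $x \oplus u = y$, so $\Pr[X=x, f(X)=y] = p(x)/|D|$, which factors for any $p$.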

FACT: It is easy to prove that $(U_1, f(U_1))$ and $(U_2, f(U_3))$ are identically distributed.
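A short derivation of the fact, using only the independence assumption: for any $(\alpha,\beta) \in D^2$,

$$\Pr[(U_1, f(U_1)) = (\alpha,\beta)] = \Pr[U_1=\alpha]\,\Pr[f(U_1)=\beta \mid U_1=\alpha] = \frac{1}{|D|}\,\Pr[f(U_1)=\beta] \enspace,$$

where the second equality uses the independence of $U_1$ and $f(U_1)$. On the other side,

$$\Pr[(U_2, f(U_3)) = (\alpha,\beta)] = \Pr[U_2=\alpha]\,\Pr[f(U_3)=\beta] = \frac{1}{|D|}\,\Pr[f(U_3)=\beta] \enspace,$$

since $U_2$ and $U_3$ are independent. Finally, $U_1$ and $U_3$ are identically distributed, so $\Pr[f(U_1)=\beta] = \Pr[f(U_3)=\beta]$, and the two joint distributions agree.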

Main Problem

In the main problem, we don't assume that $X$ and $f(X)$ are statistically independent. Instead, we assume that for any $X$, the mutual information between $X$ and $f(X)$ is very small. Formally:

Let $f(\cdot)$ be a (possibly randomized) function, such that for any random variable $X$ (taking values from a finite set $D$), we have $$I(X; f(X)) < \epsilon \enspace,$$ where $\epsilon$ is a (small) positive real value.
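To make the condition concrete, here is a sketch (the toy "leaky" $f$ and all names are my own illustration) that computes $I(X; f(X))$ for a function on $D=\{0,1\}$ which, with small probability, echoes its input instead of outputting a fresh uniform bit; the resulting mutual information is small but nonzero:

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits, from a dict {(x, y): probability}."""
    px, py = {}, {}
    for (x, y), pr in joint.items():
        px[x] = px.get(x, 0.0) + pr
        py[y] = py.get(y, 0.0) + pr
    return sum(pr * math.log2(pr / (px[x] * py[y]))
               for (x, y), pr in joint.items() if pr > 0)

# Toy leaky f on D = {0, 1}: with probability 1 - leak it outputs a
# fresh uniform bit, with probability leak it echoes its input.
# X is uniform on {0, 1}.
leak = 0.01
joint = {}
for x in (0, 1):
    for y in (0, 1):
        pr_y_given_x = (1 - leak) * 0.5 + (leak if y == x else 0.0)
        joint[(x, y)] = 0.5 * pr_y_given_x

mi = mutual_information(joint)
assert 0 < mi < 1e-3  # small, but not exactly zero
```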

Can we prove a fact similar to the one above? That is, can we find some $\delta$, as a function of $\epsilon$, such that for all $(\alpha,\beta) \in D^2$:

$$\big| \Pr [(U_1, f(U_1)) = (\alpha,\beta)] - \Pr [(U_2, f(U_3)) = (\alpha,\beta)] \big| < \delta$$