A distribution $D$ over $\Lambda$ has min-entropy $k$ if the largest probability mass assigned to any element of $\Lambda$ is $2^{-k}$ (i.e., $D(a)\leq 2^{-k}$ for all $a\in\Lambda$, with equality for some $a$). We write $H_\infty(D)=k$.
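As a quick sanity check of the definition, here is a minimal Python sketch (the distribution and numbers are my own toy example, not from the question):

```python
import math

def min_entropy(dist):
    """H_inf(D) = -log2(max_a D(a)) for a distribution given as a
    dict mapping outcomes to probabilities."""
    return -math.log2(max(dist.values()))

# The uniform distribution on 4 elements puts mass 2^-2 on each
# outcome, so its min-entropy is 2.
uniform4 = {a: 0.25 for a in range(4)}
print(min_entropy(uniform4))  # 2.0
```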
Let $X$ and $Y$ be two independent distributions over $\{0, 1\}^n$, each of which is $\epsilon$-close in statistical distance to a distribution with min-entropy $k$.
Let $Z=X+Y$ denote the distribution over $\{0, 1\}^n$ obtained by sampling $x\sim X$ and $y\sim Y$ independently and outputting $x+y$, where the sum is coordinatewise addition modulo $2$ (i.e., bitwise XOR).
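Concretely, $Z$ is the XOR-convolution of the two distributions. A small Python sketch of this operation, on a toy pair of distributions over $\{0,1\}^2$ of my own choosing:

```python
from itertools import product

def xor_convolve(X, Y):
    """Distribution of x XOR y for independent x ~ X, y ~ Y, with
    n-bit outcomes encoded as integers and each distribution given
    as a dict mapping outcome -> probability."""
    Z = {}
    for (x, px), (y, py) in product(X.items(), Y.items()):
        Z[x ^ y] = Z.get(x ^ y, 0.0) + px * py
    return Z

# Toy example: X uniform on {00, 01}, Y uniform on {00, 10};
# their XOR is uniform on all of {0,1}^2.
X = {0b00: 0.5, 0b01: 0.5}
Y = {0b00: 0.5, 0b10: 0.5}
print(xor_convolve(X, Y))  # each of 0,1,2,3 gets mass 0.25
```

Note that in this toy example each input has min-entropy $1$ but their XOR has min-entropy $2$: XOR-ing with an independent source can only spread mass out.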
Prove that $Z$ is $\epsilon^2$-close in statistical distance to a distribution with min-entropy at least $k$.
It is clear that $H_\infty(Z)\geq\max\{H_\infty(X),H_\infty(Y)\}$ (since $Z(z)=\sum_y Y(y)\,X(z+y)\leq 2^{-H_\infty(X)}$, and symmetrically for $Y$), but I don't see how that helps.
Note that by statistical distance I mean total variation (TV) distance.
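For completeness, a minimal Python sketch of TV distance in the same dict-based representation as above (the two coins are my own toy example):

```python
def tv_distance(P, Q):
    """Total variation distance (1/2) * sum_a |P(a) - Q(a)| between
    two distributions given as dicts mapping outcomes to probabilities."""
    support = set(P) | set(Q)
    return 0.5 * sum(abs(P.get(a, 0.0) - Q.get(a, 0.0)) for a in support)

# Toy example: a fair coin vs a 3/4-biased coin.
fair = {0: 0.5, 1: 0.5}
biased = {0: 0.75, 1: 0.25}
print(tv_distance(fair, biased))  # 0.25
```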