Lipschitz constant after a Gaussian Convolution


Given a probability density $p(x)$ that is strictly positive on all of $\mathbb{R}$, i.e. $p(x) > 0$ for all $x \in \mathbb{R}$, and that has a finite Lipschitz constant $L$, we construct a "blurred" density by Gaussian convolution. If $f(\epsilon)$ is a Gaussian probability density function with mean $0$ and variance $\sigma^2$, the resulting density is: $$ \tilde{p}(x) = \int_{-\infty}^\infty p(x - \epsilon) f(\epsilon) \, d \epsilon$$

Intuitively, the Lipschitz constant of $\tilde{p}(x)$ should be smaller than $L$ (and this can be proved mathematically). My question: for a general density $p(x)$, can we say something about the Lipschitz constant of $\tilde{p}(x)$ as a function of $L$ and $\sigma^2$, rather than just the naive bound $L' \le L$?
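To make the setup concrete, here is a small numerical sketch (my own illustration; the Laplace density and the grid are arbitrary choices, not part of the question): it blurs a Laplace density with a Gaussian of variance $\sigma^2 = 1$ on a grid and checks that $\tilde{p}$ is still a probability density.

```python
import numpy as np

x = np.linspace(-15.0, 15.0, 6001)
dx = x[1] - x[0]

p = 0.5 * np.exp(-np.abs(x))              # example: Laplace density
sigma = 1.0
f = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

# discrete approximation of the blur integral p~(x) = ∫ p(x - eps) f(eps) d(eps)
p_tilde = np.convolve(p, f, mode='same') * dx

mass = np.sum(p_tilde) * dx
print(mass)            # close to 1: blurring preserves total probability
print(p_tilde.min())   # positive on the grid: still a positive density
```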


Here is an argument, for your reference.

Suppose $|\frac{d}{dx}f_0(x)| \le L$ and that $f_0(x)$ admits a Fourier transform. Let $f(x) = \int f_0(x - t)\, N(t; 0, \sigma^2) \, dt$ be its convolution with a Gaussian. Then $f(x)$ is $L'$-Lipschitz with $L' \le L$.

Proof: Write $N(t) = N(t; 0, \sigma^2)$ for the Gaussian density. By Fourier inversion, $f_0(x)$ can be expressed as $$ f_0(x) = \int F_0(\omega) \exp\{j\omega x\} ~d\omega $$ where $F_0(\omega) = \mathcal{F}[f_0(x)]$ (with the $\frac{1}{2\pi}$ normalization absorbed into $F_0$). The Fourier transform of the Gaussian density is $$ \mathcal{F}[N(t; 0, \sigma^2)] = \exp\{-\frac{\sigma^2\omega^2}{2}\}. $$ Thus, differentiating under the integral signs, $$ \frac{d}{dx}\int f_0(x - t)\, N(t) ~dt = \int F_0(\omega) \int \frac{d}{dx}\exp\{j\omega (x-t)\}\, N(t) ~dt ~d\omega. $$ The inner integral reduces to $$ \int \frac{d}{dx}\exp\{j\omega (x-t)\}\, N(t) ~dt = j\omega \int \exp\{j\omega (x-t)\}\, N(t) ~dt, $$ in which the remaining integral is the convolution of $\exp\{j\omega x\}$ with the Gaussian. In the frequency domain, convolution becomes multiplication, and $$ \mathcal{F}[\exp\{j\omega_0 x\}] = 2\pi \delta(\omega - \omega_0), $$ where $\delta(\cdot)$ denotes the Dirac delta. Therefore, we have $$ \Big|\int \frac{d}{dx}\exp\{j\omega (x-t)\}\, N(t) ~dt\Big| = \Big|j\omega \exp\{-\frac{\sigma^2\omega^2}{2}\} \exp\{j\omega x\}\Big|. $$
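The Gaussian transform pair used above can be checked numerically. This sketch (grid size and $\sigma$ are arbitrary choices) compares the FFT of a sampled $N(t; 0, \sigma^2)$ against $\exp\{-\frac{\sigma^2\omega^2}{2}\}$, under the convention $\mathcal{F}[g] = \int g(t) e^{-j\omega t} \, dt$.

```python
import numpy as np

n, box = 4096, 40.0                        # grid resolution and extent (arbitrary)
dx = box / n
t = -box / 2 + dx * np.arange(n)           # grid containing t = 0 exactly
sigma = 0.7

gauss = np.exp(-t**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

# the DFT approximates the continuous transform once t = 0 is rotated to index 0
G_num = np.fft.fft(np.fft.ifftshift(gauss)) * dx
w = 2 * np.pi * np.fft.fftfreq(n, d=dx)    # angular frequencies
G_exact = np.exp(-sigma**2 * w**2 / 2)

err = np.max(np.abs(G_num - G_exact))
print(err)   # tiny: truncation and aliasing are negligible at this sigma
```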

Thus, the norm of the derivative of $f(x)$ is given by $$\Big|\frac{d}{dx}\int f_0(x - t)\, N(t) ~dt\Big| = \Big|\int \omega\exp\{j\omega x\} F_0(\omega)\exp\{-\frac{\omega^2\sigma^2}{2}\}~d\omega\Big|.$$ Using the bound on the derivative of $f_0(x)$, we have $$ \Big|\frac{d}{dx}f_0(x)\Big| = \Big|\int \frac{d}{dx}\exp\{j\omega x\}F_0(\omega)~d\omega\Big| = \Big|\int \omega\exp\{j\omega x\} F_0(\omega)~d\omega\Big| \le L. $$ The factor $\exp\{-\frac{\omega^2\sigma^2}{2}\} \le 1$ damps the spectrum, but pointwise damping alone does not bound the integral; to conclude, note that multiplying the spectrum of $f_0'$ by $\exp\{-\frac{\omega^2\sigma^2}{2}\}$ corresponds in the spatial domain to convolving $f_0'$ with the Gaussian $N$. Since $N(t) \ge 0$ and $\int N(t)~dt = 1$, $$\Big|\int \omega\exp\{j\omega x\} F_0(\omega)\exp\{-\frac{\omega^2\sigma^2}{2}\}~d\omega\Big| = \Big|\int f_0'(x - t)\, N(t)~dt\Big| \le \int |f_0'(x - t)|\, N(t)~dt \le L. $$
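Consistent with this conclusion, and with the question's interest in how the constant depends on $\sigma$, the following sketch blurs a Laplace density (whose Lipschitz constant is $L = 1/2$) with several values of $\sigma$ and estimates the resulting Lipschitz constants by finite differences. The density and the grid are my own illustrative choices, not part of the answer above.

```python
import numpy as np

x = np.linspace(-20.0, 20.0, 8001)
dx = x[1] - x[0]
p = 0.5 * np.exp(-np.abs(x))              # Laplace density, Lipschitz constant 1/2

def blur(density, sigma):
    """Discrete approximation of the Gaussian convolution integral."""
    g = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    return np.convolve(density, g, mode='same') * dx

def lip(f):
    """Finite-difference estimate of the Lipschitz constant."""
    return np.max(np.abs(np.diff(f))) / dx

L = lip(p)
sigmas = [0.5, 1.0, 2.0]
Lps = [lip(blur(p, s)) for s in sigmas]
print(L, Lps)   # each blurred constant stays below L and shrinks as sigma grows
```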