Infinitely differentiable approximations for an indicator function


Suppose I want to approximate the indicator function on $\mathbb{R}$, $f(x) = \mathbf{1}(x \in (-\infty, a])$ for $a \in \mathbb{R}$, in the context of proving that convergence in distribution holds if and only if convergence of expectations holds for every bounded, infinitely differentiable function. I need to show that for each $a \in \mathbb{R}$ and $\delta > 0$ there exists a smooth function $g$ which is $1$ when $x \leq a$ and $0$ when $x \geq a + \delta$; in particular, all of its derivatives must vanish at both $a$ and $a + \delta$.

I know how to solve this problem for approximating the indicator $\mathbf{1}(x \in (a,b))$ by taking, for example, $$g_n(x) = \begin{cases} \exp \left(\dfrac{1}{n\left(\left(x- \frac{a+b}{2}\right)^2 - \left(\frac{b-a}{2}\right)^2 \right)}\right) & x \in (a,b) \\ 0 & \text{otherwise,}\end{cases} $$ which is infinitely differentiable and satisfies $g_n(x) \rightarrow \mathbf{1}(x \in (a,b))$ for every $x \in \mathbb{R}$. (The quantity inside the parentheses is negative on $(a,b)$ and tends to $0$ from below at the endpoints, so $g_n$ vanishes to all orders there, while $g_n \to 1$ pointwise on $(a,b)$ as $n \to \infty$.)
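As a numerical sanity check, here is a minimal sketch of such a bump (my own function names), using the exponent $1/\bigl(n((x-m)^2 - r^2)\bigr)$ with $m = \frac{a+b}{2}$ and $r = \frac{b-a}{2}$, which is negative inside $(a,b)$:

```python
import math

def g_n(x, n, a, b):
    # Smooth bump supported on (a, b): the exponent is negative inside (a, b)
    # and tends to -infinity at the endpoints, so g_n vanishes there to all orders.
    m, r = (a + b) / 2.0, (b - a) / 2.0
    if a < x < b:
        return math.exp(1.0 / (n * ((x - m) ** 2 - r ** 2)))
    return 0.0

# Inside (a, b) the values lie in (0, 1) and approach 1 as n grows;
# outside (and at a, b) the function is identically 0.
```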

From the construction of $g_n$, I can handle the vanishing-derivative condition at a single endpoint, $a$ or $a + \delta$. How do I find a function that satisfies the zero-derivative condition at both endpoints, while being identically $1$ on one side and identically $0$ on the other?

Best answer:
  • Let $h:\Bbb{R} \to \Bbb{R}$ be defined by \begin{align} h(x) := \begin{cases} e^{-1/x} & \text{if $x>0$}\\ 0 & \text{if $x \leq 0$} \end{cases} \end{align} Verify for yourself (and draw a picture) that $h$ is $C^{\infty}$, $0 \leq h(\cdot) < 1$, and $h(x) > 0$ if and only if $x>0$.
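A quick Python sketch of $h$ (the function name is mine), checking the claimed properties numerically:

```python
import math

def h(x):
    # h(x) = exp(-1/x) for x > 0, and 0 for x <= 0.
    # All derivatives of exp(-1/x) tend to 0 as x -> 0+, so h is C^infinity on R.
    return math.exp(-1.0 / x) if x > 0 else 0.0

# Properties: 0 <= h(x) < 1 everywhere, and h(x) > 0 exactly when x > 0.
```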

  • For any $a,b \in \Bbb{R}$ with $a<b$, define $H_{a,b}: \Bbb{R} \to \Bbb{R}$ by \begin{align} H_{a,b}(x) := h(x-a) \cdot h(b-x) \end{align} Then, $H_{a,b}$ is $C^{\infty}$, $0 \leq H_{a,b}(\cdot) < 1$, and $H_{a,b}(x) > 0$ if and only if $a<x<b$ (draw a picture).
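Continuing the sketch, the product construction can be checked directly (again with my own function names):

```python
import math

def h(x):
    # exp(-1/x) for x > 0, else 0; smooth on all of R.
    return math.exp(-1.0 / x) if x > 0 else 0.0

def H(x, a, b):
    # h(x - a) is positive iff x > a; h(b - x) is positive iff x < b.
    # Their product is therefore positive exactly on (a, b) and zero elsewhere.
    return h(x - a) * h(b - x)
```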

  • For any $a,b \in \Bbb{R}$ with $a<b$, define $G_{a,b}: \Bbb{R} \to \Bbb{R}$ by \begin{align} G_{a,b}(x) := \dfrac{\int_x^b H_{a,b}}{\int_a^b H_{a,b}} \end{align} Then, by the Fundamental Theorem of Calculus, we have $G_{a,b}'(x) = -\dfrac{H_{a,b}(x)}{\int_{a}^b H_{a,b}} \leq 0$; this shows $G_{a,b}$ is $C^{\infty}$ and (weakly) decreasing. Also, a direct calculation shows that $G_{a,b}(x)=1$ if $x\leq a$, while $G_{a,b}(x)=0$ if $x\geq b$. By the derivative formula above, $G_{a,b}'(a)=G_{a,b}'(b)=0$ (this is also obvious: $G_{a,b}$ is smooth and constant to the left of $a$, so the derivative at $a$ must be zero; similar reasoning applies at $b$). In fact, every derivative of $G_{a,b}$ vanishes at $a$ and $b$, since $G_{a,b}^{(k)} = -\dfrac{H_{a,b}^{(k-1)}}{\int_a^b H_{a,b}}$ for $k \geq 1$ and all derivatives of $H_{a,b}$ vanish at the endpoints.

So, the function $g= G_{a,a+\delta}$ satisfies all the properties you're looking for.
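The whole construction can be verified numerically. The sketch below (my own names throughout) approximates the integrals with a simple composite trapezoid rule, which is adequate for this smooth integrand; this is an assumption of the sketch, not part of the answer:

```python
import math

def h(x):
    return math.exp(-1.0 / x) if x > 0 else 0.0

def H(x, a, b):
    return h(x - a) * h(b - x)

def integral(f, lo, hi, n=2000):
    # Composite trapezoid rule; returns 0 for an empty interval.
    if hi <= lo:
        return 0.0
    w = (hi - lo) / n
    return w * (0.5 * (f(lo) + f(hi)) + sum(f(lo + i * w) for i in range(1, n)))

def G(x, a, b):
    # G(x) = (integral of H from x to b) / (integral of H from a to b):
    # equals 1 for x <= a, 0 for x >= b, and decreases smoothly in between.
    xc = min(max(x, a), b)  # the integrand vanishes outside [a, b]
    return integral(lambda t: H(t, a, b), xc, b) / integral(lambda t: H(t, a, b), a, b)
```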


Anyway, don't try to memorize all these functions. Try to understand them: the first function $h$ is very famous for being smooth (it's probably the only one you have to "memorize"), and it (or something very similar) is one of the building blocks for constructing smooth functions with prescribed properties (like taking certain values, decaying quickly enough, etc.).

Next, $H_{a,b}$ is obtained from $h$ by translating, reflecting, and multiplying (to make it zero outside $(a,b)$). Finally, getting $G_{a,b}$ from $H_{a,b}$ is simply a matter of "normalization". Note that if you want a function which increases from $0$ to $1$ (as opposed to decreasing from $1$ to $0$ like $G_{a,b}$), all you have to do is consider the modified function $x\mapsto \dfrac{\int_a^x H_{a,b}}{\int_a^b H_{a,b}}$.
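The increasing counterpart can be sketched the same way (self-contained, with the same trapezoid-rule assumption and my own names as before):

```python
import math

def h(x):
    return math.exp(-1.0 / x) if x > 0 else 0.0

def H(x, a, b):
    return h(x - a) * h(b - x)

def integral(f, lo, hi, n=2000):
    if hi <= lo:
        return 0.0
    w = (hi - lo) / n
    return w * (0.5 * (f(lo) + f(hi)) + sum(f(lo + i * w) for i in range(1, n)))

def S(x, a, b):
    # S(x) = (integral of H from a to x) / (integral of H from a to b):
    # rises smoothly from 0 (for x <= a) to 1 (for x >= b).
    xc = min(max(x, a), b)
    return integral(lambda t: H(t, a, b), a, xc) / integral(lambda t: H(t, a, b), a, b)
```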