Calderón-Zygmund $\times$ Schwartz $=$ Calderón-Zygmund

I am in a functional analysis class, and we are being asked to show that if $\eta$ is a Schwartz function and $K$ is a Calderón-Zygmund distribution, then their product is also a Calderón-Zygmund distribution.

A straightforward application of the product rule brings me maddeningly close to showing the differential inequalities:

\begin{align*} |\partial^\alpha(\eta k)(x)| &= \left|\sum_{\lambda\leq\alpha} \binom{\alpha}{\lambda}\partial^{\lambda}\eta(x)\,\partial^{\alpha-\lambda} k(x) \right| \\ &\leq \sum_{\lambda\leq\alpha} \binom{\alpha}{\lambda}|\partial^{\lambda}\eta(x)|\,|\partial^{\alpha-\lambda} k(x)| \\ &= \sum_{\lambda\leq\alpha} \binom{\alpha}{\lambda}|x^{\lambda}\partial^{\lambda}\eta(x)|\,|x^{-\lambda}|\,|\partial^{\alpha-\lambda} k(x)| \\ &\leq \sum_{\lambda\leq\alpha} \binom{\alpha}{\lambda}||\eta||_{|\alpha|}\,|x^{-\lambda}|\,c_{\alpha-\lambda}|x|^{-d-|\alpha|+|\lambda|} \\ &= \left(||\eta||_{|\alpha|}\sum_{\lambda\leq\alpha} \binom{\alpha}{\lambda} c_{\alpha-\lambda}|x^{-\lambda}|\,|x|^{|\lambda|}\right)|x|^{-d-|\alpha|} \end{align*}

(Triangle inequality / multiplication by $1$ / definition of Schwartz norm & application of differential inequalities for $k$ / rearrangement)

If I could somehow produce a bound on $|x^{-\lambda}||x|^{|\lambda|}$ then I would be done; but I don't think this is possible. Am I wrong? Or if not, is this approach salvageable in some other way?
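(To make my doubt concrete: in $\mathbb{R}^{2}$, take $\lambda=(1,0)$ and $x=(\varepsilon,1)$; then

$$|x^{-\lambda}|\,|x|^{|\lambda|}=\varepsilon^{-1}\sqrt{\varepsilon^{2}+1}\to\infty \quad\text{as }\varepsilon\to 0^{+},$$

even though $|x|\geq 1$, so the factor cannot be bounded uniformly, since $x^{-\lambda}$ sees a single coordinate while $|x|^{|\lambda|}$ only sees the norm.)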

Thank you :)


It might help if you included your definition of a Calderón-Zygmund distribution, as many authors talk about Calderón-Zygmund kernels with varying conditions and then talk about their associated distributions. Since you didn't specify conditions on the multi-index $\alpha$, it seems you're using the definition in Stein and Shakarchi, Vol. 4:

  • $K$ is a distribution which coincides with a function $k\in C^{\infty}(\mathbb{R}^{n}\setminus\left\{0\right\})$ away from the origin.
  • $k$ satisfies the differential estimate $$\left|\partial^{\alpha}k(x)\right|\leq c_{\alpha}\left|x\right|^{-n-\left|\alpha\right|},\quad\forall \alpha$$
  • $K$ satisfies the following cancellation condition. For some fixed integer $N\geq 1$, there exists a constant $A>0$ such that for any $C^{\infty}$ function $\varphi$ supported in the unit ball and satisfying $$\sup_{\mathbb{R}^{n}}\left|\partial^{\alpha}\varphi\right|\leq 1,\quad\forall\left|\alpha\right|\leq N$$ one has $$\sup_{r>0}\left|\langle{K,\varphi_{r}}\rangle\right|\leq A, \quad \varphi_{r}(x):=\varphi(rx)$$ Stein and Shakarchi call such functions $\varphi$ $C^{(N)}$-normalized bump functions.
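As a sanity check on this definition (a standard example, not in the original post): in dimension $n=1$, the Hilbert transform kernel $K=\mathrm{p.v.}\,\frac{1}{x}$ coincides with $k(x)=1/x$ away from the origin and satisfies

$$\left|\partial^{\alpha}k(x)\right|=\left|(-1)^{\alpha}\alpha!\,x^{-1-\alpha}\right|=\alpha!\,\left|x\right|^{-1-\alpha},$$

so the differential estimate holds with $c_{\alpha}=\alpha!$, and the cancellation condition follows from the oddness of $1/x$.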

One can prove that this definition is equivalent to the following: $K$ coincides with a $C^{\infty}(\mathbb{R}^{n}\setminus\left\{0\right\})$ function away from the origin, satisfies the differential estimate, and $\widehat{K}\in L^{\infty}$. We will use this equivalence to take care of the cancellation condition.

Let $\eta\in\mathcal{S}(\mathbb{R}^{n})$. Recall that, by definition of a Schwartz function, for any $N>0$ and any multi-index $\alpha$ there exists a constant $C_{N,\alpha}>0$ such that $\left|\partial^{\alpha}\eta(x)\right|\leq C_{N,\alpha}(1+\left|x\right|)^{-N}$. Evidently, $\eta K$ coincides with the $C^{\infty}$ function $\eta k$ on $\mathbb{R}^{n}\setminus\left\{0\right\}$. For the cancellation condition, the convolution theorem for tempered distributions gives $\widehat{\eta K}=\widehat{\eta}\ast\widehat{K}$, the convolution of an $L^{1}$ function with an $L^{\infty}$ function. By Young's inequality, $\widehat{\eta}\ast\widehat{K}\in L^{\infty}$.
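Spelled out, the Young's inequality step here is the endpoint case $p=1$, $q=\infty$: since $\eta\in\mathcal{S}$ implies $\widehat{\eta}\in\mathcal{S}\subset L^{1}$,

$$\left\|\widehat{\eta}\ast\widehat{K}\right\|_{L^{\infty}}\leq\left\|\widehat{\eta}\right\|_{L^{1}}\left\|\widehat{K}\right\|_{L^{\infty}}<\infty,$$

which follows directly from $\left|(\widehat{\eta}\ast\widehat{K})(\xi)\right|\leq\int\left|\widehat{\eta}(y)\right|\left|\widehat{K}(\xi-y)\right|dy\leq\left\|\widehat{K}\right\|_{L^{\infty}}\left\|\widehat{\eta}\right\|_{L^{1}}$.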

For the differential estimate, we use the Leibniz rule to obtain \begin{align*} \left|\partial^{\alpha}(\eta k)(x)\right|&=\left|\sum_{\beta\leq\alpha}{\alpha\choose\beta}(\partial^{\beta}\eta)(x)(\partial^{\alpha-\beta}k)(x)\right|\\ &\leq\sum_{\beta\leq\alpha}{\alpha\choose\beta}\left|(\partial^{\beta}\eta)(x)\right|\left|(\partial^{\alpha-\beta}k)(x)\right|\\ &\leq\sum_{\beta\leq\alpha}{\alpha\choose\beta}C_{N,\beta}(1+\left|x\right|)^{-N}\left|x\right|^{-n-\left|\alpha\right|+\left|\beta\right|}\\ &\leq C_{N,\alpha}\sum_{\beta\leq\alpha}(1+\left|x\right|)^{-N}\left|x\right|^{-n-\left|\alpha\right|+\left|\beta\right|} \end{align*} for all $x\in\mathbb{R}^{n}\setminus\left\{0\right\}$ and any fixed $N>0$, where $C_{N,\alpha}:=\max_{\beta\leq\alpha}{\alpha\choose\beta}C_{N,\beta}$.

If $\left|x\right|\leq 1$, then $\left|x\right|^{-n-\left|\alpha\right|+\left|\beta\right|}\leq\left|x\right|^{-n-\left|\alpha\right|}$. Noting that $(1+\left|x\right|)^{-N}\leq 1$ for all $x\in\mathbb{R}^{n}$, we get $$\left|\partial^{\alpha}(\eta k)(x)\right|\leq\sum_{\beta\leq\alpha}C_{N,\alpha}\left|x\right|^{-n-\left|\alpha\right|}\leq C\left|x\right|^{-n-\left|\alpha\right|}, \quad\forall\, 0<\left|x\right|\leq 1$$ where $C>0$ is a constant depending on $N$ and $\alpha$.
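As an illustrative numerical check (not part of the original argument), one can verify the target estimate for a concrete choice of $n=1$, $\eta(x)=e^{-x^{2}}$, and $k(x)=1/x$ (all my own choices) using sympy:

```python
import sympy as sp

x = sp.symbols('x', positive=True)
eta = sp.exp(-x**2)   # a concrete Schwartz function (my choice, for illustration)
k = 1 / x             # the 1-D Hilbert transform kernel, a Calderon-Zygmund kernel
n, alpha = 1, 1       # dimension and derivative order

deriv = sp.diff(eta * k, x, alpha)
# The differential estimate claims |d^alpha(eta*k)(x)| <= C |x|^(-n-|alpha|),
# i.e. the ratio |deriv(x)| * x^(n+alpha) should stay bounded on (0, oo).
grid = [1e-3, 1e-1, 0.7, 1.0, 3.0, 10.0]
ratios = [abs(float(deriv.subs(x, t))) * t**(n + alpha) for t in grid]
print(max(ratios))    # stays bounded (around 1.21 for this choice)
```

Here $\partial(\eta k)(x)=-e^{-x^{2}}(2+x^{-2})$, so the ratio equals $e^{-x^{2}}(1+2x^{2})$, which is indeed bounded: near the origin it tends to $1$, and the Gaussian decay dominates at infinity.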

If $\left|x\right|\geq 1$, then $\left|x\right|^{-n-\left|\alpha\right|}\leq\left|x\right|^{-n-\left|\alpha\right|+\left|\beta\right|}$, so the factor $(1+\left|x\right|)^{-N}$ must absorb the excess: take $N\geq\left|\alpha\right|$. Since $(1+\left|x\right|)^{-N}\leq\left|x\right|^{-N}$ and $N-\left|\beta\right|\geq 0$ for all $\beta\leq\alpha$, we get $$\left|\partial^{\alpha}(\eta k)(x)\right|\leq\sum_{\beta\leq\alpha}C_{N,\alpha}\left|x\right|^{-n-\left|\alpha\right|+\left|\beta\right|-N}\leq C\left|x\right|^{-n-\left|\alpha\right|},\quad\forall\left|x\right|\geq 1$$
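Combining the two regimes (with $N\geq\left|\alpha\right|$ fixed once and for all), we obtain for each multi-index $\alpha$

$$\left|\partial^{\alpha}(\eta k)(x)\right|\leq C_{\alpha}\left|x\right|^{-n-\left|\alpha\right|},\qquad\forall x\in\mathbb{R}^{n}\setminus\left\{0\right\},$$

which together with the smoothness of $\eta k$ away from the origin and $\widehat{\eta K}\in L^{\infty}$ shows, via the equivalence quoted above, that $\eta K$ is a Calderón-Zygmund distribution.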