Question on Distribution

If $T$ is a distribution on $\mathbb{R}$ such that $T(x^2 g) = 0$ for all $g \in \mathcal{D}(\mathbb{R}),$ then there exist constants $a$ and $b$ such that $T = a\,\delta + b\,\delta'.$ (I define the derivative of a distribution by $T'(f) = -T(f').$ I am stuck on how to prove this.)
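As a quick sanity check on the statement (the easy converse direction): any $T = a\,\delta + b\,\delta'$ does annihilate $x^2 g$ for every $g \in \mathcal{D}(\mathbb{R}),$ since $$(a\,\delta + b\,\delta')(x^2 g) = a\,(x^2 g)(0) - b\,(x^2 g)'(0) = a \cdot 0 - b\,\bigl(2x\,g(x) + x^2 g'(x)\bigr)\big|_{x=0} = 0.$$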
Let $f \in \mathcal{D}(\mathbb{R})$ and take $\rho \in \mathcal{D}(\mathbb{R})$ with $\rho \equiv 1$ in a neighborhood of $0.$ Define $$\hat{f}(x) := f(x) - \left( f(0) + f'(0)\,x \right) \rho(x).$$ Then $\hat{f} \in \mathcal{D}(\mathbb{R})$ with $\hat{f}(0) = 0$ and $\hat{f}'(0) = 0.$

By Taylor's theorem with integral remainder, there exists $h \in \mathcal{D}(\mathbb{R})$ such that $\hat{f}(x) = x^2 h(x)$: the function defined by $h(x) := \hat{f}(x)/x^2$ for $x \neq 0$ and $h(0) := \tfrac12 \hat{f}''(0)$ is smooth (see the explicit formula below), and it has compact support because it vanishes wherever $\hat{f}$ does. Hence the assumption on $T$ gives $T(\hat{f}) = T(x^2 h) = 0.$

Therefore $$\begin{aligned} T(f) &= T(\hat{f}) + f(0)\,T(\rho) + f'(0)\,T(x\rho) \\ &= 0 + \delta(f)\,T(\rho) - \delta'(f)\,T(x\rho) \\ &= a\,\delta(f) + b\,\delta'(f), \end{aligned}$$ where $a = T(\rho)$ and $b = -T(x\rho).$ Since $f \in \mathcal{D}(\mathbb{R})$ was arbitrary, $T = a\,\delta + b\,\delta'.$
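For completeness, here is one standard way (a minimal sketch of the Hadamard-lemma argument used above) to see that $h$ is smooth. Since $\hat{f}(0) = \hat{f}'(0) = 0,$ Taylor's theorem with integral remainder and the substitution $s = tx$ give $$\hat{f}(x) = \int_0^x (x - s)\,\hat{f}''(s)\,ds = x^2 \int_0^1 (1 - t)\,\hat{f}''(tx)\,dt,$$ so $h(x) = \int_0^1 (1 - t)\,\hat{f}''(tx)\,dt$ is smooth by differentiation under the integral sign, and it agrees with $\hat{f}(x)/x^2$ for $x \neq 0.$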