Inverse polynomial decay of arbitrary derivatives of certain functions


In Sogge's *Lectures on Nonlinear Wave Equations*, Chapter II, Section 1, Proposition 1.1 expresses the radial vector field $\partial_r$ as a linear combination of conformal Killing fields in Minkowski spacetime. It is claimed that $$ (t - r)\partial_r = a_0(t, x)S + \sum_{i = 1}^n a_i(t, x) \Omega_{0i}, $$ where $S = x^\mu \partial_\mu$ is the scaling field and $\Omega_{0i} = x_0 \partial_i - x_i \partial_0$ is a Lorentz boost (with indices lowered by the Minkowski metric, this is $\Omega_{0i} = t\partial_i + x_i\partial_t$). The coefficients are explicitly $$ a_0(t, x) = \frac{-r}{r + t}, \qquad a_i(t, x) = \frac{tx_i}{r(r + t)}, $$ where $r = |x|$.
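For what it's worth, the decomposition itself can be sanity-checked symbolically. This is my own sketch (not from the book), taking $n = 3$ and the boosts in the form $\Omega_{0i} = t\partial_i + x_i\partial_t$:

```python
# Sanity check (my own, not from the original post): verify
# (t - r) d_r f = a_0 S f + sum_i a_i Omega_{0i} f for a generic f, n = 3.
import sympy as sp

t, x1, x2, x3 = sp.symbols('t x1 x2 x3', positive=True)
x = [x1, x2, x3]
r = sp.sqrt(x1**2 + x2**2 + x3**2)

a0 = -r / (r + t)
a = [t * xi / (r * (r + t)) for xi in x]

# Apply each vector field to a generic test function f(t, x).
f = sp.Function('f')(t, x1, x2, x3)
S = t * sp.diff(f, t) + sum(xi * sp.diff(f, xi) for xi in x)   # scaling field
Omega = [t * sp.diff(f, xi) + xi * sp.diff(f, t) for xi in x]  # boosts t d_i + x_i d_t
dr = sum(xi / r * sp.diff(f, xi) for xi in x)                  # radial derivative

lhs = (t - r) * dr
rhs = a0 * S + sum(ai * Om for ai, Om in zip(a, Omega))

# The coefficient of every first-order derivative of f must cancel.
expr = sp.expand(lhs - rhs)
for deriv in [sp.diff(f, t)] + [sp.diff(f, xi) for xi in x]:
    assert sp.simplify(expr.coeff(deriv)) == 0
```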

Question: It is claimed that the $a_\mu$ satisfy bounds of the form $$ |\partial^\alpha a_\mu(t, x)| \leq C_\alpha (t + |x|)^{-|\alpha|} $$ for every multi-index $\alpha$, in the region $|x| > \delta t$ for some fixed $\delta > 0$. How would one go about proving this? It seems one must find an inductive formula for the derivatives, but this quickly becomes incredibly messy. Is there a simpler approach? (One possibility might be to exploit the radial symmetry, although $a_i$ is not fully radially symmetric.) Is there a standard reference for this type of inverse polynomial bound on derivatives?
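One observation (my own, not from Sogge): both $a_0$ and $a_i$ are homogeneous of degree zero in $(t, x)$, so $\partial^\alpha a_\mu$ is homogeneous of degree $-|\alpha|$, and it would suffice to bound it on the compact slice $\{t + |x| = 1,\ |x| \geq \delta t\}$, where the $a_\mu$ are smooth. The homogeneity can at least be sanity-checked with SymPy (two space dimensions for brevity, and a sample derivative of order $3$):

```python
# Sketch check (my addition): a_0 is homogeneous of degree 0, so a derivative
# of order |alpha| = 3 should be homogeneous of degree -3, i.e.
# (d^alpha a_0)(lam*t, lam*x) = lam**(-3) * (d^alpha a_0)(t, x).
import sympy as sp

t, x1, x2, lam = sp.symbols('t x1 x2 lambda', positive=True)
r = sp.sqrt(x1**2 + x2**2)
a0 = -r / (r + t)

# A sample mixed derivative of order 3: one t-derivative, two x1-derivatives.
d = sp.diff(a0, t, 1, x1, 2)

# Compare d at the rescaled point (lam*t, lam*x) against lam**(-3) * d,
# numerically at a sample point inside the region r > delta*t.
scaled = d.subs({t: lam * t, x1: lam * x1, x2: lam * x2})
diff_at_point = (scaled - lam**(-3) * d).subs({t: 1.3, x1: 0.7, x2: 0.4, lam: 2.5})
assert abs(float(diff_at_point)) < 1e-12
```

The same scaling argument applies verbatim to each $a_i$, since $tx_i/(r(r+t))$ is also homogeneous of degree zero away from $r = 0$.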