For $\frac{1}{a^2+x^2}, \exists K>0$ such that $|f(x)|+|f'(x)| \leq \frac{K}{1+x^2} \; \forall x \in \mathbb{R}$


Given $a>0$, for $f(x) = \frac{1}{a^2+x^2}, \exists K>0$ such that $|f(x)|+|f'(x)| \leq \frac{K}{1+x^2} \; \forall x \in \mathbb{R}$

This is a property that my PDE book uses without proof. It is supposed to be true, but I'm failing to verify it.

I got:

$f'(x) = \displaystyle \frac{-2x}{(a^2+x^2)^2}$, then $|f(x)|+|f'(x)| = \frac{1}{a^2+x^2} + \frac{2|x|}{(a^2+x^2)^2} = \frac{a^2+2|x|+x^2}{(a^2+x^2)^2} $. What can be this constant $K$?
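As a numerical sanity check (not a proof), here is a short Python sketch that samples $g(x) = (1+x^2)\,(|f(x)|+|f'(x)|)$ on a wide grid and reports its maximum for a few values of $a$; the grid and the sample values of $a$ are my own choices.

```python
import numpy as np

def g(x, a):
    # g(x) = (1 + x^2) * (|f(x)| + |f'(x)|) for f(x) = 1 / (a^2 + x^2)
    f = 1.0 / (a**2 + x**2)
    fp = -2.0 * x / (a**2 + x**2) ** 2
    return (1.0 + x**2) * (np.abs(f) + np.abs(fp))

# Sample g on a wide grid; the observed maximum is a candidate value for K(a).
for a in (0.5, 1.0, 3.0):
    x = np.linspace(-100.0, 100.0, 200001)
    print(f"a = {a}: sup g over the grid is about {g(x, a).max():.4f}")
```

For $a=1$ one can even check by hand that $g(x) = 1 + \frac{2|x|}{1+x^2} \le 2$, with equality at $x = \pm 1$, which the sampled maximum agrees with.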

Exercise photo from the book: [image not included]

It says: Apply the above formula (i) to the function $f(x) = (a^2+x^2)^{-1}$ to obtain the series below. The formula works when I apply it to this function, but why does the hypothesis of (i) hold for this $f$?

1 Answer

Best Answer

Let $g(x):=(1+x^2)(|f(x)|+|f'(x)|)$. Then $g$ is continuous on $\mathbb R$, since the denominator $a^2+x^2$ never vanishes (here $a>0$), so $g$ has no vertical asymptote. Moreover $\lim_{x\to\infty} g(x)$ and $\lim_{x\to-\infty} g(x)$ are finite: indeed $(1+x^2)|f(x)| = \frac{1+x^2}{a^2+x^2} \to 1$ and $(1+x^2)|f'(x)| = \frac{2|x|(1+x^2)}{(a^2+x^2)^2} \to 0$ as $|x|\to\infty$. A continuous function with finite limits at $\pm\infty$ is bounded, so we conclude that there exists $K=K(a) > 0$ such that $$g(x) \leq K \quad \forall x \in \mathbb R,$$ or equivalently $$ |f(x)|+|f'(x)| \leq \frac{K}{1+x^2} \quad \forall x \in \mathbb R. $$
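If one wants an explicit constant rather than a pure existence argument, here is one possible choice (a sketch; write $m := \min(a,1)$). Since $m \le a$ and $m \le 1$, we have $a^2+x^2 \ge m^2(1+x^2)$ for all $x$, hence $$|f(x)| = \frac{1}{a^2+x^2} \le \frac{1}{m^2(1+x^2)}.$$ Also $2|x| \le 1+x^2$ for all real $x$, so $$|f'(x)| = \frac{2|x|}{(a^2+x^2)^2} \le \frac{1+x^2}{m^4(1+x^2)^2} = \frac{1}{m^4(1+x^2)}.$$ Adding the two bounds shows that $$K = \frac{1}{m^2} + \frac{1}{m^4}, \qquad m = \min(a,1),$$ works.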