If $u(x) = \int_1^x \sin(x-t)t^2 dt$, then $u'' + u -x^2 = 0$


Suppose $u(x) = \int_1^x \sin(x-t)\,t^2\,dt$; verify that $u''+u - x^2 = 0$.

I know how to verify the equation, but I am curious whether there is a faster way of doing it (this is a practice problem for the GRE Subject Test). My approach was to first split the $\sin(x-t)$ term:

$$u(x) = \int_1^x \sin(x-t) t^2dt=\sin(x)\int_1^x\cos(t)t^2 dt-\cos(x)\int_1^x\sin(t)t^2dt$$

After that we differentiate twice, applying the product rule and the Fundamental Theorem of Calculus, and collect terms.
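As a sanity check (not something you would do under GRE time pressure), the identity can be verified symbolically. A sketch, assuming SymPy is available:

```python
import sympy as sp

x, t = sp.symbols('x t', real=True)

# u(x) = integral from 1 to x of sin(x - t) * t^2 dt, in closed form
u = sp.integrate(sp.sin(x - t) * t**2, (t, 1, x))

# The residual u'' + u - x^2 should simplify to exactly 0
residual = sp.simplify(sp.diff(u, x, 2) + u - x**2)
print(residual)  # 0
```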

Accepted answer:

A slightly faster approach would be to use the general Leibniz rule for differentiating an integral with a variable limit, which quickly yields $u^{\prime\prime}(x)=x^2-u(x)$.
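This can be mirrored symbolically: if the integral is kept unevaluated, SymPy applies the Leibniz rule when differentiating it, and the boundary terms $\sin(x-x)x^2 = 0$ and $\cos(x-x)x^2 = x^2$ fall out automatically. A sketch, assuming SymPy:

```python
import sympy as sp

x, t = sp.symbols('x t', real=True)

# Keep u as an unevaluated Integral so diff() uses the Leibniz rule
U = sp.Integral(sp.sin(x - t) * t**2, (t, 1, x))

# Two applications of the Leibniz rule; the first boundary term
# sin(x - x) * x^2 vanishes, the second gives cos(x - x) * x^2 = x^2
U2 = sp.diff(U, x, 2)

# u'' + u - x^2 should vanish once the remaining integrals are evaluated
residual = sp.simplify((U2 + U - x**2).doit())
print(residual)  # 0
```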

Another answer:

We can use the following form of Duhamel's principle, applied to constant coefficient equations:

Duhamel's Principle: Let $L$ be a constant coefficient differential operator of order $n$. Suppose $v:\mathbb{R}\rightarrow\mathbb{R}$ solves $Lv=0$, with $\frac{d^iv}{dx^i}(0) = 0$ for $0\le i < n-1$ and $\frac{d^{n-1}v}{dx^{n-1}}(0) = 1$. Then, for continuous $g:[a,b]\rightarrow\mathbb{R}$, the function $u:[a,b]\rightarrow\mathbb{R}$ defined by $$ u(x) = \int\limits_{a}^{x}{v(x-t)g(t)\text{ d}t} $$ satisfies $Lu = g$ on $[a,b]$.

This can be proven by using Leibniz's rule to differentiate the integral.

Noting that $v(x) = \sin x$ solves $v''+v=0$, with $v(0)=0$ and $v'(0)=1$, it follows that $u(x) = \int\limits_{1}^{x}{v(x-t)t^2\text{ d}t}$ solves $u''+u = x^2$, as desired.

How would one ever think of this? By trying to solve too many inhomogeneous ODEs.
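To see the principle at work on a different operator (my own example, not taken from the answer above): $v(x) = \sinh x$ solves $v'' - v = 0$ with $v(0) = 0$ and $v'(0) = 1$, so Duhamel's principle predicts that $u(x) = \int_1^x \sinh(x-t)\,t^2\,dt$ solves $u'' - u = x^2$. A sketch, assuming SymPy:

```python
import sympy as sp

x, t = sp.symbols('x t', real=True)

# v(x) = sinh(x) solves v'' - v = 0 with v(0) = 0, v'(0) = 1,
# so Duhamel's principle predicts u'' - u = x^2 for this u
u = sp.integrate(sp.sinh(x - t) * t**2, (t, 1, x))

residual = sp.simplify(sp.diff(u, x, 2) - u - x**2)
print(residual)  # 0
```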

Another answer:

It's similar to the product rule:

$$u(x) = \int_1^x \sin(x-t)t^2 dt$$

$$u'(x) = \int_1^x \cos(x-t)t^2 dt + \sin(x-x)x^2 = \int_1^x \cos(x-t)t^2 dt$$

$$u''(x) = \int_1^x -\sin(x-t)t^2 dt + \cos(x-x)x^2 = \int_1^x -\sin(x-t)t^2 dt + x^2 = -u+x^2$$

QED
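These two differentiation steps can also be spot-checked numerically with finite differences (a sketch assuming SciPy; the test point $x_0 = 2$ and the step sizes are arbitrary choices):

```python
import math
from scipy.integrate import quad

def u(x):
    # u(x) = integral from 1 to x of sin(x - t) * t^2 dt
    return quad(lambda t: math.sin(x - t) * t**2, 1, x)[0]

def u_prime(x):
    # the claimed first derivative: the boundary term vanishes,
    # leaving the integral of cos(x - t) * t^2
    return quad(lambda t: math.cos(x - t) * t**2, 1, x)[0]

x0 = 2.0

# central difference approximation of u'(x0)
h1 = 1e-4
fd1 = (u(x0 + h1) - u(x0 - h1)) / (2 * h1)
assert abs(fd1 - u_prime(x0)) < 1e-5

# central second difference approximation of u''(x0);
# it should match u'' = x^2 - u
h2 = 1e-3
fd2 = (u(x0 + h2) - 2 * u(x0) + u(x0 - h2)) / h2**2
assert abs(fd2 - (x0**2 - u(x0))) < 1e-3
```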