Which of the following is a solution of $u(x) = x + \int_{0}^{x} (t-x)u(t)dt$?
(A) $\sin x$
(B) $x \cos x$
(C) $\ln (x+1)$
(D) $x e^{-x}$
(E) $xe^x$
Since all of the choices are twice differentiable, the first thing that came to mind was to differentiate and apply Leibniz's rule, then differentiate again and use the fundamental theorem of calculus. Doing this gave me $\frac{u''(x)}{u(x)} = x-1$, but I don't remember how to solve differential equations. Does anyone know the solution to this?
Also, what is the more orthodox way of solving this? Keep in mind that this is a practice problem for the GRE, so it has to be done within about two minutes, which seems rather absurd to me.
I also realized, after having already typed this up, that none of the choices appear to satisfy the differential equation. What is wrong with my method? I probably made a stupid calculation error, which I fear I may do quite a lot on the GRE (I hate these sorts of tests...)
EDIT: Here is my calculation. First, the partial derivative with respect to $x$ of the integrand is $\frac{\partial}{\partial x}\big[t\,u(t)\big] - \frac{\partial}{\partial x}\big[x\,u(t)\big] = 0 - u(t) = -u(t)$. Taking the first derivative, we get
$$u'(x) = 1 + (x-x)u(x) + \int_{0}^{x} (-u(t))dt= 1 - \int_{0}^{x} u(t)dt$$
Hence the second derivative is
$$u''(x) = -u(x),$$
and setting $x = 0$ in the original equation and in the expression for $u'$ gives $u(0) = 0$ and $u'(0) = 1$.
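As a sanity check on the choices (not part of the original post), one can evaluate the right-hand side of the integral equation numerically for each candidate and compare it to the candidate itself. This is a stdlib-only sketch using the trapezoid rule; the helper name `integral_rhs` is made up for illustration:

```python
import math

def integral_rhs(u, x, n=1000):
    # Numerically evaluate x + \int_0^x (t - x) u(t) dt by the trapezoid rule.
    h = x / n
    total = 0.0
    for i in range(n + 1):
        t = i * h
        w = 0.5 if i in (0, n) else 1.0
        total += w * (t - x) * u(t)
    return x + total * h

candidates = {
    '(A) sin x': math.sin,
    '(B) x cos x': lambda x: x * math.cos(x),
    '(C) ln(x+1)': lambda x: math.log(x + 1),
    '(D) x e^-x': lambda x: x * math.exp(-x),
    '(E) x e^x': lambda x: x * math.exp(x),
}
for name, u in candidates.items():
    # Residual |u(x) - RHS(x)| over a few sample points; only the true
    # solution should give a residual near zero (up to quadrature error).
    err = max(abs(integral_rhs(u, x) - u(x)) for x in (0.5, 1.0, 2.0))
    print(f"{name}: max residual {err:.2e}")
```

Only $\sin x$ gives a residual at the level of the quadrature error; the other four are off by $O(1)$ amounts.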
A quicker kill, noting that $u(0) = 0$: the Laplace transform.
Transforming both sides yields
$$ U(s) = \frac{1}{s^2} - \frac{U(s)}{s^2} $$
(I rewrote the integral as $-\int_0^x (x-t)\,u(t)\,dt$, i.e. minus the convolution of $x$ with $u$, so its transform is $-\mathcal{L}\{x\}(s)\,U(s) = -U(s)/s^2$.)
Now, rearrange to get
$$ U(s) \left( 1 + \frac{1}{s^2} \right) = \frac{1}{s^2} $$
and some division gives
$$ U(s) = \frac{1}{s^2 + 1} $$
giving the desired solution $u(x) = \sin x$.
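To double-check the two transform facts used above (not in the original answer), here is a stdlib-only numeric sketch: it approximates $\mathcal{L}\{f\}(s) = \int_0^\infty e^{-st} f(t)\,dt$ by truncated trapezoid quadrature and compares against the claimed closed forms. The helper name `laplace` is made up for illustration:

```python
import math

def laplace(f, s, T=40.0, n=200000):
    # Approximate \int_0^T e^{-s t} f(t) dt by the trapezoid rule;
    # the truncated tail is negligible for s >= 1 and T = 40.
    h = T / n
    total = 0.0
    for i in range(n + 1):
        t = i * h
        w = 0.5 if i in (0, n) else 1.0
        total += w * math.exp(-s * t) * f(t)
    return total * h

for s in (1.0, 2.0):
    # L{sin}(s) should equal 1/(s^2 + 1).
    print(s, laplace(math.sin, s), 1 / (s**2 + 1))
    # The convolution \int_0^x (x-t) sin t dt equals x - sin x, and its
    # transform should be U(s)/s^2 = 1/s^2 - 1/(s^2 + 1).
    print(s, laplace(lambda t: t - math.sin(t), s), 1 / s**2 - 1 / (s**2 + 1))
```

Both printed pairs agree to several decimal places, confirming the convolution step and the inversion $U(s) = \frac{1}{s^2+1} \mapsto \sin x$.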