A functional equation with conditions


Consider the functional equation $$x \, f'(x) + a \, f(-x) = x.$$ For $a = 0$ the equation reduces to $f'(x) = 1$, so $f(x) = x + c_{0}$, which yields a family of lines. When $a \neq 0$, an expansion of the form $$f(x) = \sum_{n=0}^{\infty} \alpha_{n} \, x^{n}$$ leads to $$f(x) = \frac{x}{1-a}.$$ This solution is not valid when $a=1$.
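That candidate solution is easy to confirm symbolically. A quick check (a sketch, assuming the `sympy` library is available; not part of the original question):

```python
# Verify that f(x) = x/(1-a) satisfies x f'(x) + a f(-x) = x for a != 1.
import sympy as sp

x, a = sp.symbols('x a')
f = x / (1 - a)                          # candidate from the series expansion

lhs = x * sp.diff(f, x) + a * f.subs(x, -x)
print(sp.simplify(lhs - x))              # 0: the equation holds identically
```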

The question is: what are some methods to determine a solution of the functional equation in the remaining case $a = 1$, $$x \, f'(x) + f(-x) = x \hspace{5mm}? $$


There are 2 best solutions below


This isn't a general method, but it rules out polynomial solutions beyond the ones you found.

Let $P(x) = \sum c_j x^j$. Then $x P'(x) = \sum j c_j x^j$ and $a P(-x) = \sum a c_j (-1)^j x^j$.

Then the requirement $x P'(x) + a P(-x) = x$ gives $j c_j + ac_j (-1)^j = 1$ for $j = 1$ and $j c_j + ac_j (-1)^j = 0$ otherwise.

For $j = 0$ the condition reads $a c_0 = 0$, so $c_0 = 0$ whenever $a \neq 0$ (and $c_0$ is free when $a = 0$). For $j = 1$, assuming $a \neq 1$, it gives $c_1 = \frac{1}{1-a}$. For each remaining $j$, we require either $c_j = 0$ or $j = -a(-1)^j$; the latter can hold for at most one integer $j$ determined by $a$, and for generic $a$ it holds for none, so all remaining $c_j$ vanish. Note that when $a = 1$ the $j = 1$ condition becomes $c_1(1-1) = 1$, which is impossible, so in that case there is no polynomial solution at all.

EDIT: Sorry, didn't notice your question at the end was slightly different.

Note that $f$ cannot be analytic: if it were, the coefficient of $x$ in the series representation of $x f'(x) + f(-x)$ would be $f'(0) - f'(0) = 0$, whereas $x f'(x) + f(-x) = x$ requires it to equal $1$.

I can glean very little information about this function, apart from $f(0) = 0$ (set $x = 0$ in the equation). Rearranging the equation gives $f(-x) = x(1 - f'(x))$, hence $f(h) = -h(1 - f'(-h))$ and $f(-h) = h(1 - f'(h))$. If the derivative were continuous at $0$ we would have

$$f'(0) = \lim_{h \to 0} \frac{f(h) - f(-h)}{2h} = \lim_{h \to 0} \frac{-h(1-f'(-h)) - h(1-f'(h))}{2h} = \lim_{h \to 0} \frac{1}{2}\left(f'(-h)+f'(h)\right)-1 = f'(0) - 1,$$

a contradiction, so $f'$ cannot be continuous at $0$. I'll update as I find a little more.


Interesting that this simple question has been left unanswered for so long. As noted in the other answer, a full solution to the equation cannot be analytic around $x=0$; this means that if one assumes it holds on all of $\mathbb{R}$, one runs into contradictions. Let us assume for now that the equation holds on $\mathbb{R}\setminus\{0\}$. In this case the general solution to the equation may be constructed, first denoting

$$f(x)=\begin{cases}g(x), & x>0\\ h(-x), & x<0\end{cases}\qquad g,h:\mathbb{R}^{+}\to\mathbb{R}$$

With this notation one sees that the two functions satisfy the system of coupled ODEs:

$$xg'(x)+ah(x)=x\\xh'(x)+ag(x)=-x$$

Then one can also easily show that the two functions satisfy the differential equations

$$x^2g''(x)+xg'(x)-a^2g(x)=x(1+a)\\x^2h''(x)+xh'(x)-a^2h(x)=-x(1+a)$$
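The elimination step behind the first of these Euler equations can be checked symbolically. A sketch (assuming `sympy`), solving the first coupled equation for $h$ and substituting into the second:

```python
# Check that eliminating h from the coupled system yields
#   x^2 g'' + x g' - a^2 g = x(1+a).
import sympy as sp

x, a = sp.symbols('x a', positive=True)
g = sp.Function('g')

# From x g'(x) + a h(x) = x, solve for h:
h = (x - x * sp.diff(g(x), x)) / a

# Substitute into x h'(x) + a g(x) = -x (multiplied through by a):
residual = a * (x * sp.diff(h, x) + a * g(x) + x)

# residual should equal -(x^2 g'' + x g' - a^2 g - x(1+a)):
euler = x**2 * sp.diff(g(x), x, 2) + x * sp.diff(g(x), x) - a**2 * g(x) - x * (1 + a)
print(sp.simplify(residual + euler))     # 0
```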

and hence the general solution can be written as

$$g(x)=C_1x^a+C_2x^{-a}+\frac{x}{1-a}\\h(x)=-C_1x^a+C_2x^{-a}-\frac{x}{1-a}$$

for $a\neq 1$ and some constants $C_1,C_2\in \mathbb{R}$ to be determined by the initial conditions. More succinctly, the (weak) solution can be written as

$$f(x)=C_1 \text{sgn}(x)|x|^a+C_2|x|^{-a}+\frac{x}{1-a}$$
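This closed form can be confirmed branch by branch. A sketch (assuming `sympy`), writing $g$ and $h$ explicitly for $x>0$ and checking both coupled equations:

```python
# Verify that g, h solve the coupled system x g' + a h = x, x h' + a g = -x.
import sympy as sp

x = sp.symbols('x', positive=True)
a, C1, C2 = sp.symbols('a C1 C2')

g = C1 * x**a + C2 * x**(-a) + x / (1 - a)    # f(x) on x > 0
h = -C1 * x**a + C2 * x**(-a) - x / (1 - a)   # f(-x) for x > 0

print(sp.simplify(x * sp.diff(g, x) + a * h - x))   # 0
print(sp.simplify(x * sp.diff(h, x) + a * g + x))   # 0
```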

Having cleared up the details of the general solution for all $a\neq 1$, the requested particular solution can now be constructed in a variety of ways (e.g. variation of parameters, or defining an IVP for $a\neq 1$ and taking the limit $a\to 1$), which shows that for $a=1$ one can verify that

$$f(x)=C_1 x+\frac{C_2}{|x|}+x\ln |x|$$
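This $a=1$ solution can likewise be verified on the branch $x>0$, where $|x|=x$ (a sketch assuming `sympy`; the $x<0$ branch follows by the same substitution):

```python
# Verify that f(x) = C1 x + C2/|x| + x ln|x| solves x f'(x) + f(-x) = x
# on the branch x > 0.
import sympy as sp

x = sp.symbols('x', positive=True)
C1, C2 = sp.symbols('C1 C2')

f_pos = C1 * x + C2 / x + x * sp.log(x)       # f(x) for x > 0, |x| = x
f_neg = -C1 * x + C2 / x - x * sp.log(x)      # f(-x) for x > 0, |-x| = x

print(sp.simplify(x * sp.diff(f_pos, x) + f_neg - x))   # 0
```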

solves the ODE. Finally, note that the weak solutions we have constructed are also unique (by uniqueness for the linear system of ODEs) for any set of initial values given at some $x_0\neq 0$.