Power series function expansion as solution for integral equation


I'm facing an integral equation whose unknown is a function $f(x)$.

The equation is of the form:

$$ K = \int_{-l}^{l} G(x,s)f(s)ds $$

So it's a Fredholm integral equation of the first kind, which is rewritten in this way:

$$ K= f(x)\int_{-l}^{l}G(x,s)ds + \int_{-l}^{l} G(x,s)(f(s)-f(x))ds $$

The authors of a paper now employ an asymptotic expansion of the function $f(x)$ in powers of a small parameter $\epsilon$:

$$ f(x)= \epsilon f^{(1)}(x) + \epsilon^{2}f^{(2)}(x)+O(\epsilon^{3}) $$

How does a solution come from this expansion? The unknowns are now $f^{(1)}(x)$ and $f^{(2)}(x)$.

Do you have any useful references?

Does it mean that when I'm solving for $f^{(1)}(x)$ I have to consider only the terms of order $\epsilon$ and ignore all the others?


This sort of asymptotic expansion is very common. If $a_0 + \epsilon a_1 + \epsilon^2 a_2 + \epsilon^3 a_3 + \ldots = 0$ (where $a_0, a_1, a_2, a_3, \ldots$ don't depend on $\epsilon$), you must have $a_0 = 0$, $a_1 = 0$, $a_2 = 0$, $a_3 = 0, \ldots$. So instead of one equation, you have a whole sequence of equations to solve. This is useful when each of those equations is easy to solve. Typically, the equation at order $\epsilon^i$ involves one new unknown, here $f^{(i)}$, and hopefully it is easy to solve for $f^{(i)}$ in terms of the previously found $f^{(j)}$ with $j < i$. So yes: when solving for $f^{(1)}(x)$, you collect only the terms of order $\epsilon$; the higher-order terms are handled by the later equations.
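To make the order-matching concrete for the Fredholm equation in the question: suppose (this is an assumption about the paper's scaling, which the question doesn't state) that the data also expand as $K = \epsilon K^{(1)} + \epsilon^2 K^{(2)} + O(\epsilon^3)$ and $G(x,s) = G_0(x,s) + \epsilon G_1(x,s) + O(\epsilon^2)$. Substituting $f(x) = \epsilon f^{(1)}(x) + \epsilon^2 f^{(2)}(x) + O(\epsilon^3)$ and collecting powers of $\epsilon$ gives one equation per order:

$$ O(\epsilon):\quad K^{(1)} = \int_{-l}^{l} G_0(x,s) f^{(1)}(s)\,ds $$

$$ O(\epsilon^2):\quad K^{(2)} = \int_{-l}^{l} G_0(x,s) f^{(2)}(s)\,ds + \int_{-l}^{l} G_1(x,s) f^{(1)}(s)\,ds $$

so the first equation determines $f^{(1)}$, which then appears as known data in the equation for $f^{(2)}$.

The same mechanics can be checked on a toy algebraic problem with sympy; here the made-up equation $x = a + \epsilon x^2$ (not the paper's equation) plays the role of the integral equation:

```python
import sympy as sp

# Toy example: solve x = a + eps*x**2 perturbatively by expanding
# x = x0 + eps*x1 + eps**2*x2 + O(eps**3) and requiring the
# coefficient of each power of eps to vanish.
eps, a = sp.symbols('epsilon a')
x0, x1, x2 = sp.symbols('x0 x1 x2')

x = x0 + eps*x1 + eps**2*x2
residual = sp.expand(x - a - eps*x**2)

# One equation per order: the coefficient of eps**n must be zero.
eqs = [residual.coeff(eps, n) for n in range(3)]

# Solve the hierarchy order by order, feeding earlier solutions forward.
sol = {}
for unknown, eq in zip((x0, x1, x2), eqs):
    sol[unknown] = sp.solve(eq.subs(sol), unknown)[0]

print(sol)  # {x0: a, x1: a**2, x2: 2*a**3}
```

Each order is solved by a single substitution, exactly as described above: $x_0 = a$, then $x_1 = x_0^2 = a^2$, then $x_2 = 2 x_0 x_1 = 2a^3$.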