two integral equations


I'm trying to solve the two following integral equations:

  1. $y(x)=2+\int_1^x\frac{1}{ty(t)}\ \mathrm dt$, $x>0$
  2. $y(x)=4+\int_0^x2t\sqrt{y(t)}\ \mathrm dt$

It really looks like an ODE, but I'm a bit clueless about where to start with equations like these. Any hints?


You don't need to worry about additional assumptions on the function $ y $ (like its differentiability), which you asked about in a comment under the original post. Everything that is needed is implicit in the statement of the problem. You need the following two lemmas.

  • The Fundamental Theorem of Lebesgue Integral Calculus: If $ g $ is a locally Lebesgue integrable function and $ f $ is defined by $ f ( x ) = \int _ a ^ x g ( t ) \ \mathrm d t $, then $ f $ is continuous. In fact, $ f $ satisfies a much stronger property called absolute continuity.
  • The First Fundamental Theorem of Calculus: If $ g $ is a continuous function and $ f $ is defined by $ f ( x ) = \int _ a ^ x g ( t ) \ \mathrm d t $, then $ f $ is differentiable and $ f ' ( x ) = g ( x ) $.
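As a quick numerical illustration of the two lemmas (not part of the original answer, just a sanity check): take $ g = \cos $ and $ a = 0 $, so that $ f ( x ) = \int _ 0 ^ x \cos t \ \mathrm d t = \sin x $. Computing $ f $ by simple quadrature, we can check that it matches its closed form and that a finite-difference derivative of $ f $ recovers $ g $, as the second lemma predicts.

```python
import math

def integrate(g, a, b, n=100_000):
    # composite trapezoidal rule; crude but adequate for a sanity check
    h = (b - a) / n
    s = 0.5 * (g(a) + g(b))
    for i in range(1, n):
        s += g(a + i * h)
    return s * h

# f(x) = \int_0^x cos(t) dt; its closed form is sin(x)
def f(x):
    return integrate(math.cos, 0.0, x)

eps = 1e-4
for x in (0.3, 1.0, 2.5):
    # first lemma: f is (absolutely) continuous; here it even matches sin
    assert abs(f(x) - math.sin(x)) < 1e-9
    # second lemma: f'(x) = g(x) = cos(x), checked by a central difference
    fprime = (f(x + eps) - f(x - eps)) / (2 * eps)
    assert abs(fprime - math.cos(x)) < 1e-5
```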

In both given equations, the integrand on the right-hand side must be a locally integrable function (otherwise, the integral is undefined). By the first lemma, the integral, and hence $ y $ itself, must then be continuous. In both cases this in turn makes the integrand continuous, so by the second lemma the integral, and hence $ y $, must be differentiable. Thus, any function $ y $ satisfying either of the given equations is in fact differentiable, and we can safely differentiate both sides of the equation without losing any of the possible solutions. Let's do that for both cases.

  1. Differentiating both sides of the equation $$ y ( x ) = 2 + \int _ 1 ^ x \frac 1 { t y ( t ) } \ \mathrm d t \tag 0 \label 0 $$ we get $$ y ' ( x ) = \frac 1 { x y ( x ) } \text . $$ This can be rearranged to $$ y ' ( x ) y ( x ) = \frac 1 x \text , $$ or equivalently $$ \frac { \mathrm d } { \mathrm d x } \left( \frac 1 2 y ( x ) ^ 2 - \log x \right) = 0 \text . $$ Note that the given domain is $ x > 0 $, and thus $ \log x $ is defined. Consequently, there must be a constant $ c $ such that $$ \frac 1 2 y ( x ) ^ 2 - \log x = c \text . \tag 1 \label 1 $$ In particular, we must have $ c = \frac 1 2 y ( 1 ) ^ 2 $. But by \eqref{0}, we know that $ y ( 1 ) = 2 $, and hence $ c = 2 $. Hence we can rewrite \eqref{1} as $$ y ( x ) ^ 2 = 2 ( \log x + 2 ) \text . $$ But as $ y ( x ) ^ 2 \ge 0 $, this can only happen when $ \log x \ge - 2 $, or equivalently, $ x \ge \exp ( - 2 ) $. Therefore, there is no solution that is defined on all of the given domain ($ x > 0 $). Let's change the domain to $ x \ge \exp ( - 2 ) $. On this domain, we must have $ y ( x ) = \pm \sqrt { 2 ( \log x + 2 ) } $. But as the solution must be continuous and we have $ y ( 1 ) = 2 > 0 $, we must have $ y ( x ) = \sqrt { 2 ( \log x + 2 ) } $ for all $ x \ge \exp ( - 2 ) $. That's because if we have $ y ( x _ - ) < 0 $ at some point $ x _ - $, then by the intermediate value theorem, there must be a point $ x _ 0 $ between $ 1 $ and $ x _ - $ such that $ y ( x _ 0 ) = 0 $, which is impossible since $ y ( x ) = 0 $ can only happen at $ x = \exp ( - 2 ) $. It's straightforward to check that $ y ( x ) = \sqrt { 2 ( \log x + 2 ) } $ satisfies \eqref{0} for $ x \ge \exp ( - 2 ) $, and thus it is a solution, and the only one.
  2. Differentiating both sides of the equation $$ y ( x ) = 4 + \int _ 0 ^ x 2 t \sqrt { y ( t ) } \ \mathrm d t \tag 2 \label 2 $$ we get $$ y ' ( x ) = 2 x \sqrt { y ( x ) } \text . \tag 3 \label 3 $$ By \eqref{2} we have $ y ( 0 ) = 4 > 0 $. Thus, by continuity of $ y $, there is an open interval containing $ 0 $ on which $ y $ takes only positive values. So if we define $$ a = \inf \{ x \in \mathbb R | \forall t \in [ x , 0 ] \ y ( t ) > 0 \} \quad \text {and} \quad b = \sup \{ x \in \mathbb R | \forall t \in [ 0 , x ] \ y ( t ) > 0 \} \text , $$ then we will have $ a < 0 $ (possibly $ a = - \infty $) and $ b > 0 $ (possibly $ b = + \infty $). For $ x \in ( a , b ) $, we can rearrange \eqref{3} to get $$ \frac { y ' ( x ) } { \sqrt { y ( x ) } } = 2 x \text , $$ or equivalently $$ \frac { \mathrm d } { \mathrm d x } \left( 2 \sqrt { y ( x ) } - x ^ 2 \right) = 0 \text . $$ Consequently, there must be a constant $ c $ such that $$ 2 \sqrt { y ( x ) } - x ^ 2 = c \text , $$ and in particular $ c = 2 \sqrt { y ( 0 ) } = 4 $, which yields $$ y ( x ) = \left( \frac 1 2 x ^ 2 + 2 \right) ^ 2 \tag 4 \label 4 $$ for all $ x \in ( a , b ) $. \eqref{4} in particular shows that $ y ( x ) \ge 4 $ for all $ x \in ( a , b ) $, and this helps us show that $ ( a , b ) = ( - \infty , + \infty ) $. To see that, suppose $ a > - \infty $. Since $ y $ is continuous, there is $ \delta \in \mathbb R ^ + $ such that for all $ x \in ( a - \delta , a + \delta ) $ we have $ | y ( x ) - y ( a ) | < 2 $. By definition of $ a $, there is a point $ x _ - $ such that $ a - \delta < x _ - \le a $ and $ y ( x _ - ) \le 0 $. Again, since $ ( a , b ) $ is nonempty, there is a point $ x _ + $ such that $ a < x _ + < \min ( a + \delta , b ) $, and then \eqref{4} gives $ y ( x _ + ) \ge 4 $. Since $ x _ - , x _ + \in ( a - \delta , a + \delta ) $, we have $ | y ( x _ - ) - y ( x _ + ) | \le | y ( x _ - ) - y ( a ) | + | y ( x _ + ) - y ( a ) | < 4 $. 
But as $ y ( x _ - ) \le 0 $ and $ y ( x _ + ) \ge 4 $, this leads to a contradiction, and thus we must have $ a = - \infty $. A similar argument shows that $ b = + \infty $, and hence \eqref{4} must hold on all of $ \mathbb R $. It's straightforward to check that this function indeed satisfies \eqref{2} and thus is a solution, and the only one.
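To double-check the two closed forms derived above (a numerical sketch, not part of the original answer), we can plug each candidate solution back into its integral equation and evaluate the right-hand side with simple quadrature: $ y ( x ) = \sqrt { 2 ( \log x + 2 ) } $ for equation \eqref{0} on $ x \ge \exp ( - 2 ) $, and $ y ( x ) = \left( \frac 1 2 x ^ 2 + 2 \right) ^ 2 $ for equation \eqref{2} on all of $ \mathbb R $.

```python
import math

def integrate(f, a, b, n=100_000):
    # composite trapezoidal rule; also works for b < a (h is then negative)
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        s += f(a + i * h)
    return s * h

# Candidate solution of equation (0): y(x) = sqrt(2 (log x + 2)), x >= exp(-2)
def y1(x):
    return math.sqrt(2 * (math.log(x) + 2))

# Candidate solution of equation (2): y(x) = (x^2 / 2 + 2)^2
def y2(x):
    return (0.5 * x * x + 2) ** 2

# Check y1(x) = 2 + \int_1^x dt / (t y1(t)) at points of [exp(-2), oo)
for x in (0.2, 0.5, 1.0, 3.0, 10.0):
    rhs = 2 + integrate(lambda t: 1 / (t * y1(t)), 1.0, x)
    assert abs(y1(x) - rhs) < 1e-6, (x, y1(x), rhs)

# Check y2(x) = 4 + \int_0^x 2 t sqrt(y2(t)) dt at points of R
for x in (-3.0, -1.0, 0.0, 0.5, 2.0):
    rhs = 4 + integrate(lambda t: 2 * t * math.sqrt(y2(t)), 0.0, x)
    assert abs(y2(x) - rhs) < 1e-6, (x, y2(x), rhs)
```

Both candidates satisfy their equations to within the quadrature tolerance, consistent with the analysis above.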