Asymptotic behavior of the solution to an ODE


Given

$$y(t) = \frac{d_2 y_0 e^{d_2 t/\epsilon}}{d_2-\epsilon \, d_1 y_0 (e^{d_2 t/\epsilon}-1)}$$

I think that $y = O(1/\epsilon)$ as $\epsilon \to 0$. But since this is important for what I am doing and I am not used to asymptotics, I wanted to double-check. Thanks!

Best answer.

(I'm assuming that $\epsilon > 0$.)

If $y_0 > 0$ then $y(t)$ has a singularity at

$$ T(\epsilon) =\frac{\epsilon}{d_2} \log\left(\frac{d_2+\epsilon d_1 y_0}{\epsilon d_1 y_0}\right). $$

For any fixed $t > 0$ it will eventually be true that $T(\epsilon) < t$ for $\epsilon$ small enough. That is, each fixed $t > 0$ will eventually leave the maximal interval of existence as $\epsilon \to 0^+$. If your application requires that you only consider $t$ in the maximal interval of existence then the asymptotics for fixed $t$ will be useless to you.

Note that there is no such singularity if $y_0 < 0$.
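As a quick numerical sanity check of the blow-up time, the sketch below (with illustrative values $d_1 = 1$, $d_2 = 2$, $y_0 = 0.5$, which are assumptions and do not come from the question) verifies that the denominator of $y(t)$ vanishes at $t = T(\epsilon)$, and that $T(\epsilon) \to 0$ as $\epsilon \to 0^+$:

```python
import math

# Illustrative parameter values (assumed d1, d2, y0 > 0).
d1, d2, y0 = 1.0, 2.0, 0.5

for eps in (1e-1, 1e-2, 1e-3):
    # Blow-up time T(eps) from the formula above.
    T = (eps / d2) * math.log((d2 + eps * d1 * y0) / (eps * d1 * y0))
    # Denominator of y(t) at t = T(eps); it should vanish (up to roundoff).
    denom = d2 - eps * d1 * y0 * (math.exp(d2 * T / eps) - 1)
    print(eps, T, denom)
```

The printed values of $T(\epsilon)$ shrink with $\epsilon$, which is exactly why a fixed $t > 0$ eventually falls outside the maximal interval of existence.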

Asymptotics for fixed $t > 0$.

We have

$$ \begin{align} y(t) &= \frac{d_2 y_0 e^{d_2 t/\epsilon}}{d_2-\epsilon \, d_1 y_0 (e^{d_2 t/\epsilon}-1)} \\ &= \frac{d_2}{\epsilon d_1} \cdot \frac{1}{\epsilon^{-1} d_1^{-1} d_2 y_0^{-1} e^{-d_2 t/\epsilon} - 1 + e^{-d_2 t/\epsilon}} \\ &\sim -\frac{d_2}{\epsilon d_1} \end{align} $$

as $\epsilon \to 0^+$ since

$$ \lim_{\epsilon \to 0^+} \frac{1}{\epsilon^{-1} d_1^{-1} d_2 y_0^{-1} e^{-d_2 t/\epsilon} - 1 + e^{-d_2 t/\epsilon}} = -1. $$
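This limit can be checked numerically. The sketch below (illustrative values again; $y_0 < 0$ is chosen so that a fixed $t$ stays in the interval of existence) evaluates $y(t)$ in the rewritten form above, which avoids overflowing $e^{d_2 t/\epsilon}$, and confirms that $\epsilon\, y(t) \to -d_2/d_1$:

```python
import math

# Illustrative values (assumed d1, d2 > 0); fixed t > 0, y0 < 0.
d1, d2, y0, t = 1.0, 2.0, -0.5, 1.0

for eps in (1e-1, 1e-2, 1e-3):
    Em = math.exp(-d2 * t / eps)  # e^{-d2 t/eps}; underflows harmlessly to 0
    denom = (d2 / (eps * d1 * y0)) * Em - 1.0 + Em
    y = (d2 / (eps * d1)) / denom
    print(eps, eps * y)  # approaches -d2/d1 = -2.0
```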

Asymptotics in a subinterval of $[0,T(\epsilon))$ when $y_0 > 0$.

The singularity at $t=T(\epsilon)$ precludes the existence of an $O(f(\epsilon))$ bound which holds uniformly with respect to $t$ in the interval $[0,T(\epsilon))$, so if you're interested in such a bound then you must restrict $t$ to some proper subinterval of $[0,T(\epsilon))$.

One way to do this is to fix some $0 < \alpha < 1$ and consider $t \in [0,\alpha T(\epsilon)]$.

Since $y$ is increasing on $[0,T(\epsilon))$ we have

$$ y_0 \leq y(t) \leq y(\alpha T(\epsilon)) = \frac{d_2}{(\epsilon d_1 y_0/d_2)^{\alpha}(d_1 \epsilon+d_2/y_0)(1+\epsilon d_1 y_0/d_2)^{-\alpha} - d_1 \epsilon} $$

(here $e^{d_2 \alpha T(\epsilon)/\epsilon} = \left(\frac{d_2+\epsilon d_1 y_0}{\epsilon d_1 y_0}\right)^{\alpha}$; as a sanity check, at $\alpha = 0$ the right-hand side reduces to $y_0$, and its denominator vanishes as $\alpha \to 1$).

Since $\alpha < 1$ we have

$$ \frac{d_2}{(\epsilon d_1 y_0/d_2)^{\alpha}(d_1 \epsilon+d_2/y_0)(1+\epsilon d_1 y_0/d_2)^{-\alpha} - d_1 \epsilon} \sim y_0\left(\frac{d_2}{\epsilon d_1 y_0}\right)^{\alpha} = O(\epsilon^{-\alpha}), $$

so that $y(t) = O(\epsilon^{-\alpha})$ as $\epsilon \to 0^+$ uniformly for $t \in [0,\alpha T(\epsilon)]$.
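The scaling can be observed numerically. The sketch below (illustrative values $d_1 = 1$, $d_2 = 2$, $y_0 = 0.5$, $\alpha = 1/2$, all assumptions) evaluates $y(\alpha T(\epsilon))$ directly from the original formula for $y(t)$ and checks that $\epsilon^{\alpha}\, y(\alpha T(\epsilon))$ settles down to a constant:

```python
# Illustrative values (assumed d1, d2, y0 > 0) and a fixed alpha in (0, 1).
d1, d2, y0, alpha = 1.0, 2.0, 0.5, 0.5

for eps in (1e-2, 1e-4, 1e-6):
    A = (d2 + eps * d1 * y0) / (eps * d1 * y0)  # e^{d2 T(eps)/eps}
    E = A ** alpha                              # e^{d2 alpha T(eps)/eps}
    y = d2 * y0 * E / (d2 - eps * d1 * y0 * (E - 1))
    scaled = eps ** alpha * y
    print(eps, y, scaled)  # scaled settles near y0 * (d2/(d1*y0))**alpha
```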

Uniform asymptotics when $y_0 < 0$.

The function $y(t)$ will be decreasing when $\epsilon < -\frac{d_2}{d_1 y_0}$, and since we're taking $\epsilon \to 0^+$ we'll assume that this is indeed the case. As such we have

$$ y_0 \geq y(t) \geq y(+\infty) = - \frac{d_2}{\epsilon d_1} $$

so that $y(t) = O(\epsilon^{-1})$ uniformly for $t>0$.
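The two-sided bound can also be checked numerically. The sketch below (illustrative values with $y_0 < 0$, all assumptions) evaluates $y(t)$ in an overflow-free form across several $t$ and verifies $-d_2/(\epsilon d_1) \leq y(t) \leq y_0$:

```python
import math

# Illustrative values with y0 < 0 (assumed d1, d2 > 0).
d1, d2, y0, eps = 1.0, 2.0, -0.5, 1e-2
assert eps < -d2 / (d1 * y0)  # monotonicity condition from above

floor = -d2 / (eps * d1)  # the limit y(+infinity)
for t in (0.0, 0.01, 0.1, 1.0, 10.0):
    Em = math.exp(-d2 * t / eps)  # stable rewriting avoids overflow
    y = d2 * y0 / (d2 * Em - eps * d1 * y0 * (1 - Em))
    print(t, y)
    assert floor <= y <= y0
```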

In summary,

  • For each fixed $t \geq 0$ we have $y(t) = O(\epsilon^{-1})$ as $\epsilon \to 0^+$.

  • If $y_0 > 0$ and $0 <\alpha < 1$ then $y(t) = O(\epsilon^{-\alpha})$ as $\epsilon \to 0^+$ uniformly with respect to $t$ on $[0,\alpha T(\epsilon)]$.

  • If $y_0 < 0$ then $y(t) = O(\epsilon^{-1})$ as $\epsilon \to 0^+$ uniformly with respect to $t$ on $\mathbb R^+$.