Solution of an initial value problem using the Laplace transform


Let $g(t)=0$ for $0\leq t<1$ and $g(t)=t^2-1$ for $t\geq 1$. Find the solution of the initial value problem $y''+2y'+3y=g(t)$ with $y(0)=0$, $y'(0)=1$, using the Laplace transform.

What I tried: taking the Laplace transform on both sides,

$\displaystyle L(y'')+2L(y')+3L(y)=L(g(t))$

$s^2Y(s)-sy(0)-y'(0)+2[sY(s)-y(0)]+3Y(s)=L(g(t))$

$s^2Y(s)-1+2sY(s)+3Y(s)=L(g(t))$

$$\Longrightarrow Y(s)=\frac{1+L(g(t))}{s^2+2s+3}$$
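(Not in the original post, but the algebra up to this point can be sanity-checked symbolically. This is a minimal sketch using SymPy, with $G$ standing in for the as-yet-unknown $L(g(t))$:)

```python
import sympy as sp

s, Y, G = sp.symbols('s Y G')  # Y stands for L(y), G for L(g(t))

# Transformed equation with y(0)=0, y'(0)=1 substituted:
# s^2 Y - 1 + 2 s Y + 3 Y = G
eq = sp.Eq(s**2*Y - 1 + 2*s*Y + 3*Y, G)
sol = sp.solve(eq, Y)[0]
print(sol)  # (G + 1)/(s**2 + 2*s + 3)
```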

$\bullet$ If $g(t)=0$ for $0\leq t<1$. Then $L(g(t))=0$

Then $$Y(s)=\frac{1}{s^2+2s+3}=\frac{1}{(s+1)^2+2}$$

$\bullet$ If $g(t)=t^2-1$. Then $\displaystyle L(g(t))=\frac{2}{s^3}-\frac{1}{s}$

Then $$Y(s)=\frac{s^3-s^2+2}{s^3(s^2+2s+3)}$$

Is my process right? If not, please explain how to solve it. Thanks.

Best answer:

What you did on the left-hand side of the differential equation looks good to me, but you made some mistakes on the right-hand side. Why not simply split the integral defining the transform?
$$\mathcal{L}(g(t))=\int_0^{\infty} g(t)e^{-st}\,dt$$
$$\mathcal{L}(g(t))=\int_0^{1} g(t)e^{-st}\,dt+\int_1^{\infty} g(t)e^{-st}\,dt$$
$$\mathcal{L}(g(t))=\int_0^{1} 0 \times e^{-st}\,dt+\int_1^{\infty} (t^2-1)e^{-st}\,dt$$
$$\mathcal{L}(g(t))=\int_1^{\infty} (t^2-1)e^{-st}\,dt$$
Then substitute $u=t-1$:
$$\mathcal{L}(g(t))=\int_0^{\infty} (u^2+2u)e^{-s(u+1)}\,du$$
$$\mathcal{L}(g(t))=e^{-s}\int_0^{\infty} (u^2+2u)e^{-su}\,du$$
$$\mathcal{L}(g(t))=e^{-s}\,\mathcal{L}\left(u^2+2u\right)$$
Finally:
$$\mathcal{L}(g(t))=e^{-s}\left(\dfrac{2}{s^3}+\dfrac{2}{s^2}\right)=\dfrac{2e^{-s}}{s^2}\left(\dfrac{1}{s}+1\right)$$
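(As an aside, not in the original answer: SymPy can evaluate the same integral $\int_1^{\infty}(t^2-1)e^{-st}\,dt$ directly and confirm the closed form above:)

```python
import sympy as sp

t = sp.symbols('t', positive=True)
s = sp.symbols('s', positive=True)

# L(g) as the integral from the split above: int_1^oo (t^2 - 1) e^{-st} dt
Lg = sp.integrate((t**2 - 1)*sp.exp(-s*t), (t, 1, sp.oo))

expected = 2*sp.exp(-s)*(1/s**3 + 1/s**2)
print(sp.simplify(Lg - expected))  # 0
```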


You can also use the Heaviside step function to write $g(t)$ as: $$ \begin{align} g(t)&=(t^2-1)u(t-1) \\ g(t)&=((t-1+1)^2-1)u(t-1) \\ g(t)&=((t-1)^2+2(t-1))u(t-1) \\ g(t)&=(t-1)^2u(t-1)+2(t-1)u(t-1) \end{align} $$ Then apply the shift theorem: $$\mathcal {L}(u(t-a)f(t-a))=e^{-as}F(s)$$ So that we have: $$\mathcal {L}(g(t))=2e^{-s}\left ( \dfrac 1 {s^3}+\dfrac 1 {s^2} \right)$$
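(A numerical spot check of the shift theorem, not part of the original answer: at a sample point $s=2$, the two shifted terms integrate to $e^{-s}\cdot 2/s^3$ and $e^{-s}\cdot 2/s^2$ respectively, matching the result above.)

```python
import numpy as np
from scipy.integrate import quad

s = 2.0  # arbitrary sample point for the check

# L((t-1)^2 u(t-1)) and L(2(t-1) u(t-1)): the Heaviside factor just
# changes the lower limit of integration from 0 to 1.
num1, _ = quad(lambda t: (t - 1)**2 * np.exp(-s*t), 1, np.inf)
num2, _ = quad(lambda t: 2*(t - 1) * np.exp(-s*t), 1, np.inf)

print(np.isclose(num1, np.exp(-s) * 2/s**3))  # True
print(np.isclose(num2, np.exp(-s) * 2/s**2))  # True
```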