Solving a system of differential equations with the eigenvector but no eigenvalues?


Solving a system of differential equations by substitution

I originally asked for help with this question which ended up just being me missing a simple identity. I've moved on and am stuck again, so hopefully you all can be as helpful as you were previously.

Define the vectors $\vec y = [y_1, y_2]$, $\vec y' = \left[ \frac{dy_1}{dt}, \frac{dy_2}{dt}\right]$, and the coefficient matrix $A = \begin{bmatrix} a & -b\\ b & a\end{bmatrix}$.

Then $\vec y' = A\vec y$ defines a system of linear homogeneous differential equations. Note that $A$ has complex eigenvalues.

Hint: To solve the system of equations, write $z = y_1 + iy_2$ and calculate $z'$. For some $u$ defined in terms of $a$ and $b$ you should find that $z' = uz$, which is easily solved. Now, write the solution in terms of the original variables $y_1$ and $y_2$. You will need to use the identity $e^{itb} = \cos tb + i\sin tb$.

  1. Solve the system of equations
  2. Locate all equilibrium points, and provide a stability analysis
  3. Given your answers to 1. and 2., sketch the phase plane for this system in $\mathbb{R}^2$.
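As a quick numerical sanity check on the claim that $A$ has complex eigenvalues (a sketch assuming `numpy` is available; the values $a = 1$, $b = 2$ are arbitrary samples, not from the problem):

```python
# Sanity check: for sample values a = 1, b = 2, the eigenvalues of
# A = [[a, -b], [b, a]] should come out as the complex pair a ± ib.
import numpy as np

a, b = 1.0, 2.0
A = np.array([[a, -b],
              [b, a]])

eigvals = np.linalg.eigvals(A)

# numpy does not guarantee an ordering, so check membership in the pair.
assert any(abs(ev - complex(a, b)) < 1e-12 for ev in eigvals)
assert any(abs(ev - complex(a, -b)) < 1e-12 for ev in eigvals)
```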

I've since found that $z' = y_1' + iy_2' = (ay_1 - by_2) + i(by_1 + ay_2) = (a + ib)(y_1 + iy_2) = uz$, so $u = a + ib$. The general solution is $z = Ce^{(a+ib)t}$; taking $C = 1$, we can say that $z = e^{(a+ib)t} = e^{at}e^{ibt} = e^{at}\cos(bt) + ie^{at}\sin(bt)$

From this we can go back to our original vector $\vec{y}$ by taking real and imaginary parts: $\vec{y} = \begin{bmatrix} e^{at}\cos(bt)\\ e^{at}\sin(bt) \end{bmatrix}$
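This candidate solution can be checked directly against $\vec y' = A\vec y$ (a sketch assuming `numpy`; $a = 1$, $b = 2$ are arbitrary sample values, with the derivative taken by the product rule):

```python
# Verify that y(t) = e^{at} [cos(bt), sin(bt)] satisfies y' = A y
# at a handful of sample times.
import numpy as np

a, b = 1.0, 2.0
A = np.array([[a, -b], [b, a]])

max_residual = 0.0
for t in np.linspace(0.0, 2.0, 9):
    y = np.exp(a * t) * np.array([np.cos(b * t), np.sin(b * t)])
    # Exact derivative of each component, via the product rule:
    dy = np.exp(a * t) * np.array([a * np.cos(b * t) - b * np.sin(b * t),
                                   a * np.sin(b * t) + b * np.cos(b * t)])
    max_residual = max(max_residual, np.max(np.abs(dy - A @ y)))

assert max_residual < 1e-12
```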

Of course we now want to solve the system, so I observe that for a system of differential equations, solutions take the form:

$\vec{x}(t) = \vec{\eta}e^{\lambda t}$

As such I feel that we have:

$\vec{y} = \begin{bmatrix} \cos(bt)\\ \sin(bt) \end{bmatrix}e^{at}$

I'm concerned that here I am making a mistake or am getting myself confused. How do I find the eigenvalues for this system?

If I pursue this thread, I end up with the typical steps for finding eigenvalues and eigenvectors, where:

$\det(A-\lambda I) = 0$

But above I have that $a$ is $\lambda$, so I end up evaluating the characteristic determinant as:

$\begin{vmatrix} a-\lambda & -b\\ b & a - \lambda \end{vmatrix} = \begin{vmatrix} a-a & -b\\ b & a - a \end{vmatrix} = \begin{vmatrix} 0 & -b\\ b & 0 \end{vmatrix} = b^2 = 0$
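For comparison, expanding the determinant without substituting $\lambda = a$ gives $(a-\lambda)^2 + b^2$, whose roots are $a \pm ib$. A symbolic cross-check (a sketch assuming `sympy` is available; $b$ is declared positive only to keep the square root tidy):

```python
# Cross-check the characteristic polynomial of A = [[a, -b], [b, a]]
# without presupposing lambda = a.
import sympy as sp

a = sp.symbols('a', real=True)
b = sp.symbols('b', positive=True)  # positive just so sqrt(-b**2) = I*b
lam = sp.symbols('lambda')

A = sp.Matrix([[a, -b], [b, a]])
char_poly = (A - lam * sp.eye(2)).det()

# det(A - lambda*I) = (a - lambda)^2 + b^2, not b^2:
assert sp.simplify(char_poly - ((a - lam) ** 2 + b ** 2)) == 0

roots = sp.solve(sp.Eq(char_poly, 0), lam)  # the eigenvalues a ± i*b
assert any(sp.simplify(r - (a + sp.I * b)) == 0 for r in roots)
assert any(sp.simplify(r - (a - sp.I * b)) == 0 for r in roots)
```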

Now I'm spinning around and around and the anxiety has me frozen. Any advice at this point would again be greatly appreciated!

EDIT: Okay - so in the case where $u = a + ib$ and its conjugate $\bar{u} = a - ib$ are my eigenvalues, I have (taking $\lambda = \bar{u}$) that:

$\begin{bmatrix} a - (a-ib) & -b\\ b & a-(a-ib) \end{bmatrix} \begin{bmatrix} \eta_1\\ \eta_2 \end{bmatrix} = \begin{bmatrix} 0\\0\end{bmatrix}$

This gives:

$ib\eta_1 - b\eta_2 = 0$ and $b\eta_1+ib\eta_2 = 0$

which gives that $\vec{\eta}^{(1)} = \begin{bmatrix} -i\\1\end{bmatrix}$ for $\lambda = a - ib$, and similarly $\vec{\eta}^{(2)} = \begin{bmatrix} i\\1\end{bmatrix}$ for $\lambda = a + ib$
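A quick numerical check of these eigenpairs (a sketch assuming `numpy`; $a = 1$, $b = 2$ are arbitrary sample values):

```python
# Check that eta1 = [-i, 1] is an eigenvector of A for lambda = a - ib,
# and its conjugate eta2 = [i, 1] an eigenvector for lambda = a + ib.
import numpy as np

a, b = 1.0, 2.0
A = np.array([[a, -b], [b, a]], dtype=complex)

eta1 = np.array([-1j, 1.0])
eta2 = np.array([1j, 1.0])

err1 = np.max(np.abs(A @ eta1 - (a - 1j * b) * eta1))
err2 = np.max(np.abs(A @ eta2 - (a + 1j * b) * eta2))
assert err1 < 1e-12 and err2 < 1e-12
```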


There is 1 answer below.


The eigenvalue decomposition serves to decouple the system into scalar equations, real or complex. The eigenvalues are then the coefficients of these scalar equations.

Here, from the start, you have a transformation into a complex scalar equation; its complex conjugate is likewise a scalar transformation of the system. The scalar coefficients $a \pm ib$ are then the eigenvalues.
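To illustrate this point numerically (a sketch assuming `numpy` and `scipy` are available; $a$, $b$, and the initial condition are arbitrary sample values): the solution of the $2 \times 2$ system via the matrix exponential matches the real and imaginary parts of the scalar solution $z(t) = z(0)e^{(a+ib)t}$.

```python
# Compare the matrix solution y(t) = expm(A t) y(0) with the scalar
# solution z(t) = z(0) exp((a + ib) t), where z = y1 + i*y2.
import numpy as np
from scipy.linalg import expm

a, b = 0.5, 2.0
A = np.array([[a, -b], [b, a]])
y0 = np.array([1.0, -0.3])
z0 = complex(y0[0], y0[1])

max_err = 0.0
for t in np.linspace(0.0, 3.0, 7):
    y_matrix = expm(A * t) @ y0           # 2x2 system solution
    z = z0 * np.exp((a + 1j * b) * t)     # complex scalar solution
    y_scalar = np.array([z.real, z.imag])
    max_err = max(max_err, np.max(np.abs(y_matrix - y_scalar)))

assert max_err < 1e-10
```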