Difficulty showing that the general solution of $Y^{\prime} = AY$ is the one given.


Suppose $A$ is a $2\times 2$ matrix with the complex eigenvalues $\lambda = \alpha \pm i \beta$.

  • I already showed that $A = V \begin{pmatrix} \alpha & \beta \\ - \beta & \alpha \end{pmatrix} V^{-1}$, where $V$ is an invertible matrix with columns $\mathbf{v}_{1}$ and $\mathbf{v}_{2}$. Then, I showed that $\mathbf{v}_{1} + i\mathbf{v}_{2}$ is an eigenvector of $A$ corresponding to the eigenvalue $\lambda = \alpha + i \beta$.
  • I then went on to show that $e^{At}=e^{\alpha t}V \begin{pmatrix} \cos \beta t & \sin \beta t \\ -\sin \beta t & \cos \beta t \end{pmatrix} V^{-1}$
  • Now, I need to use the results above to show that the general solution of the system $$Y^{\prime}=AY$$ is $$Y_{g} = c_{1}Y_{1} + c_{2}Y_{2}$$ for arbitrary constants $c_{1}$ and $c_{2}$.
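(As a quick sanity check of that last identity — not part of the proof — it can be verified numerically. The particular $\alpha$, $\beta$, $t$, and invertible $V$ below are arbitrary choices of mine:)

```python
import numpy as np
from scipy.linalg import expm

alpha, beta, t = 0.5, 2.0, 0.7          # arbitrary test values
V = np.array([[1.0, 1.0], [0.0, 1.0]])  # any invertible V will do
C = np.array([[alpha, beta], [-beta, alpha]])
A = V @ C @ np.linalg.inv(V)            # A = V C V^{-1}

# rotation-times-growth form of e^{At}
R = np.array([[np.cos(beta * t),  np.sin(beta * t)],
              [-np.sin(beta * t), np.cos(beta * t)]])
lhs = expm(A * t)
rhs = np.exp(alpha * t) * V @ R @ np.linalg.inv(V)
print(np.allclose(lhs, rhs))  # → True
```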

Additional information that I am given is that

  1. $Y_{1} = v_{1} \cos \beta t - v_{2}\sin \beta t = Re(z)$.
  2. $Y_{2}= v_{1}\sin \beta t + v_{2} \cos \beta t = Im(z)$.
  3. $z = (v_{1} + iv_{2})e^{(\alpha + i\beta)t}$ is a complex valued solution of $Y^{\prime}=AY$

To this end, I simplified the expression for $z$ using Euler's formula to give me $$z = e^{\alpha t}\left[ \cos \beta t + i \sin \beta t\right](v_{1} + iv_{2}) \\ = e^{\alpha t} \left[ v_{1} \cos \beta t - v_{2} \sin \beta t\right] + i e^{\alpha t} \left[ v_{1} \sin \beta t + v_{2} \cos \beta t \right] $$

So, $Y_{1}$ and $Y_{2}$ appear to actually be $$Y_{1} = e^{\alpha t} \left[ v_{1} \cos \beta t - v_{2} \sin \beta t \right] = Re(z) $$ and $$ Y_{2} = e^{\alpha t}\left[ v_{1} \sin \beta t + v_{2} \cos \beta t\right] = Im(z) $$ rather than simply the $Y_{1} = v_{1} \cos \beta t - v_{2}\sin \beta t = Re(z)$ and $Y_{2}= v_{1}\sin \beta t + v_{2} \cos \beta t = Im(z)$ they are mentioned to be above.
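As a sanity check that the $e^{\alpha t}$ factor really belongs there, $\Re(z)$ and $\Im(z)$ can be compared against these corrected formulas numerically (the particular $\alpha$, $\beta$, $t$, $v_{1}$, $v_{2}$ below are arbitrary choices of mine):

```python
import numpy as np

alpha, beta, t = 0.5, 2.0, 0.7   # arbitrary test values
v1 = np.array([1.0, 0.0])        # columns of some invertible V
v2 = np.array([1.0, 1.0])

# z = (v1 + i v2) e^{(alpha + i beta) t}
z = (v1 + 1j * v2) * np.exp((alpha + 1j * beta) * t)

# the corrected Y1, Y2 (with the e^{alpha t} factor)
Y1 = np.exp(alpha * t) * (v1 * np.cos(beta * t) - v2 * np.sin(beta * t))
Y2 = np.exp(alpha * t) * (v1 * np.sin(beta * t) + v2 * np.cos(beta * t))
print(np.allclose(z.real, Y1), np.allclose(z.imag, Y2))  # → True True
```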

Anyway, in order to show that the general solution is given by $Y_{g}=c_{1}Y_{1} + c_{2}Y_{2}$, we must show that both $Y_{1}$ and $Y_{2}$ satisfy the differential equation $Y^{\prime} = AY$, and that $Y_{1}$ and $Y_{2}$ are linearly independent.

So far, I'm just trying to show that $Y_{1}$ satisfies the equation, as in $Y_{1}^{\prime} = A Y_{1}$.

The left hand side isn't too hard (please check for mistakes!):

$$Y_{1} ^{\prime} = \alpha e^{\alpha t} \left( v_{1} \cos \beta t - v_{2} \sin \beta t\right) + e^{\alpha t} \left( - \beta v_{1} \sin \beta t - \beta v_{2} \cos \beta t\right) $$

Now, the REALLY hard part is getting $A Y_{1}$ to equal this left hand side.

I'm assuming that I have to apply the result that $A = V \begin{pmatrix} \alpha & \beta \\ -\beta & \alpha \end{pmatrix} V^{-1} = \begin{pmatrix} v_{1} & v_{2} \end{pmatrix} \begin{pmatrix} \alpha & \beta \\ -\beta & \alpha \end{pmatrix}\begin{pmatrix}v_{1} & v_{2} \end{pmatrix}^{-1} = \begin{pmatrix} v_{11} & v_{12} \\ v_{21} & v_{22} \end{pmatrix} \begin{pmatrix} \alpha & \beta \\ -\beta & \alpha \end{pmatrix} \begin{pmatrix}\frac{v_{22}}{v_{11}v_{22} - v_{12}v_{21}} & \frac{-v_{12}}{v_{11}v_{22} - v_{12}v_{21}} \\ \frac{-v_{21}}{v_{11}v_{22} - v_{12}v_{21}} & \frac{v_{11}}{v_{11}v_{22} - v_{12}v_{21}} \end{pmatrix}$

But, multiplying out $$A\left[ e^{\alpha t}\left( v_{1}\cos \beta t - v_{2} \sin \beta t\right) \right] = \begin{pmatrix} v_{11} & v_{12} \\ v_{21} & v_{22} \end{pmatrix} \begin{pmatrix} \alpha & \beta \\ -\beta & \alpha \end{pmatrix} \begin{pmatrix}\frac{v_{22}}{v_{11}v_{22} - v_{12}v_{21}} & \frac{-v_{12}}{v_{11}v_{22} - v_{12}v_{21}} \\ \frac{-v_{21}}{v_{11}v_{22} - v_{12}v_{21}} & \frac{v_{11}}{v_{11}v_{22} - v_{12}v_{21}} \end{pmatrix} \left[ e^{\alpha t}\left( v_{1}\cos \beta t - v_{2} \sin \beta t\right) \right] $$

is extremely difficult. I keep making mistakes, and have already torn four pages out of my notebook after needing to go back to an earlier point in my simplification. Plus, with that inverse of $V$ in there, with the reciprocal of the determinant, I have no idea how I am going to get it to look like my left hand side, $Y_{1}^{\prime}$.

And this is just for $Y_{1}$! I can imagine that doing this for $Y_{2}$ is just as nightmarish.

My only other thought is that perhaps I am approaching this all wrong. Is there anything I can do to make verification of $Y_{1}^{\prime} = AY_{1}$ a less arduous task? Is the way through not the brute force, "multiplying-it-out" way I'm taking, but rather something more efficient and clever? If so, could you please let me know what it is?

If not, could you please assist somewhat in these calculations? At least with some benchmarks so I can figure out if I'm going off course as I do them?
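One possible benchmark, if a computer algebra system is available: the whole verification $Y_{1}^{\prime} = AY_{1}$ and $Y_{2}^{\prime} = AY_{2}$ can be checked symbolically for a fully generic invertible $V$, which at least confirms the target identities are right before grinding through them by hand. A sketch with sympy (the symbol names are my own choices):

```python
import sympy as sp

a, b, t = sp.symbols('alpha beta t', real=True)
v11, v12, v21, v22 = sp.symbols('v11 v12 v21 v22', real=True)

V = sp.Matrix([[v11, v12], [v21, v22]])   # generic invertible V
C = sp.Matrix([[a, b], [-b, a]])
A = V * C * V.inv()                       # A = V C V^{-1}

v1, v2 = V[:, 0], V[:, 1]
Y1 = sp.exp(a * t) * (v1 * sp.cos(b * t) - v2 * sp.sin(b * t))
Y2 = sp.exp(a * t) * (v1 * sp.sin(b * t) + v2 * sp.cos(b * t))

res1 = sp.simplify(Y1.diff(t) - A * Y1)   # should simplify to the zero vector
res2 = sp.simplify(Y2.diff(t) - A * Y2)
print(res1.is_zero_matrix, res2.is_zero_matrix)
```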

Also, how would I go about showing that $Y_{1}$ and $Y_{2}$ are linearly independent, short of setting up a matrix, row-reducing, and introducing a whole bunch of other calculations?

Thank you for your time and patience!

On BEST ANSWER

Since $\mathbf v_1+i\mathbf v_2$ is an eigenvector of $A$, so is $\mathbf z$, therefore $$A\mathbf z=(\alpha+i\beta)\mathbf z = e^{\alpha t}(\cos\beta t+i\sin\beta t)(\alpha+i\beta)(\mathbf v_1+i\mathbf v_2).$$ Also, because $A$ has real entries, for all $\mathbf v$, $A(\Re\mathbf v)=\Re(A\mathbf v)$ and $A(\Im\mathbf v)=\Im(A\mathbf v)$, which means that $$AY_1=\Re(A\mathbf z) = e^{\alpha t}(\alpha\mathbf v_1\cos\beta t - \beta\mathbf v_2\cos\beta t - \beta\mathbf v_1\sin\beta t - \alpha\mathbf v_2\sin\beta t) = \alpha Y_1-\beta Y_2$$ and $$AY_2 = \Im(A\mathbf z) = e^{\alpha t}(\alpha\mathbf v_2\cos\beta t + \beta\mathbf v_1\cos\beta t + \alpha\mathbf v_1\sin\beta t - \beta\mathbf v_2\sin\beta t) = \beta Y_1+\alpha Y_2.$$
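The two identities $AY_1=\alpha Y_1-\beta Y_2$ and $AY_2=\beta Y_1+\alpha Y_2$ can also be double-checked numerically; the particular $\alpha$, $\beta$, $t$, and $V$ below are arbitrary:

```python
import numpy as np

alpha, beta, t = 0.5, 2.0, 0.7
V = np.array([[1.0, 1.0], [0.0, 1.0]])        # any invertible V
C = np.array([[alpha, beta], [-beta, alpha]])
A = V @ C @ np.linalg.inv(V)
v1, v2 = V[:, 0], V[:, 1]

Y1 = np.exp(alpha * t) * (v1 * np.cos(beta * t) - v2 * np.sin(beta * t))
Y2 = np.exp(alpha * t) * (v1 * np.sin(beta * t) + v2 * np.cos(beta * t))

print(np.allclose(A @ Y1, alpha * Y1 - beta * Y2))  # → True
print(np.allclose(A @ Y2, beta * Y1 + alpha * Y2))  # → True
```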

On the other hand, $$Y_1' = e^{\alpha t}(-\alpha\mathbf v_2\sin\beta t + \alpha\mathbf v_1\cos\beta t - \beta\mathbf v_1\sin\beta t - \beta\mathbf v_2\cos\beta t) = \alpha Y_1-\beta Y_2$$ and similarly $$Y_2' = e^{\alpha t}(\alpha\mathbf v_1\sin\beta t + \alpha\mathbf v_2\cos\beta t - \beta\mathbf v_2\sin\beta t + \beta\mathbf v_1\cos\beta t) = \beta Y_1+\alpha Y_2.$$

Doing the matrix multiplications directly is feasible, too. You can simplify your calculations quite a bit by remembering that $V^{-1}\mathbf v_1=(1,0)^T$ and $V^{-1}\mathbf v_2=(0,1)^T$, so that $V^{-1}(c\mathbf v_1+d\mathbf v_2)=(c,d)^T$. Furthermore, the columns of the product of two matrices are linear combinations of the columns of the left-hand matrix, so you should be able to find $V$ times any vector without resorting to expanding $V$ into individual entries. So, writing $C=\begin{pmatrix}\alpha&\beta\\-\beta&\alpha\end{pmatrix}$, $$AY_1 = e^{\alpha t}VCV^{-1}(\mathbf v_1\cos\beta t-\mathbf v_2\sin\beta t) = e^{\alpha t}VC\begin{bmatrix}\cos\beta t \\ -\sin\beta t\end{bmatrix} = e^{\alpha t}V\begin{bmatrix}\alpha\cos\beta t - \beta\sin\beta t \\ -\beta\cos\beta t - \alpha\sin\beta t\end{bmatrix} = e^{\alpha t}(\alpha\cos\beta t - \beta\sin\beta t)\mathbf v_1 - e^{\alpha t}(\beta\cos\beta t + \alpha\sin\beta t)\mathbf v_2 = \alpha Y_1-\beta Y_2$$ as before.

To show the linear independence of $Y_1$ and $Y_2$, it suffices to show that neither is a scalar multiple of the other. The linear independence of $\mathbf v_1$ and $\mathbf v_2$ and the observation that $\Re(\mathbf z)$ and $\Im(\mathbf z)$ are both linear combinations of $\mathbf v_1$ and $\mathbf v_2$ might come in handy for this.
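In fact, a slightly stronger check can be done symbolically (a sketch with sympy; the symbol names are my own choices): the determinant of the matrix with columns $Y_1$ and $Y_2$ works out to $e^{2\alpha t}\det V$, which never vanishes when $V$ is invertible, so $Y_1$ and $Y_2$ are linearly independent for every $t$.

```python
import sympy as sp

a, b, t = sp.symbols('alpha beta t', real=True)
v11, v12, v21, v22 = sp.symbols('v11 v12 v21 v22', real=True)
V = sp.Matrix([[v11, v12], [v21, v22]])   # generic V
v1, v2 = V[:, 0], V[:, 1]

Y1 = sp.exp(a * t) * (v1 * sp.cos(b * t) - v2 * sp.sin(b * t))
Y2 = sp.exp(a * t) * (v1 * sp.sin(b * t) + v2 * sp.cos(b * t))

# Wronskian-style determinant of [Y1 | Y2]
W = sp.simplify(sp.Matrix.hstack(Y1, Y2).det())
print(W)  # equals exp(2*alpha*t)*(v11*v22 - v12*v21), i.e. e^{2 alpha t} det V
```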