General solution to a system of ODEs


I came across the following formula for the general solution of a system of first-order ODEs of the form $x' = Ax$:

$$x(t) = \sum_{i=1}^{N} \sum_{j=1}^{m} b_{i,j}\, e^{\lambda_{i}t}\left(\sum_{k=0}^{m-1} \frac{t^{k}}{k!}(A-\lambda_{i})^{k}\right)v_{i,j},$$

where $N$ is the number of distinct eigenvalues and $m$ is the algebraic multiplicity of each eigenvalue.

It looks odd to me that every sum over $k$ has the same length. Doesn't that imply that every chain, and thus every Jordan block, is the same size?

Best answer

They're not the same length, because $m$ is not truly a constant. You said it yourself: $m$ is the multiplicity of a given eigenvalue $\lambda_i$, so its value is a function of $i$ (or of $\lambda_i$ depending on your perspective). For this reason it should be written as $m_i$ or $m(\lambda_i)$.

Edit: As you note, it is possible to make the sums over $k$ shorter by choosing the $v_{i,j}$ carefully (i.e. choosing elements of $\ker(A-\lambda_i)$ whenever possible, then elements of $\ker((A-\lambda_i)^2)$, and so forth). It's not strictly necessary, however, and this form of the solution avoids getting into that. It trades simplicity of solutions for simplicity of presentation.

Basically you got confused here because you know too much linear algebra already :-)

Here's a full description of the solution for posterity:

  • The first summation, $\sum_{i=1}^N$, is over the distinct eigenvalues $\lambda_1$, $\ldots$, $\lambda_N$. (This may include complex eigenvalues.) Each $\lambda_i$ has a generalized eigenspace $V_i$, consisting of those vectors $v\in\mathbb R^n$ (or $\mathbb{C}^n$ if complex eigenvalues are necessary) where $(A-\lambda_i)^k v=0$ for some $k$.
  • The second summation, $\sum_{j=1}^{m(i)}$, is over any basis $\{v_{i,1}, \ldots, v_{i,m(i)}\}$ for $V_i$ (so $m(i) = \dim V_i$). Together the set of vectors $\{v_{i,j} : 1\le i\le N\textrm{ and }1\le j\le m(i)\}$ is a basis for all of $\mathbb{R}^n$ [$\mathbb{C}^n$].
  • The constants $b_{i,j}\in\mathbb{R}$ [$\mathbb{C}$] are chosen so that $x(0) = \sum_{i=1}^N \sum_{j=1}^{m(i)} b_{i,j}v_{i,j}$ (which can always be done uniquely; that's what it means for the $v_{i,j}$ to form a basis).
  • The third sum, $\sum_{k=0}^{m(i)-1} \frac{t^k}{k!} (A-\lambda_i)^k v_{i,j}$, is simply equal to $\exp((A-\lambda_i)t)v_{i,j}$. This works because $\exp((A - \lambda_i)t)v_{i,j}$ can be written as the infinite series $\sum_{k=0}^\infty \frac{t^k}{k!} (A-\lambda_i)^k v_{i,j}$, but all the terms where $k\ge m(i)$ vanish. (Side note: We know from the definition of generalized eigenspace that the terms of the series must vanish eventually, and in fact they must vanish by term $m(i)$. This can be seen by considering the sequence of kernels of $(A-\lambda_i)^k$ as $k$ ranges over the positive integers: each is a subspace of the next, and once two adjacent terms are equal, all subsequent terms must be as well. You might want to think through why those statements are true and why they imply the claim.)