Using a power series method to prove the solution of a first-order matrix ODE


$$y' = A y$$

show that the general solution of the ODE is of the form

$$ y(x) = e^{Ax}C. $$

I know how to use this formula, and have applied it many times via the definition of the matrix exponential and Jordan normal forms to obtain explicit solutions. However, I have no idea where to start on a derivation using power series.

$\mathtt{Ideas:}$

$$ \mathtt{ansatz:}\quad y(x) = \sum_{n=0}^{\infty} (A_n x)^n,$$ where each coefficient $A_n$ is a matrix??
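(For what it is worth, the more standard power-series ansatz keeps the coefficients outside the power: one writes $y(x)=\sum_{n=0}^\infty c_n x^n$ with unknown *vector* coefficients $c_n$, rather than raising a matrix to the $n$-th power. Substituting this into $y'=Ay$ and matching powers of $x$ gives a recurrence that already produces the exponential series:)

$$ y' = \sum_{n=0}^\infty (n+1)\,c_{n+1} x^n \stackrel{!}{=} A\sum_{n=0}^\infty c_n x^n \;\Longrightarrow\; c_{n+1} = \frac{A\,c_n}{n+1} \;\Longrightarrow\; c_n = \frac{A^n c_0}{n!}, $$

so that $y(x) = \sum_{n=0}^\infty \frac{A^n x^n}{n!}\,c_0 = e^{Ax}c_0$, with $c_0 = y(0)$ playing the role of the constant $C$.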

Accepted answer

Initial condition: $y(0)=y_0$. I shall use $t$ instead of $x$ as the independent variable. Assume first that there exists an infinitely differentiable solution $y$.

Then, by repeatedly differentiating the equation $y'=Ay$ with respect to $t$ (here I take $A$ to be a constant matrix, as you seem to intend): $$ y'(0)=Ay_0,\quad y''(0)=Ay'(0)=A^2 y_0,\quad \ldots,\quad y^{(n)}(0)=A^n y_0. $$
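As a numerical sanity check (a sketch using SciPy, with an arbitrary $2\times 2$ matrix $A$ and vector $y_0$ of my choosing, not from the question): the first two derivatives of $t \mapsto e^{tA}y_0$ at $t=0$ do come out as $Ay_0$ and $A^2 y_0$.

```python
import numpy as np
from scipy.linalg import expm

# Arbitrary example data (hypothetical, chosen only for illustration).
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
y0 = np.array([1.0, 0.5])

def y(t):
    """The claimed solution y(t) = exp(tA) y0."""
    return expm(t * A) @ y0

h = 1e-5
# Central finite differences for y'(0) and y''(0).
d1 = (y(h) - y(-h)) / (2 * h)
d2 = (y(h) - 2 * y(0.0) + y(-h)) / h**2

print(np.allclose(d1, A @ y0, atol=1e-6))
print(np.allclose(d2, A @ A @ y0, atol=1e-4))
```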

The Taylor series expansion of $y(t)$ at $t=0$ is therefore $$ y(t)=\sum_{n=0}^\infty \frac{(A^n y_0)\, t^n}{n!}=\exp(tA)\, y_0. $$ I have ignored the issue of convergence here, but it is easy to see why the series converges: the $n$-th term is bounded in norm by $\|A\|^n |t|^n \|y_0\|/n!$, and $\sum_n (\|A\||t|)^n/n!$ converges for every $t$. Once we know it converges, a direct computation shows that this $y$ does satisfy the differential equation. The solution is indeed unique by the standard existence and uniqueness theorem for ODEs (Picard–Lindelöf). Therefore, every solution of this DE is infinitely differentiable.
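To illustrate the convergence claim numerically (again a sketch with arbitrary, hypothetical data, using SciPy's `expm` as the reference value): the partial sums $\sum_{n=0}^N A^n y_0\, t^n/n!$ rapidly approach $e^{tA}y_0$.

```python
import numpy as np
from scipy.linalg import expm

# Arbitrary example data (hypothetical, chosen only for illustration).
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
y0 = np.array([1.0, 0.5])
t = 0.7

def partial_sum(N):
    """Partial sum S_N = sum_{n=0}^N (A^n y0) t^n / n!."""
    term = y0.copy()              # n = 0 term: A^0 y0 t^0 / 0!
    s = term.copy()
    for n in range(1, N + 1):
        term = (A @ term) * t / n  # next term = previous term * (A t / n)
        s += term
    return s

exact = expm(t * A) @ y0
for N in (2, 5, 10, 20):
    print(N, np.linalg.norm(partial_sum(N) - exact))
```

The factorial in the denominator dominates for large $N$, so the error shrinks super-geometrically, just as the norm bound above predicts.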