prove that every complex ODE of the form $f'(z)=P(z,f(z))$ has a unique solution for each initial value


I was given this exercise in the topic of complex Taylor series:

Let $f_1, f_2$ be analytic functions on an open connected domain $D \subseteq \mathbb{C}$. Suppose that they are both solutions of the ODE $f'(z)=P(z,f(z))$, where $P$ is a two-variable polynomial.

If there is $z_0 \in D$ such that $f_1(z_0)=f_2(z_0)$, prove that $f_1(z)=f_2(z)$ for all $z\in D$.

My initial approach was to prove that $f_1^{(n)}(z_0)=f_2^{(n)}(z_0)$ for all $n$, so that the two functions have a common Taylor series. I tried using induction, but there doesn't seem to be a way to go beyond the first derivative of the ODE. Am I on the right track?

There are 4 answers below.

BEST ANSWER

If you are not supposed to use analytic ODE theory, define a recursive sequence of polynomials $Q_k$ via $$ Q_0(z,w)=w\\ Q_{k+1}(z,w)=\frac{\partial Q_k}{\partial z}(z,w)+\frac{\partial Q_k}{\partial w}(z,w)\cdot P(z,w) $$ and verify that $$ f^{(k)}(z)=Q_k(z,f(z)), $$ so that in particular the coefficients $a_k=\frac{f^{(k)}(z_0)}{k!}$ of the power series expansion at $z_0$ satisfy $$ a_k=\frac{Q_k(z_0,f(z_0))}{k!} $$ and are completely determined as polynomial expressions in the initial value. As the series $f(z)=\sum_{k=0}^\infty a_k(z-z_0)^k$ converges by assumption, the solution is unique.
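As a quick sanity check on this recursion, here is a sketch (my own, assuming sympy, with the sample choices $P(z,w)=z+w$ and the explicit solution $f(z)=e^z-z-1$ of $f'=z+f$, $f(0)=0$):

```python
# Verify f^{(k)}(z) = Q_k(z, f(z)) for the recursion
#   Q_0 = w,  Q_{k+1} = dQ_k/dz + dQ_k/dw * P
# using the sample polynomial P(z, w) = z + w (my own choice), whose
# solution with f(0) = 0 is f(z) = exp(z) - z - 1.
import sympy as sp

z, w = sp.symbols('z w')
P = z + w                       # sample right-hand side of f' = P(z, f)
f = sp.exp(z) - z - 1           # explicit solution of f' = z + f

Q = [w]                         # Q_0(z, w) = w
for _ in range(4):
    Qk = Q[-1]
    Q.append(sp.expand(sp.diff(Qk, z) + sp.diff(Qk, w) * P))

# each derivative of f is reproduced by substituting f into Q_k
for k, Qk in enumerate(Q):
    assert sp.simplify(sp.diff(f, z, k) - Qk.subs(w, f)) == 0
```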

ANSWER 2

You could just apply the local version of the Picard-Lindelöf theorem (also called the Cauchy-Lipschitz theorem), using that a local bound on the first derivative yields a local Lipschitz constant.

Use it on the lines $t\mapsto z_0+tv$ with $|v|=1$, where the differential equation reduces to a 2-dimensional (counting the real components) ODE: setting $g(t)=f(z_0+tv)$, the chain rule gives $g'(t)=Q(t,g(t))=v\,P(z_0+tv,g(t))$.

From local uniqueness, global uniqueness follows, first on the lines and then on the whole domain $D$ (for instance, by contradiction, lay a segment inside $D$ through a point where the solutions differ).
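As a numerical illustration of this line reduction (my own sketch, assuming scipy, with the sample equation $f'=f$, i.e. $P(z,w)=w$, and $f(0)=1$, so $f(z)=e^z$; the chain rule yields the factor $v$ in $g'(t)=v\,f'(z_0+tv)$):

```python
# Integrate g'(t) = v * P(z0 + t*v, g) as a real 2-dimensional system
# along the line z = z0 + t*v, for the sample case P(z, w) = w,
# and compare with the exact solution f(z) = exp(z).
import numpy as np
from scipy.integrate import solve_ivp

z0 = 0.0 + 0.0j
v = np.exp(1j * np.pi / 4)        # unit direction of the line

def rhs(t, y):
    g = y[0] + 1j * y[1]          # reassemble the complex value from real parts
    dg = v * g                    # g'(t) = v * P(z0 + t*v, g), here P(z, w) = w
    return [dg.real, dg.imag]

sol = solve_ivp(rhs, (0.0, 2.0), [1.0, 0.0], rtol=1e-10, atol=1e-12)
g_end = sol.y[0, -1] + 1j * sol.y[1, -1]
assert abs(g_end - np.exp(z0 + 2.0 * v)) < 1e-6   # matches exp(z) at the endpoint
```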

ANSWER 3

If $$ P(z, w) = \sum_{j=0}^m \sum_{k=0}^n a_{jk} z^j w^k $$ then $$ P(z, w_1) - P(z, w_2) = \sum_{j=0}^m \sum_{k=1}^n a_{jk} z^j (w_1^k - w_2^k) \\ = (w_1 - w_2) \sum_{j=0}^m \sum_{k=1}^n a_{jk} z^j \sum_{l=0}^{k-1} w_1^l w_2^{k-1-l} = (w_1 - w_2) Q(z, w_1, w_2) $$ for some polynomial $Q$ in three variables.
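This factorization can be checked mechanically; a sketch (my own, assuming sympy, with an arbitrary sample polynomial $P$):

```python
# Check that (w1 - w2) divides P(z, w1) - P(z, w2) exactly,
# for a sample polynomial P (my own choice).
import sympy as sp

z, w1, w2 = sp.symbols('z w1 w2')
P = lambda zz, ww: 3*zz**2*ww**3 - zz*ww + 5*ww**2 + 7*zz

diff_P = sp.expand(P(z, w1) - P(z, w2))
Q, rem = sp.div(diff_P, w1 - w2, w1)   # polynomial division in the variable w1
assert rem == 0                        # so diff_P = (w1 - w2) * Q(z, w1, w2)
```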

If both $f_1$ and $f_2$ are solutions of $f'(z)=P(z,f(z))$ in $D$ with $f_1(z_0) = f_2(z_0)$, then $$ f_1'(z) - f_2'(z) = (f_1(z) - f_2(z)) Q(z, f_1(z), f_2(z)). $$ If the difference $f_1 - f_2$ is not identically zero, then it has a zero of some finite multiplicity $k \ge 1$ at $z_0$, so the left-hand side has a zero of order $k-1$ there while the right-hand side has a zero of order at least $k$; the identity then implies $k - 1 \ge k$, which is a contradiction.

ANSWER 4

Your idea works. Prove by induction that $$ f^{(n)}(z)=P_0^n(z,f(z))+\sum_{k=1}^{n-1}P_k^n(z,f(z))\,f^{(k)}(z), $$ where each $P_k^n(z,w)$ is a polynomial (whose explicit formula we do not need to know). In the following, $P=P(z,w)$.

  1. Case $n=1$: this is the equation $f'(z)=P(z,f(z))$ itself.
  2. Case $n=2$: $$ f''(z)=\frac{\partial P}{\partial z}(z,f(z))+\frac{\partial P}{\partial w}(z,f(z))\,f'(z). $$
  3. I leave the general case to you.
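The $n=2$ case can be verified symbolically; a sketch (my own, assuming sympy, with the sample polynomial $P(z,w)=zw^2+3z^2$):

```python
# Verify the n = 2 formula  f'' = dP/dz(z, f) + dP/dw(z, f) * f'
# by differentiating f'(z) = P(z, f(z)) directly with the chain rule.
import sympy as sp

z, w = sp.symbols('z w')
P = z*w**2 + 3*z**2            # sample two-variable polynomial (my own choice)
f = sp.Function('f')(z)

fp = P.subs(w, f)              # the ODE says f'(z) = P(z, f(z))
# formula from the answer: f'' = P_z(z, f) + P_w(z, f) * f'
formula = sp.diff(P, z).subs(w, f) + sp.diff(P, w).subs(w, f) * fp
# direct chain-rule differentiation of f' = P(z, f(z)), then substitute f' = P
direct = sp.diff(fp, z).subs(sp.Derivative(f, z), fp)
assert sp.simplify(formula - direct) == 0
```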