Necessary and Sufficient Condition for Analyticity of a Polynomial


Definition: A polynomial $P(x,y)$ will be called an analytic polynomial if there exist (complex) constants $a_k$ such that $$P(x,y) = a_0 + a_1 (x+iy) + a_2 (x+iy)^2 + ... + a_n(x+iy)^n$$

Definition: Let $f(x,y) = u(x,y) + iv(x,y)$, where $u$ and $v$ are real-valued functions. The partial derivatives $f_x$ and $f_y$ are defined by $u_x + i v_x$ and $u_y + iv_y$, respectively, provided the latter exist.


Claim: A polynomial $P(x,y)$ is analytic if and only if $P_y = i P_x$.

I am having trouble understanding some proofs I read of the forward direction, that $P(x,y)$ being analytic implies $P_y = i P_x$. Most solutions I have encountered begin by writing $P(x,y) = a_0 + a_1 (x+iy) + a_2 (x+iy)^2 + ... + a_n(x+iy)^n$, which is innocuous enough, since this is the form an analytic polynomial must have. After this I get rather unsettled when I see these solutions write $P_x(x,y) = a_1 + 2a_2 (x+iy) + ... + na_n(x+iy)^{n-1}$ and $P_y(x,y) = a_1 i + 2a_2 i (x+iy) + ... + n a_n i (x+iy)^{n-1}$, and then declare that $P_y = iP_x$.

I don't understand why we don't first have to split the polynomial into its real and imaginary parts, perhaps by using the binomial theorem (yes, that would be a mess), and then take the partial derivatives of $P(x,y)$ using the above definition. Why can the chain-rule be used so recklessly and indiscriminately with complex numbers?
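To make the concern concrete, here is a small numerical sketch (my own addition, not from any text): for $P(x,y) = (x+iy)^2$, split $P$ into $u$ and $v$ by hand via the binomial expansion, apply the definition $P_x = u_x + iv_x$, $P_y = u_y + iv_y$ with finite differences standing in for the limits, and compare against what the formal chain-rule computation predicts. The sample point is arbitrary.

```python
# Numerical sanity check (not a proof): split P(x, y) = (x + iy)^2 into
# real and imaginary parts by hand, take the partials from the definition
# f_x = u_x + i v_x, f_y = u_y + i v_y via finite differences, and confirm
# that P_y = i P_x and that both match the "formal" derivative 2(x + iy).
def u(x, y):  # Re[(x + iy)^2] = x^2 - y^2  (binomial expansion)
    return x * x - y * y

def v(x, y):  # Im[(x + iy)^2] = 2xy
    return 2 * x * y

def partial(g, x, y, wrt, h=1e-6):
    """Central-difference partial derivative of a real-valued g(x, y)."""
    if wrt == "x":
        return (g(x + h, y) - g(x - h, y)) / (2 * h)
    return (g(x, y + h) - g(x, y - h)) / (2 * h)

x0, y0 = 0.7, -1.3  # arbitrary sample point
P_x = partial(u, x0, y0, "x") + 1j * partial(v, x0, y0, "x")
P_y = partial(u, x0, y0, "y") + 1j * partial(v, x0, y0, "y")

formal = 2 * (x0 + 1j * y0)          # what the formal chain rule predicts
assert abs(P_x - formal) < 1e-6
assert abs(P_y - 1j * P_x) < 1e-6    # the relation P_y = i P_x
```

Of course this only confirms the two routes agree at one point for one polynomial; the question of *why* they must agree is exactly what the lemmas below address.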

The only way I could think of proving it is by first proving the following two claims (though they themselves may not be enough; I haven't fully worked out the details yet):

Lemma 1: Let $f(x,y) = a(x+iy)^n$. Then $f_x(x,y) = na(x+iy)^{n-1}$ and $f_y(x,y) = nai(x+iy)^{n-1}$


Lemma 2: Let $f(x,y) = u(x,y) + i v(x,y)$, $g(x,y) = u'(x,y) + i v'(x,y)$, and define $h(x,y) := f(x,y) + g(x,y)$. Then $h_x(x,y) = f_x(x,y) + g_x(x,y)$, and $h_y(x,y) = f_y(x,y) + g_y(x,y)$.

Proving the first lemma is to some degree messy, since it requires induction, the product rule, possibly the binomial theorem, etc. What am I missing? The original problem is so seemingly simple, but it appears that the author has left out a considerable number of details and background needed. Again, why can the chain-rule be used so recklessly and indiscriminately with complex numbers?
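Before committing to the inductive proof, Lemma 1 can at least be checked numerically. A small sketch of my own (finite differences in the real variables $x$ and $y$ stand in for the limits; the coefficient $a$, the sample point, and the tolerance are arbitrary choices):

```python
# A numerical check of Lemma 1 (not a substitute for the inductive proof):
# for f(x, y) = a (x + iy)^n, the partials computed as limits in the real
# variables x and y agree with n a (x + iy)^(n-1) and i n a (x + iy)^(n-1).
def check_lemma1(a, n, x, y, h=1e-6, tol=1e-4):
    f = lambda x, y: a * (x + 1j * y) ** n
    f_x = (f(x + h, y) - f(x - h, y)) / (2 * h)   # = u_x + i v_x
    f_y = (f(x, y + h) - f(x, y - h)) / (2 * h)   # = u_y + i v_y
    expected = n * a * (x + 1j * y) ** (n - 1)
    assert abs(f_x - expected) < tol              # Lemma 1, first claim
    assert abs(f_y - 1j * expected) < tol         # Lemma 1, second claim

for n in range(1, 6):
    check_lemma1(a=2.0 - 1.5j, n=n, x=0.4, y=0.9)
```

Differencing the complex-valued $f$ directly is legitimate here because the increment $h$ is real, so the difference quotient splits into the $u$ and $v$ quotients coordinatewise.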

Am I misunderstanding something, or is Bak and Newman not a good text? It doesn't seem very rigorous, and it introduces analyticity in a rather odd way. I don't remember analyticity being introduced in this manner in Conway's book.


Best answer:

Answer: A polynomial $P(z)$ is analytic if and only if $\partial_{\bar{z}}P(z) =0$

For instance: $p(z)= \bar{z}$ is not analytic (see below)

I think your problem can be viewed more generally.

In fact, let $f :\Omega \to \mathbb C$ be a function on an open set $\Omega\subset \mathbb C$.

Definition $f$ is holomorphic at $a\in\Omega$ if the limit $$f'(a):= \lim_{z\to 0}\frac{f(a+z)-f(a)}{z}$$ exists.

Theorem Assume that $f$ is differentiable in the real-analysis sense (which is the case for any polynomial). Then $f$ is holomorphic if and only if $ \partial_{\bar{z}}f \equiv 0$.

Corollary Write $f(x,y) = u(x,y)+iv(x,y)$ with $z=x+iy$, where $u$ and $v$ are real-valued. Then $f$ is holomorphic if and only if $f_x = -if_y$, in which case $f'(z) = f_x = -if_y$.

The case of a polynomial is therefore a particular case.

Proof of the corollary First, let us emphasize that $$ \frac{\partial }{\partial z} = \frac{1}{2}\left(\frac{\partial }{\partial x}-i\frac{\partial }{\partial y}\right)\qquad\text{and}\qquad \frac{\partial }{\partial \bar{z}}=\frac{1}{2}\left(\frac{\partial }{\partial x}+i\frac{\partial }{\partial y}\right),$$ which is easy to check.
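As a quick sanity check of these two operators (an illustration with finite differences at an arbitrary sample point, not part of the proof): applied to the holomorphic $z^3$ they give $\partial_z = 3z^2$ and $\partial_{\bar z} = 0$, while applied to $\bar z$ they give $\partial_z = 0$ and $\partial_{\bar z} = 1$.

```python
# Finite-difference realization of the Wirtinger operators
# d/dz = (d/dx - i d/dy)/2 and d/dzbar = (d/dx + i d/dy)/2,
# applied to f as a function of the real variables x and y.
def wirtinger(f, x, y, h=1e-6):
    f_x = (f(x + h, y) - f(x - h, y)) / (2 * h)
    f_y = (f(x, y + h) - f(x, y - h)) / (2 * h)
    d_z = (f_x - 1j * f_y) / 2
    d_zbar = (f_x + 1j * f_y) / 2
    return d_z, d_zbar

x0, y0 = 1.1, -0.6  # arbitrary sample point
z0 = x0 + 1j * y0

d_z, d_zbar = wirtinger(lambda x, y: (x + 1j * y) ** 3, x0, y0)
assert abs(d_z - 3 * z0 ** 2) < 1e-4 and abs(d_zbar) < 1e-4  # holomorphic

d_z, d_zbar = wirtinger(lambda x, y: x - 1j * y, x0, y0)     # f = conj(z)
assert abs(d_z) < 1e-6 and abs(d_zbar - 1) < 1e-6            # not holomorphic
```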

From the Theorem, $f$ is holomorphic if and only if $$\partial_{\bar{z}}f \equiv 0 \;\Longleftrightarrow\; \frac{\partial f }{\partial \bar{z}}=\frac{1}{2}\left(\frac{\partial f }{\partial x}+i\frac{\partial f}{\partial y}\right) =0 \;\Longleftrightarrow\; f_x= -if_y.$$ On the other hand, using $f_x = -if_y$, $$f'(z)= \partial_{z}f = \frac{1}{2}\left(\frac{\partial f }{\partial x}-i\frac{\partial f}{\partial y}\right) = \frac{1}{2}\left(f_x + f_x\right) = f_x.$$ Hence, $ f'(z) =f_x= -if_y.$

Observe that the statement is an equivalence. Therefore, a polynomial $P$ is analytic if and only if $P_y = iP_x.$

Now let us prove the Theorem. The differentiability of $f$ is equivalent to $$ f(a+z)-f(a)= z \frac{\partial f }{\partial z}(a) +\bar{z}\frac{\partial f }{\partial \bar{z}}(a)+o(|z|).$$ Thus, $$f'(a):= \lim_{z\to 0}\frac{f(a+z)-f(a)}{z} = \lim_{z\to 0}\left[\frac{\partial f }{\partial z}(a) + \frac{\bar{z}}{z}\frac{\partial f }{\partial \bar{z}}(a)+o(1) \right]$$ exists if and only if $$\lim_{z\to 0} \frac{\bar{z}}{z}\frac{\partial f }{\partial \bar{z}}(a) \qquad \text{exists,}$$

which is possible only if $\frac{\partial f }{\partial \bar{z}}(a) = 0$, since the quantity $\frac{\bar{z}}{z}$ always has modulus $1$ but depends on the direction in which $z$ approaches $0$.
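The direction-dependence of $\bar z/z$ is easy to see numerically. A small illustration (the ray parameter $t$ is my own notation): along the ray $z = re^{it}$ the quotient equals $e^{-2it}$, which has modulus $1$ for every direction but takes different values for different $t$, so it has no limit as $z\to 0$.

```python
# Along the ray z = r e^{it}, conj(z)/z = e^{-2it}: modulus 1 always,
# but the value depends on the direction t, so the limit as z -> 0
# cannot exist unless its coefficient d f / d zbar vanishes.
import cmath

def quotient(t, r=1e-9):
    z = r * cmath.exp(1j * t)
    return z.conjugate() / z

along_real = quotient(0.0)            # t = 0    -> 1
along_imag = quotient(cmath.pi / 2)   # t = pi/2 -> -1
assert abs(abs(along_real) - 1) < 1e-12
assert abs(abs(along_imag) - 1) < 1e-12
assert abs(along_real - along_imag) > 1   # direction-dependent: no limit
```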

Another answer:

I think that the forward direction is the easy part of this theorem. You have to put yourself into the realm of complex-valued functions $f$ of the real variables $x$ and $y$. In this scenario all the rules of calculus remain valid, and in addition we have four rules of the form $${\partial\over\partial x}\bigl({\rm Re}(f)\bigr)={\rm Re}\left({\partial f\over\partial x} \right),\quad \ldots$$ which make use of the fact that in the relevant difference quotients the denominator is real and that limits are taken coordinatewise.

In this world we have the special function $$z:\quad (x,y)\mapsto z(x,y):=x+iy\ .$$ A complex polynomial $P$ in the variables $x$ and $y$ is called analytic if it can be written as a composition $P=q\circ z$ for some complex polynomial $q$ in one variable: $$P(x,y)=\sum_{k=0}^n c_k(x+iy)^k=q\bigl(z(x,y)\bigr)\ .$$

From the chain rule it is then obvious that $$P_x=\sum_{k=0}^n c_k\, k(x+iy)^{k-1}{\partial(x+iy)\over\partial x}=q'(z)\cdot1\ ,$$ where $q'$ is a convenient notation for the function of $z$ appearing there. Similarly $P_y=q'(z)\cdot i$, so that indeed $P_y(x,y)\equiv i P_x(x,y)$.
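The chain-rule computation above can be spot-checked numerically. A sketch with a sample polynomial $q$ of my own choosing (finite differences stand in for the limits; the point and coefficients are arbitrary):

```python
# Check the chain-rule argument P = q o z: for P(x, y) = q(x + iy),
# we expect P_x = q'(z) * 1 and P_y = q'(z) * i, hence P_y = i P_x.
coeffs = [1 + 2j, -0.5, 3j, 2.0]          # q(w) = sum_k c_k w^k (sample)

def q(w):
    return sum(c * w ** k for k, c in enumerate(coeffs))

def q_prime(w):                            # formal derivative of q
    return sum(k * c * w ** (k - 1) for k, c in enumerate(coeffs) if k > 0)

def P(x, y):                               # P = q o z, with z(x, y) = x + iy
    return q(x + 1j * y)

h, x0, y0 = 1e-6, 0.3, 0.8
P_x = (P(x0 + h, y0) - P(x0 - h, y0)) / (2 * h)
P_y = (P(x0, y0 + h) - P(x0, y0 - h)) / (2 * h)

z0 = x0 + 1j * y0
assert abs(P_x - q_prime(z0)) < 1e-4       # P_x = q'(z) * dz/dx = q'(z)
assert abs(P_y - 1j * q_prime(z0)) < 1e-4  # P_y = q'(z) * dz/dy = i q'(z)
```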