Understanding Complex Differentiation


I'm trying to understand complex differentiation, and I am confused about how the partial derivatives relate. First, a function $f:V\rightarrow \mathbb{C}$ on an open set $V$ of the complex plane is complex differentiable if the limit

$$\underset{h\rightarrow 0}{\lim}\frac{f(z+h)-f(z)}{h}$$ exists for every point in $V$. $f$ can be written in the form $f(x+iy)=u(x,y)+iv(x,y)$, where $u$ and $v$ are the real and imaginary parts of $f$ respectively. If a function is complex differentiable, then it must satisfy the famous Cauchy-Riemann equations $u_x=v_y$ and $u_y=-v_x$, where $$u_x(x,y)=\underset{h\rightarrow 0}{\lim}\frac{u(x+h,y)-u(x,y)}{h}.$$

My textbook (in the appendix) goes on to say that, using the CR equations, we see that $$f'=u_x+iv_x.$$ I do not understand how to justify this equation. $f'$ is the derivative of $f$ with respect to $z$, but the right side of the equation is in terms of the partial derivatives. How does one derive this formula? I know that a complex function is complex differentiable if and only if it is real differentiable and satisfies the CR equations, and that if a complex function has continuous partial derivatives and satisfies the CR equations at a point, then it is complex differentiable at that point.
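For concreteness, here is a small numerical sanity check of the setup. The example function $f(z)=z^2$, the sample point, and the finite-difference step are my own choices for illustration, not from the textbook; the script verifies the CR equations and the formula $f'=u_x+iv_x$ (here $f'(z)=2z$) at that point.

```python
# Numerical check of the CR equations and f' = u_x + i*v_x for f(z) = z^2,
# so u(x, y) = x^2 - y^2 and v(x, y) = 2xy. Point and step size are arbitrary.
f = lambda z: z * z

def partial(g, x, y, wrt, h=1e-6):
    """Central-difference partial derivative of g(x, y) w.r.t. 'x' or 'y'."""
    if wrt == "x":
        return (g(x + h, y) - g(x - h, y)) / (2 * h)
    return (g(x, y + h) - g(x, y - h)) / (2 * h)

u = lambda x, y: f(complex(x, y)).real   # real part of f
v = lambda x, y: f(complex(x, y)).imag   # imaginary part of f

x, y = 1.3, -0.7
ux, uy = partial(u, x, y, "x"), partial(u, x, y, "y")
vx, vy = partial(v, x, y, "x"), partial(v, x, y, "y")

# Cauchy-Riemann: u_x = v_y and u_y = -v_x
assert abs(ux - vy) < 1e-6 and abs(uy + vx) < 1e-6

# u_x + i*v_x should agree with f'(z) = 2z
z = complex(x, y)
assert abs(complex(ux, vx) - 2 * z) < 1e-6
```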

There are 3 answers below.

Answer 1

Writing your limits with an $h$ makes both equations look like the same limiting procedure. But there is a subtle difference: in your first equation we have $h \in \mathbb{C}$, whereas the second equation is real and hence $h \in \mathbb{R}$.

Consequently the first equation is a lot stronger than the second. Instead of approaching $0$ only from the left or right, $h$ can do all kinds of things in the complex plane. The existence of the limit in your first equation then means that it does not matter which approach to zero we choose.

In particular, we can be boring and go to zero along the real axis. This boils down to restricting $h$ to be real in the first equation. In formulas, we obtain for a complex differentiable function \begin{align} f'(z) &= \lim_{\mathbb{C} \ni h \to 0} \frac{f(z+h)-f(z)}{h} \\ &= \lim_{\mathbb{R} \ni h \to 0} \frac{f(z+h)-f(z)}{h} \\ &= \lim_{\mathbb{R} \ni h \to 0} \frac{\big(u(z+h)+iv(z+h)\big)-\big(u(z)+iv(z)\big)}{h} \\ &= \lim_{\mathbb{R} \ni h \to 0} \frac{u(z+h)-u(z)}{h} + i\lim_{\mathbb{R} \ni h \to 0} \frac{v(z+h)-v(z)}{h} \\ &= u_x(z) + iv_x(z). \end{align}

Here we have used that taking limits is linear, that a complex number decomposes into its real and imaginary parts, and that the limits on the right all exist: the combined limit exists because $f$ is complex differentiable, and a complex limit exists if and only if the limits of its real and imaginary parts do.
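The direction-independence used above can also be seen numerically. The function $f(z)=e^z$, the base point, and the chosen approach directions are assumptions for this sketch; the difference quotient gives (approximately) the same value no matter which direction $h$ approaches $0$ from.

```python
import cmath

# Difference quotients of f(z) = exp(z) along several directions of h.
# For a complex differentiable f they must all agree with f'(z) = exp(z).
f = cmath.exp
z = 0.5 + 0.25j
h0 = 1e-6  # magnitude of the step

quotients = []
for angle_deg in (0, 45, 90, 180):
    h = h0 * cmath.exp(1j * cmath.pi * angle_deg / 180)  # same |h|, new direction
    quotients.append((f(z + h) - f(z)) / h)

# All directional quotients approximate the same derivative.
for q in quotients:
    assert abs(q - cmath.exp(z)) < 1e-5
```

For a merely real-differentiable function that violates CR (e.g. $f(z)=\bar z$), these quotients would differ from direction to direction, which is exactly why the complex limit fails to exist there.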

Answer 2

I think you can find all ingredients in Ullrich's book on pages 4-6.

Ullrich explains that $f$ is complex differentiable at $z$ iff there exists $a \in \mathbb C$ such that $f(z+h) = f(z) +a \cdot h + o(h)$. Then $f'(z) = a$.
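This $o(h)$ characterization can be checked numerically. The function $f(z)=\sin z$, the base point, and the shrinking sequence of steps below are my own choices for illustration: the remainder $f(z_0+h)-f(z_0)-f'(z_0)h$ should vanish faster than $|h|$.

```python
import cmath

# Little-o check: for f(z) = sin(z) with a = f'(z0) = cos(z0),
# the remainder f(z0+h) - f(z0) - a*h should be o(h), i.e.
# |remainder| / |h| -> 0 as h -> 0.
f, z0 = cmath.sin, 0.3 + 0.4j
a = cmath.cos(z0)

ratios = []
for k in range(1, 6):
    h = (1e-1) ** k * cmath.exp(1j * 0.7)  # shrink |h| along a fixed direction
    r = abs(f(z0 + h) - f(z0) - a * h) / abs(h)
    ratios.append(r)

# The ratio |remainder| / |h| decreases toward 0 as h shrinks.
assert all(ratios[i] > ratios[i + 1] for i in range(len(ratios) - 1))
assert ratios[-1] < 1e-4
```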

The map $L : \mathbb C \to \mathbb C, L(h) = a \cdot h = f'(z)\cdot h,$ is $\mathbb C$-linear and thus trivially also $\mathbb R$-linear.

Regarding $L$ as an $\mathbb R$-linear map, we can express it as a real matrix with respect to the basis $\{1, i\}$ of $\mathbb C$. If $a = \alpha + i\beta$, then this matrix is $$M(L) = \left( \begin{array}{rrr} \alpha & -\beta \\ \beta & \alpha \\ \end{array}\right) . $$ This is a well-known fact from linear algebra and can easily be verified by writing $h = \mu + i\nu$ and comparing $M(L) \cdot \left( \begin{array}{rrr} \mu\\ \nu \end{array}\right)$ and $a \cdot h = (\alpha + i\beta)\cdot (\mu + i\nu)$.
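That comparison is a two-line computation; here it is carried out for arbitrary sample values of $\alpha, \beta, \mu, \nu$ (the numbers themselves are placeholders for the illustration):

```python
# Multiplication by a = alpha + i*beta, viewed as an R-linear map on R^2
# with basis {1, i}, is given by the matrix [[alpha, -beta], [beta, alpha]].
alpha, beta = 2.0, -3.0
a = complex(alpha, beta)

mu, nu = 0.5, 1.25
h = complex(mu, nu)

# Matrix-vector product M(L) @ (mu, nu)^T, written out by components:
top = alpha * mu - beta * nu      # first row
bottom = beta * mu + alpha * nu   # second row

# It matches the real and imaginary parts of the complex product a*h.
product = a * h
assert abs(product.real - top) < 1e-12
assert abs(product.imag - bottom) < 1e-12
```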

Ullrich also shows that $f = u + iv$ is real differentiable, which means that there exists an $\mathbb R$-linear map $L'$ such that $f(z+h) = f(z) + L'(h) + o(h)$. Expressing $L'$ as a real matrix yields $$M(L') = \left( \begin{array}{rrr} u_x(z) & u_y(z) \\ v_x(z) & v_y(z) \\ \end{array}\right) = \left( \begin{array}{rrr} u_x(z) & -v_x(z) \\ v_x(z) & u_x(z) \\ \end{array}\right)$$ due to the CR equations.

We clearly have $L' = L$. Comparing the matrices we see that $\alpha = u_x(z), \beta = v_x(z)$ which implies $$f'(z) = a = \alpha + i\beta = u_x(z) + iv_x(z).$$

PS. I think the direct proof given by Michael Heins is easier. Ullrich seems to focus on the relationship between complex and real differentiability.

Answer 3

You know that, as a real-linear map, the derivative of $f$ is the Jacobian matrix $$f'(z)=\left[\begin{array}{cc} u_x & -v_x \\ v_x & u_x \end{array}\right],$$ so applying it to the vector $(a,b)$ gives

$$\left[\begin{array}{cc} u_x & -v_x \\ v_x & u_x \end{array}\right] \left[\begin{array}{c} a\\ b \end{array}\right]=\left[\begin{array}{c} au_x -bv_x\\ av_x+bu_x \end{array}\right].$$

The book identifies $a+ib$ with $\left[\begin{array}{c} a\\ b \end{array}\right]$, so in the matrix product above the top entry is the real part and the bottom entry is the imaginary part. If you evaluate $(a+ib)(u_x+iv_x)$ and write the result as a $2\times 1$ vector, you will see it is the same.