I'm using TensorFlow for some computations with complex variables (and derivatives of these computations). When I compute the derivative of (simple) holomorphic functions, the results obtained with TensorFlow are the conjugate of what I would expect. A simple example:
Given $z = x + yi$, consider $f(z) = z\cdot z = z^2 = x^2 + 2ixy - y^2$.
We have $\frac{df}{dz} = \frac12 \left(\frac{\partial f}{\partial x} - i\frac{\partial f}{\partial y}\right) = 2x + i2y$.
Hence, for $z = \frac{1}{5}i$, $\frac{df}{dz} = \frac{2}{5}i$. However, using TensorFlow, I obtain $\frac{df}{dz} = -\frac{2}{5}i$. While searching for an explanation, I found the following statement in a GitHub issue: "The gradient of a holomorphic function is the conjugate of its complex derivative." I don't understand why, though. In other words: Why is the gradient of a holomorphic function equal to the conjugate of its complex derivative, and where is the mistake in this simple example?
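For reference, the two values can be checked numerically without TensorFlow. The sketch below (the helper name `wirtinger_derivative` and the step size `h` are my own choices) approximates $\frac{df}{dz} = \frac12\left(\frac{\partial f}{\partial x} - i\frac{\partial f}{\partial y}\right)$ with central differences:

```python
def wirtinger_derivative(f, z, h=1e-6):
    """Approximate df/dz = (df/dx - i*df/dy) / 2 via central differences."""
    df_dx = (f(z + h) - f(z - h)) / (2 * h)          # partial in the real direction
    df_dy = (f(z + 1j * h) - f(z - 1j * h)) / (2 * h)  # partial in the imaginary direction
    return (df_dx - 1j * df_dy) / 2

f = lambda z: z * z   # the holomorphic function from the question
z0 = 0.2j             # z = (1/5) i

dfdz = wirtinger_derivative(f, z0)
print(dfdz)               # approximately 0.4j, i.e. 2*z0
print(dfdz.conjugate())   # approximately -0.4j, the value the question reports from TensorFlow
```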
A holomorphic function $z\mapsto f(z)$ has no gradient, but at each point $z$ of its domain $\Omega$ it has a derivative $f'(z)\in{\mathbb C}$. In this way a new function $f': \> z\mapsto f'(z)$ is associated with $f$, and it turns out that $f'$ is again holomorphic in $\Omega$.
Now you can write $f(x+iy)=u(x,y)+i v(x,y)$ with real-valued functions $u$ and $v$. In this way $f$ can be viewed as a vector-valued function $${\bf f}:\quad\Omega\to{\mathbb R}^2, \qquad (x,y)\mapsto\bigl(u(x,y),v(x,y)\bigr)\ .$$ This ${\bf f}$ does not have a gradient either, but at each point $(x,y)\in\Omega$ it has a Jacobian (or derivative) $d{\bf f}(x,y)$ whose matrix has the special form $$\left[\matrix{A&-B\cr B&A\cr}\right]\ ,$$ where the numbers $A$ and $B$ are related to the derivative of the holomorphic function $f$ we started with by $f'(x+iy)=A+iB$. This is the content of the Cauchy–Riemann (CR) equations. This also accounts for the observed conjugate: reverse-mode automatic differentiation multiplies by the transpose of this Jacobian, $\left[\matrix{A&B\cr -B&A\cr}\right]$, and as a complex operator that transpose is multiplication by $A-iB=\overline{f'(x+iy)}$.
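The special form of that Jacobian can be verified numerically. A sketch for $f(z)=z^2$, so $u=x^2-y^2$ and $v=2xy$ (the helper `jacobian` and the step size `h` are my own choices, and the sample point $x=1$, $y=2$ is arbitrary):

```python
def jacobian(u, v, x, y, h=1e-6):
    """Central-difference Jacobian of (x, y) -> (u(x, y), v(x, y))."""
    return [
        [(u(x + h, y) - u(x - h, y)) / (2 * h), (u(x, y + h) - u(x, y - h)) / (2 * h)],
        [(v(x + h, y) - v(x - h, y)) / (2 * h), (v(x, y + h) - v(x, y - h)) / (2 * h)],
    ]

# f(z) = z^2 written as real and imaginary parts
u = lambda x, y: x * x - y * y
v = lambda x, y: 2 * x * y

J = jacobian(u, v, 1.0, 2.0)
# J is approximately [[2, -4], [4, 2]]: the form [[A, -B], [B, A]]
# with A + iB = 2 + 4i = f'(1 + 2i) = 2(1 + 2i), as the CR equations predict
```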