Showing that two given functions are harmonic
I'm preparing for my complex analysis midterm on Thursday and our professor gave us the following as a practice problem:

*(The problem statement was attached as an image and is not reproduced here.)*

I'm a bit confused about how to approach part (a). Here's my train of thought: showing that $1/f$ is analytic would show that $u/(u^2+v^2)$ and $v/(u^2+v^2)$ are harmonic, by a theorem in our textbook (the real and imaginary parts of an analytic function are harmonic). However, I'm unsure how I'm supposed to show that $f$ is analytic. Should I try to use the Cauchy–Riemann equations in some way? And what about points where $u^2 + v^2 = 0$? That leads to division by zero in the expression for $1/f$.

Answer:
(a) Since $v$ is the harmonic conjugate of $u$, the function $f = u + iv$ is analytic. Therefore $1/f$ is analytic at every point where $f \neq 0$ (equivalently, where $u^2 + v^2 \neq 0$), and its real and imaginary parts are harmonic there.
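As a sanity check (not a proof), you can verify this symbolically for a sample analytic function, say $f = e^z$, using SymPy. The choice $f = e^z$ is just an illustration; any analytic $f$ with no zeros in the region would do. Both components of $1/f$ should have vanishing Laplacian:

```python
# Symbolic sanity check: for the sample analytic function f = e^z,
# the real and imaginary parts of 1/f are harmonic.
import sympy as sp

x, y = sp.symbols('x y', real=True)
u = sp.exp(x) * sp.cos(y)   # Re(e^z)
v = sp.exp(x) * sp.sin(y)   # Im(e^z)

P = u / (u**2 + v**2)       # Re(1/f) = u/(u^2+v^2)
Q = -v / (u**2 + v**2)      # Im(1/f) = -v/(u^2+v^2)

def laplacian(w):
    """Return the simplified Laplacian w_xx + w_yy."""
    return sp.simplify(sp.diff(w, x, 2) + sp.diff(w, y, 2))

print(laplacian(P), laplacian(Q))  # both simplify to 0
```

Note that here $u^2 + v^2 = e^{2x}$ never vanishes, so the zero-denominator worry from the question does not arise for this particular $f$.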

For (d), it's easier to guess. You can recognize $x^2-y^2$ as the real part of $(x+iy)^2$, and $xy$ as $\frac12$ of its imaginary part. More generally, every homogeneous harmonic polynomial of degree $n$ in $x$ and $y$ is a linear combination of $\operatorname{Re} z^n$ and $\operatorname{Im} z^n$.
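A quick symbolic check of this guess (illustrative only, again using SymPy): both polynomials are harmonic, and they match $\operatorname{Re} z^2$ and $\frac12 \operatorname{Im} z^2$ exactly.

```python
# Check that x^2 - y^2 and x*y are harmonic, and that they equal
# Re(z^2) and (1/2) Im(z^2) respectively, where z = x + i*y.
import sympy as sp

x, y = sp.symbols('x y', real=True)
z = x + sp.I * y

u = x**2 - y**2
v = x * y

def laplacian(w):
    """Return the simplified Laplacian w_xx + w_yy."""
    return sp.simplify(sp.diff(w, x, 2) + sp.diff(w, y, 2))

assert laplacian(u) == 0 and laplacian(v) == 0
assert sp.simplify(sp.re(sp.expand(z**2)) - u) == 0      # x^2 - y^2 = Re(z^2)
assert sp.simplify(sp.im(sp.expand(z**2)) / 2 - v) == 0  # x*y = (1/2) Im(z^2)
print("all checks pass")
```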