Oscillation inequality in $B(0,1)\subset\mathbb{R}^2$


Let $u:\mathbb{R}^2\to\mathbb{R}$ be in $C^1(B_1)$, where $B_r$ is the ball centered at the origin of radius $r$. Let $\omega(r)$ be the oscillation of $u$ on the boundary of $B_r$ and assume that $\omega$ is increasing. Prove that $$ \omega(r)^2\leq \frac{\pi}{\log(1/r)}\int_{B_1}|Du|^2\,dx. $$

To be honest, I am not sure where to start with this. I've looked at the right side (without the constant): by Cauchy–Schwarz, for $\epsilon>0$, $$ \int_{B_{1-\epsilon}}|Du|^2\,dx\geq \frac{1}{|B_{1-\epsilon}|}\bigg(\int_{B_{1-\epsilon}}|Du|\,dx\bigg)^2 =\frac{1}{\pi(1-\epsilon)^2}\bigg(\int_0^{1-\epsilon}\int_{\partial B_r}|Du|\,dS\,dr \bigg)^2 $$

But I don't know where to go from here. For context, this is step 1 of a proof of Harnack's inequality on $\mathbb{R}^2$. Besides searching here, I could not find any internet resources with this specific inequality, nor a proof of Harnack's inequality using the oscillation of a $C^1$ function.
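For reference, a standard route (this appears to be a Courant–Lebesgue-type argument, so the sketch below assumes that framework) estimates the oscillation on each circle by an integral of $|Du|$ along an arc, applies Cauchy–Schwarz there, and then integrates in the radius:

```latex
% Sketch, assuming u \in C^1(B_1). Two boundary points of \partial B_\rho
% realizing the oscillation are joined by an arc \gamma \subset \partial B_\rho
% of length at most \pi\rho, so by the fundamental theorem of calculus
% and Cauchy--Schwarz along \gamma:
\[
\omega(\rho) \le \int_{\gamma} |Du|\,ds
  \le \Big( \pi\rho \int_{\partial B_\rho} |Du|^2\,ds \Big)^{1/2},
\qquad\text{hence}\qquad
\frac{\omega(\rho)^2}{\rho} \le \pi \int_{\partial B_\rho} |Du|^2\,ds .
\]
% Since \omega is increasing, \omega(r) \le \omega(\rho) for r \le \rho \le 1.
% Integrating in \rho from r to 1 and using the coarea formula:
\[
\omega(r)^2 \log\frac{1}{r}
  = \omega(r)^2 \int_r^1 \frac{d\rho}{\rho}
  \le \pi \int_r^1 \int_{\partial B_\rho} |Du|^2\,ds\,d\rho
  \le \pi \int_{B_1} |Du|^2\,dx .
\]
```

Dividing through by $\log(1/r)$ gives exactly the claimed inequality; the monotonicity of $\omega$ is what lets $\omega(r)^2$ be pulled out of the radial integral.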