Calculating the divergence of an electric field in standard coordinates


Given an electric field $$ \vec{E}(\vec r) = \frac{c}{r^2}\,\hat{r}, $$ I want to show that $ \nabla \cdot \vec{E} = 0 $ for $ r \ne 0 $, doing the calculation in standard (Cartesian) coordinates. For simplicity I'll treat the two-dimensional problem, i.e. $ r^2 = x^2 + y^2 $.

Using $ \hat r = (x\,\hat x + y\,\hat y)/r $, I have
$$ \vec{E}(\vec r) = \frac{c}{r^2}\,\hat r = \frac{cx}{r^3}\,\hat x + \frac{cy}{r^3}\,\hat y = cx\,(x^2+y^2)^{-3/2}\,\hat x + cy\,(x^2+y^2)^{-3/2}\,\hat y. $$

But when I actually carry out the calculation, I get

$$ \nabla \cdot \vec{E} \ne 0. $$
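
Explicitly, differentiating term by term:

$$ \frac{\partial}{\partial x}\left[\frac{cx}{r^3}\right] = \frac{c}{r^3} - \frac{3cx^2}{r^5}, \qquad \frac{\partial}{\partial y}\left[\frac{cy}{r^3}\right] = \frac{c}{r^3} - \frac{3cy^2}{r^5}, $$

so

$$ \nabla \cdot \vec{E} = \frac{2c}{r^3} - \frac{3c(x^2+y^2)}{r^5} = \frac{2c}{r^3} - \frac{3c}{r^3} = -\frac{c}{r^3} \ne 0. $$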

Can someone help me understand what I am doing wrong?


1 Answer

Best answer

The inverse-square law comes about because in three dimensions the surface area of a sphere grows as the square of the radius. If you repeat the calculation in three dimensions, with $r^2 = x^2 + y^2 + z^2$, it works out. In two dimensions the circumference of a circle grows only linearly with the radius, so the field has to fall off as $\frac{1}{r}$ to have zero divergence.
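
Concretely, in three dimensions each coordinate contributes a term $\frac{c}{r^3} - \frac{3cx_i^2}{r^5}$, so

$$ \nabla \cdot \vec{E} = \frac{3c}{r^3} - \frac{3c(x^2+y^2+z^2)}{r^5} = \frac{3c}{r^3} - \frac{3c}{r^3} = 0, \qquad r \ne 0, $$

while the same bookkeeping applied to the two-dimensional field $\vec F = \frac{c}{r}\,\hat r = \frac{cx}{r^2}\,\hat x + \frac{cy}{r^2}\,\hat y$ gives $\nabla \cdot \vec F = \frac{2c}{r^2} - \frac{2c(x^2+y^2)}{r^4} = 0$.

If you want a quick sanity check, a few lines of SymPy reproduce all three divergences (a sketch; the symbol names and layout are mine, not from the question):

```python
import sympy as sp

x, y, z, c = sp.symbols('x y z c', positive=True)

# 2D inverse-square field: E = (c/r^2) r_hat with r^2 = x^2 + y^2
r2 = sp.sqrt(x**2 + y**2)
div_2d = sp.simplify(sp.diff(c*x/r2**3, x) + sp.diff(c*y/r2**3, y))
print(div_2d)  # -c/(x**2 + y**2)**(3/2), i.e. -c/r^3: not zero

# 3D inverse-square field: same law with r^2 = x^2 + y^2 + z^2
r3 = sp.sqrt(x**2 + y**2 + z**2)
div_3d = sp.simplify(sp.diff(c*x/r3**3, x)
                     + sp.diff(c*y/r3**3, y)
                     + sp.diff(c*z/r3**3, z))
print(div_3d)  # 0

# 2D field falling off as 1/r: its divergence vanishes, as claimed
div_2d_1r = sp.simplify(sp.diff(c*x/r2**2, x) + sp.diff(c*y/r2**2, y))
print(div_2d_1r)  # 0
```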