Magnetic field at points on the circuit


I know that magnetic field lines due to a circuit always form closed loops, so $\nabla \cdot \vec{B}=0$ everywhere (even at points on the circuit). However, because of the singularity, the magnetic field is not defined at points on the circuit itself. How, then, does it make sense to speak of the divergence of the magnetic field at points on the circuit?


$\nabla\cdot\mathbf{B}=0$ is the differential form of Gauss's law for magnetism, and it holds only where $\mathbf{B}$ is differentiable. The differential form fails wherever singularities appear, e.g., where $\mathbf{B}$ is continuous but not differentiable, is discontinuous, or diverges.

By contrast, the integral form of this law, i.e., $$ \oint_{\partial\Omega}\mathbf{B}\cdot{\rm d}\mathbf{S}=0, $$ always holds, as long as $\mathbf{B}$ is well-defined, or at least piecewise well-defined, on $\partial\Omega$.
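As a concrete sanity check (a sketch using `sympy`, with an infinite straight wire along the $z$-axis standing in for the circuit), the flux of the wire's field through the boundary of a cube enclosing the wire vanishes. The wire pierces the top and bottom faces, but the normal component there is $B_z=0$, so $\mathbf{B}\cdot{\rm d}\mathbf{S}$ is well-defined on all six faces, which is all the integral form needs:

```python
import sympy as sp

x, y, z = sp.symbols('x y z', real=True)
mu0, I = sp.symbols('mu0 I', positive=True)

# Field of an infinite straight wire along the z-axis (SI units):
# B = mu0*I/(2*pi*rho) * phi_hat, with rho^2 = x^2 + y^2.
C = mu0 * I / (2 * sp.pi)
Bx = -C * y / (x**2 + y**2)
By = C * x / (x**2 + y**2)
# Bz = 0 identically, so the z = +/-1 faces contribute nothing.

# Outward flux through the boundary of the cube [-1, 1]^3.
flux = (
    sp.integrate(Bx.subs(x, 1), (y, -1, 1), (z, -1, 1))     # face x = +1
    - sp.integrate(Bx.subs(x, -1), (y, -1, 1), (z, -1, 1))  # face x = -1
    + sp.integrate(By.subs(y, 1), (x, -1, 1), (z, -1, 1))   # face y = +1
    - sp.integrate(By.subs(y, -1), (x, -1, 1), (z, -1, 1))  # face y = -1
)
print(sp.simplify(flux))  # 0
```

Each face integral vanishes by symmetry (the integrands are odd), so the total flux is zero even though $\mathbf{B}$ is singular on the axis inside the cube.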

The differential form is obtained from the integral form via the divergence theorem, i.e., $$ \oint_{\partial\Omega}\mathbf{B}\cdot{\rm d}\mathbf{S}=\int_{\Omega}\nabla\cdot\mathbf{B}\,{\rm d}V, $$ for all domains $\Omega$. Since the left-hand side vanishes by the integral form, the arbitrariness of $\Omega$ forces $\nabla\cdot\mathbf{B}=0$. Note that this derivation relies on the divergence theorem, so when employing the differential form one must make sure that $\mathbf{B}$ is regular enough for the theorem to apply. This is why the divergence does not make sense where singularities occur: at those locations the divergence theorem fails, and the differential form can no longer be derived.
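Away from the singularity the differential form can be verified directly. A sketch with `sympy`, again using the infinite-straight-wire field $\mathbf{B}=\frac{\mu_0 I}{2\pi\rho}\hat{\boldsymbol\varphi}$ as a stand-in for a circuit, shows that $\nabla\cdot\mathbf{B}$ simplifies to $0$ at every point where the field is defined, while on the wire ($x=y=0$) the component expressions themselves blow up and the divergence is simply undefined:

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)
mu0, I = sp.symbols('mu0 I', positive=True)

# Infinite-straight-wire field in Cartesian components;
# Bz = 0, so only the x- and y-derivatives enter the divergence.
C = mu0 * I / (2 * sp.pi)
Bx = -C * y / (x**2 + y**2)
By = C * x / (x**2 + y**2)

div_B = sp.simplify(sp.diff(Bx, x) + sp.diff(By, y))
print(div_B)  # 0 -- valid wherever (x, y) != (0, 0); on the wire B is undefined
```

The cross terms $\partial_x B_x$ and $\partial_y B_y$ cancel exactly, which is the pointwise statement that the field lines neither start nor end anywhere off the wire.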