How to solve a line integral when one of the points is not in the domain?


Consider the vector field on $\mathbb{R}^2\setminus\{(0,0)\}$,

$F = \frac{2x}{\sqrt{(x^2 + y^2)}}\hat{i} + \frac{2y}{\sqrt{(x^2 + y^2)}}\hat{j}$

Compute the integral:

$\int_C F\cdot dr$

for the curve:

$x(t) = t^3 + 1,\quad y(t) = (1 - t^2)e^{2t},\quad t\in[-1, 1]$

So the vector field is conservative on its domain, but it is undefined at $(0,0)$, and this curve passes through $(0,0)$ (at $t=-1$). Does that mean I have to compute the integral by brute force and cannot use the fundamental theorem of line integrals?
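A quick numerical check (a minimal sketch; the function name `r` is just a convenience) confirms that the curve sits exactly at the origin at $t=-1$, which is the only point where $F$ is undefined:

```python
import math

# Curve parameterization from the problem statement.
def r(t):
    return (t**3 + 1, (1 - t**2) * math.exp(2 * t))

# At t = -1: x = (-1)^3 + 1 = 0 and y = (1 - 1)e^{-2} = 0,
# so the curve starts at the origin, where F is not defined.
print(r(-1))  # -> (0, 0.0)
```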

Thanks!

Best answer:

You can consider the limit of the line integral as its starting point approaches $t=-1$. Define $r(t)=(x(t),y(t))$, and let $G(x,y)=2\sqrt{x^2+y^2}$, a potential for $F$ on $\mathbb{R}^2\setminus\{(0,0)\}$ (its gradient is the stated vector field). Then $$\int_CF\cdot dr=\lim_{a\to-1^+}\int_a^1F(r(t))\cdot r'(t)\,dt=\lim_{a\to-1^+}\bigl(G(r(1))-G(r(a))\bigr)=2\sqrt{2^2+0^2}-2\sqrt{0^2+0^2}=4,$$ since $G$, unlike $F$, extends continuously to the origin, so the improper integral converges.
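The limit argument can be checked numerically: integrate $F(r(t))\cdot r'(t)$ with a midpoint rule starting just to the right of $t=-1$ and compare against the claimed value $4$. This is a sketch using only the standard library; the helper names (`F`, `r`, `r_prime`, `line_integral`) are my own, not from the original post.

```python
import math

def F(x, y):
    # The given vector field, defined away from the origin.
    n = math.sqrt(x * x + y * y)
    return (2 * x / n, 2 * y / n)

def r(t):
    # Curve parameterization from the problem.
    return (t**3 + 1, (1 - t**2) * math.exp(2 * t))

def r_prime(t):
    # Derivative of the parameterization:
    # x'(t) = 3t^2,  y'(t) = e^{2t}(-2t + 2(1 - t^2)).
    return (3 * t**2, (-2 * t + 2 * (1 - t**2)) * math.exp(2 * t))

def line_integral(a, b, n=200_000):
    # Midpoint rule on [a, b]; a is kept slightly above -1
    # so F is never evaluated at the origin.
    h = (b - a) / n
    total = 0.0
    for i in range(n):
        t = a + (i + 0.5) * h
        x, y = r(t)
        fx, fy = F(x, y)
        dx, dy = r_prime(t)
        total += (fx * dx + fy * dy) * h
    return total

print(line_integral(-1 + 1e-6, 1.0))  # approximately 4
```

As the cutoff $a$ moves toward $-1$, the numeric value approaches $4$, matching $G(r(1))-G(r(-1))=4-0$.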