Is $\frac{\partial f}{\partial x}$ in any case the reciprocal of $\frac{\partial x}{\partial f}$?


If $y$ is a single-variable function of $x$, we have:

$$\frac{dy}{dx}=\frac{1}{\frac{dx}{dy}}$$ Now coming to partial derivatives, I have tested this for polar coordinates:

$$x=r \cos t$$ $$y=r \sin t$$

We have: $$\frac{\partial x}{\partial r}=\cos t=\frac{x}{\sqrt{x^2+y^2}}$$

Whereas: $$\frac{\partial r}{\partial x}=\frac{\partial \left(\sqrt{x^2+y^2}\right)}{\partial x}=\frac{x}{\sqrt{x^2+y^2}}$$

These are actually the same, not reciprocals of each other.

So is the reciprocal relation not true for partial derivatives, or in which situations does $$\frac{\partial f}{\partial x}=\frac{1}{\frac{\partial x}{\partial f}}$$ hold?
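To make the question concrete, here is a quick finite-difference check (not from the original post; the sample point $(r,t)=(2,\,0.7)$ is an arbitrary illustrative choice). It confirms numerically that $\frac{\partial x}{\partial r}$ and $\frac{\partial r}{\partial x}$ agree, so their product is $\cos^2 t$, not $1$:

```python
import math

# Sample point (r, t) = (2.0, 0.7); x = r*cos(t), y = r*sin(t), r = sqrt(x^2 + y^2).
r, t = 2.0, 0.7
x, y = r * math.cos(t), r * math.sin(t)
h = 1e-6

# dx/dr, holding t fixed (central difference)
dx_dr = ((r + h) * math.cos(t) - (r - h) * math.cos(t)) / (2 * h)

# dr/dx, holding y fixed (central difference)
dr_dx = (math.hypot(x + h, y) - math.hypot(x - h, y)) / (2 * h)

print(dx_dr, dr_dx)    # both approximately cos(0.7) ~ 0.7648
print(dx_dr * dr_dx)   # approximately cos(0.7)**2 ~ 0.585, not 1
```

So at this point the two partials are equal rather than reciprocal, which is exactly the puzzle the answers below resolve.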

There are 2 best solutions below


You have points $P$ in the plane. You can identify such points by means of Cartesian coordinates $(x,y)$, or polar coordinates $(r,\phi)$, or some other coordinates $(u,v)$. In any case, such $r$ is not a function of $x$, nor is $x$ a function of $r$. In reality $r$ depends in a particular way on $x$ and $y$ together, and similarly $x$ depends in a particular way on $r$ and $\phi$ together.

The partial derivative ${\partial r\over\partial x}$ only makes sense when it is agreed that the companion variable to $x$ is the usual $y$, and in this case from $r=\sqrt{x^2+y^2}$ it follows that $${\partial r\over\partial x}={x\over\sqrt{x^2+y^2}}\ .\tag{1}$$ Conversely, the partial derivative ${\partial x\over\partial r}$ only makes sense when it is agreed that the companion variable to $r$ is the polar angle $\phi$, and in this case from $x=r\cos\phi$ it follows that $${\partial x\over\partial r}=\cos\phi\ .\tag{2}$$

It is a "coincidence" that for corresponding pairs $(x,y)$/$(r,\phi)$ the right-hand sides of $(1)$ and $(2)$ have the same value. This "coincidence" can be explained by looking at the Jacobian matrices of $(x,y)\leftrightarrow(r,\phi)$.
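The "coincidence" described above can be seen numerically: build the Jacobian of $(r,\phi)\mapsto(x,y)$ at a sample point, invert it, and compare the $(1,1)$ entries. This is an illustrative sketch (the sample point $(r,\phi)=(1.5,\,0.4)$ is an arbitrary choice, not from the answer):

```python
import math

# Jacobian of (r, phi) -> (x, y) at an arbitrary sample point.
r, phi = 1.5, 0.4
J = [[math.cos(phi), -r * math.sin(phi)],
     [math.sin(phi),  r * math.cos(phi)]]

# Explicit 2x2 inverse; the determinant works out to r.
det = J[0][0] * J[1][1] - J[0][1] * J[1][0]
Jinv = [[ J[1][1] / det, -J[0][1] / det],
        [-J[1][0] / det,  J[0][0] / det]]

# The (1,1) entry of J is dx/dr, the (1,1) entry of Jinv is dr/dx:
# both equal cos(phi), which is why the two partials coincide here.
print(J[0][0], Jinv[0][0])
```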


The result $(f^{-1})'(f(x))=\frac{1}{f'(x)}$ is true if $f$ is a continuously differentiable function with nonzero derivative at $x$, and it is proved using the fact that the derivative is the best linear approximation of the function at any point.
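As a quick sanity check of the single-variable rule (an illustrative sketch, with $f(x)=x^3$ and the point $x=2$ chosen arbitrarily):

```python
import math

# Verify (f^{-1})'(f(x)) = 1 / f'(x) for f(x) = x**3 at x = 2.
x = 2.0
h = 1e-6

# f'(2) by central difference; exact value is 3*x**2 = 12.
fprime = ((x + h) ** 3 - (x - h) ** 3) / (2 * h)

# (f^{-1})'(y) at y = f(2) = 8; f^{-1}(y) = y**(1/3), exact value 1/12.
y = x ** 3
finv_prime = ((y + h) ** (1 / 3) - (y - h) ** (1 / 3)) / (2 * h)

print(fprime * finv_prime)   # approximately 1
```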

An analogous result also holds for functions of several variables, but only if we use the same conceptual definition of the derivative, i.e. the "best linear approximation" of the function, which is the total derivative. It is not true for the individual partial derivatives.

In your case, for the function $(x,y)=(r\cos t, r\sin t)$ the total derivative is the Jacobian matrix: $$ J= \begin{pmatrix} \frac{\partial x}{\partial r} & \frac{\partial x}{\partial t}\\ \frac{\partial y}{\partial r} & \frac{\partial y}{\partial t} \end{pmatrix}= \begin{pmatrix} \cos t & -r\sin t\\ \sin t & r \cos t \end{pmatrix} $$ that (for $r \ne 0$) is invertible, and the inverse matrix is:

$$ J^{-1}= \begin{pmatrix} \cos t & \sin t\\ \frac{-\sin t}{r} & \frac{ \cos t}{r} \end{pmatrix}= \begin{pmatrix} \frac{x}{\sqrt{x^2+y^2}} & \frac{y}{\sqrt{x^2+y^2}}\\ \frac{-y}{x^2+y^2} & \frac{ x}{x^2+y^2} \end{pmatrix} $$ and you can prove that this is the total derivative (i.e. the Jacobian matrix) of the inverse function:

$$ (r,t)=\left(\sqrt{x^2+y^2},\tan^{-1}\left(\frac{y}{x}\right) \right) $$ (the $\tan^{-1}$ formula is valid for $x>0$; on the rest of the plane one uses the appropriate branch of the angle).
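One can check this claim numerically: the finite-difference Jacobian of $(x,y)\mapsto(r,t)$ matches the closed-form entries of $J^{-1}$ given above. A sketch, with an arbitrary sample point and `math.atan2` standing in for the branch-aware angle:

```python
import math

# Sample point away from the origin.
r0, t0 = 1.5, 0.4
x, y = r0 * math.cos(t0), r0 * math.sin(t0)
h = 1e-6

def r_of(x, y): return math.hypot(x, y)
def t_of(x, y): return math.atan2(y, x)   # handles all quadrants

# Numerical Jacobian of (x, y) -> (r, t) by central differences.
num = [[(r_of(x + h, y) - r_of(x - h, y)) / (2 * h),
        (r_of(x, y + h) - r_of(x, y - h)) / (2 * h)],
       [(t_of(x + h, y) - t_of(x - h, y)) / (2 * h),
        (t_of(x, y + h) - t_of(x, y - h)) / (2 * h)]]

# Closed-form entries of J^{-1} in Cartesian variables.
closed = [[x / math.hypot(x, y),  y / math.hypot(x, y)],
          [-y / (x * x + y * y),  x / (x * x + y * y)]]

print(num)
print(closed)   # entries agree to roughly 1e-6
```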

Note that, as a consequence of this result, in this particular example we have $\frac{\partial x}{\partial r}=\frac{\partial r}{\partial x}$: the two partials are equal, not reciprocal.