Implicit Differentiation Validity


I am interested in a rigorous understanding of how to find the slope, at a given point, of a curve defined implicitly by a relation in two variables.

So far, I have only found answers as deep as "it's an application of the chain rule".

While this may work in practice, I am not convinced it is a rigorous justification by itself; I suspect a deeper justification is needed.

Taking the standard definition of the derivative, $\frac{d}{dx}f(x) = \lim_{h \to 0}\frac{f(x+h) - f(x)}{h}$, I find myself unable to apply it directly to implicit relations, given the inability to express the relation as a function $f(x)$.

I don't doubt that the idea of differentiation is much the same in the implicit and explicit cases, but mechanically, given the definition available, I think more work is needed.

My current efforts revolve around viewing the implicit relation $R(x, y) = 0$ as a level set of an explicit function of two variables, $z = f(x,y)$, and then trying to reason about the relation between $\frac{\partial}{\partial x}f(x,y)$ and $\frac{\partial}{\partial y}f(x,y)$ under the constraint $f(x,y) = z_0$.

Both intuition and a rigorous $\epsilon$-$\delta$ explanation are welcome.

There are 3 best solutions below


For intuition, consider the function $z = f(x,y)$. Its total differential is $dz = \frac{\partial z}{\partial x}\,dx + \frac{\partial z}{\partial y}\,dy$. Along the curve, $z$ is constant, so $dz = 0$; rearranging gives $\frac{dy}{dx} = -\frac{z_x}{z_y}$ (writing $z_x = \partial z/\partial x$ and $z_y = \partial z/\partial y$), which is generally how we use multivariable calculus to justify implicit differentiation. If you learn the implicit function theorem, the result also follows as a special case, though the proof behind that theorem is a little involved.
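To see the formula in action, here is a quick numerical sanity check (my own illustrative example, not part of the answer above): for $z = x^2 + y^2 - 1$ on the upper unit semicircle we can solve for $y$ explicitly, so the formula $\frac{dy}{dx} = -z_x/z_y = -x/y$ can be compared against a finite-difference slope of the explicit solution.

```python
import math

# Illustrative check: z = F(x, y) = x^2 + y^2 - 1 (the unit circle).
# On the upper semicircle we can solve explicitly: y = sqrt(1 - x^2),
# so we can compare the implicit-differentiation formula
#   dy/dx = -F_x / F_y = -(2x)/(2y) = -x/y
# against a central finite-difference slope of the explicit solution.

def y_explicit(x):
    return math.sqrt(1 - x * x)

x0 = 0.6
y0 = y_explicit(x0)          # 0.8

slope_formula = -x0 / y0     # -F_x/F_y at (0.6, 0.8), i.e. -0.75

h = 1e-6
slope_numeric = (y_explicit(x0 + h) - y_explicit(x0 - h)) / (2 * h)

print(slope_formula, slope_numeric)  # both approximately -0.75
```

The two slopes agree to several decimal places, as the total-differential argument predicts.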


Actually, it's a special case of the implicit function theorem. In this setting the theorem says that if $R(x,y)$ is continuously differentiable in a neighbourhood of $(x_0, y_0)$, with $R(x_0, y_0) = 0$ and $\frac{\partial R}{\partial y}(x_0, y_0) \ne 0$, then there is a differentiable function $g$ defined on a neighbourhood of $x_0$ such that $g(x_0) = y_0$, $R(x, g(x)) = 0$, and $$g'(x) = - \frac{\frac{\partial R}{\partial x}(x, g(x))}{\frac{\partial R}{\partial y}(x, g(x))}.$$
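As a concrete instance (my own illustrative example, not part of the theorem statement): take $R(x,y) = x^2 + y^2 - 25$ at $(x_0, y_0) = (3, 4)$. Then $R(3,4) = 0$ and $\frac{\partial R}{\partial y}(3,4) = 8 \ne 0$, so the theorem applies and the formula gives

```latex
% Illustrative example: R(x, y) = x^2 + y^2 - 25 at the point (3, 4).
\[
  g'(x) \;=\; -\frac{\partial R/\partial x}{\partial R/\partial y}
        \;=\; -\frac{2x}{2y}
        \;=\; -\frac{x}{y},
  \qquad\text{so}\qquad
  g'(3) \;=\; -\frac{3}{4},
\]
```

matching the slope of the tangent line to the circle of radius $5$ at $(3,4)$.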


This is not an entirely new answer, but meant to complement the existing ones by @Allawonder and @Robert Israel.

As @Allawonder's answer already explained, if by "$y$" we mean nothing more than an alternative notation for a function $f(x)$ we have in mind (e.g., when we are solving a differential equation), then "implicit differentiation" holds by the fully rigorous machinery of real analysis: since $y = f(x)$, $F(x,y)$ is just some function $g$ of $x$ that is identically equal to the constant $0$, so $g'(x)$ must also equal $0$, which gives you an expression for $f'(x)$.

However, the subtlety is that sometimes we are simply given an expression $F(x,y)=0$ without knowing whether such a function $f$ even exists (let alone its derivative). A common example is the equation of the circle, $x^2 + y^2 - 1 = 0$: at, say, $x = 1$, it doesn't even make sense to write $f'(x)$. This is where we need the Implicit Function Theorem, which guarantees that if $F$ is "nice enough" (continuously differentiable, with $\frac{\partial F}{\partial y} \ne 0$ at the point in question), then such a function $y = f(x)$ exists and is differentiable, at least locally; in that case its derivative is given by the usual "implicit differentiation" result.
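A small sketch of this distinction (illustrative, with my own function names): for $F(x,y) = x^2 + y^2 - 1$, the theorem's hypothesis $F_y(x_0, y_0) \ne 0$ holds at $(0.6, 0.8)$ but fails at $(1, 0)$, which is exactly the point where no local function $y = f(x)$ (and hence no $f'(1)$) exists.

```python
# Illustrative sketch: F(x, y) = x^2 + y^2 - 1 (the unit circle).
# The Implicit Function Theorem requires F_y(x0, y0) != 0 at the point.

def F(x, y):
    return x * x + y * y - 1

def F_y(x, y):          # partial derivative of F with respect to y
    return 2 * y

# At (0.6, 0.8): the point is on the circle and F_y != 0, so y = f(x)
# exists locally, with f'(x) = -F_x/F_y = -x/y.
print(F(0.6, 0.8), F_y(0.6, 0.8))   # approximately 0.0 and 1.6

# At (1, 0): still on the circle, but F_y = 0 -- the tangent is
# vertical and no local f (hence no f'(1)) exists there.
print(F(1.0, 0.0), F_y(1.0, 0.0))   # 0.0 and 0.0
```

So the theorem's nondegeneracy hypothesis is precisely what separates the two cases discussed above.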