I have an old examination assignment, where I am given the equation of a curve and the coordinates of a point on it. The task is to determine the slope of the curve at the given point.
The equation of the curve is $$ x^2 \sin{\sqrt{y}}+ye^{-2x}=1 $$ and the point is $(0,1)$. The first thing I should do is implicitly differentiate the equation. I am not sure I have done that right; here is my answer:
$$ 2x \cos{\sqrt{y}}\times\frac{1}{2}y^{-\frac{1}{2}}\times y'+y\times y'(-2)e^{-2x}=0$$
Is this right?
You need to use the product rule to differentiate, implicitly, $$x^2 \sin{\sqrt{y}}$$ and $$ye^{-2x}$$
recall: when you have the product of two functions $f(x)g(x)$, $$(f(x)g(x))' = f'(x)g(x) + f(x) g'(x)$$
So, for example, to differentiate $x^2 \sin{\sqrt y}$, we obtain (using the chain rule for the second factor, $\frac{d}{dx}\sin\sqrt{y} = \cos(\sqrt{y})\cdot\frac{y'}{2\sqrt{y}}$):
$$2x\sin{\sqrt{y}}+x^2\cos({\sqrt y})\frac{y'}{2\sqrt{y}}\tag{1}$$
And to differentiate $ye^{-2x}$, we obtain: $$y'e^{-2x} + y(-2)e^{-2x} = y'e^{-2x}- 2ye^{-2x}\tag{2}$$
Now sum $(1)$ and $(2)$, and set the result equal to the derivative of the right-hand side of the original equation, which is $(1)' = 0$. Then solve for $y'$.
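For concreteness (this final evaluation is my own addition, not part of the original question), summing $(1)$ and $(2)$, setting the total to $0$, and then substituting the point $(0,1)$ gives:
$$2x\sin{\sqrt{y}}+x^2\cos({\sqrt y})\frac{y'}{2\sqrt{y}}+y'e^{-2x}-2ye^{-2x}=0$$
At $(0,1)$ the first two terms vanish (each carries a factor of $x$) and $e^{-2\cdot 0}=1$, so the equation reduces to
$$y'-2=0 \implies y'=2$$
so the slope of the curve at $(0,1)$ is $2$.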