Evaluating $\frac{d(\sin x )}{d(\cos x)}$ in a strange way.


I am aware this is easy to solve; I have already evaluated it in several ways (dividing both differentials by $dx$, writing one function in terms of the other, geometrically, etc.).

I am having trouble with the following method. I made two assumptions, and I suspect that one of them (or both) is wrong, or that there is some other problem.

The Method:

Assumption 1: $df=f(x+dx)-f(x)$

Assumption 2: When $dx$ is very small, as it is here, $\sin (dx)=dx$ and $\cos (dx)= 1-dx$.
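As a numerical sketch of Assumption 2 (a hypothetical Python snippet, not part of the original question; the variable names are mine), one can compare $\cos(dx)$ at a small $dx$ against both $1-dx$ and the quadratic Taylor approximation:

```python
import math

# Compare cos(dx) against two candidate approximations for a small increment dx.
dx = 1e-4
exact = math.cos(dx)

# Assumption 2 claims cos(dx) ~ 1 - dx ...
err_linear = abs(exact - (1 - dx))        # error is of the same order as dx itself
# ... while the Taylor series gives cos(dx) = 1 - dx**2/2 + O(dx**4).
err_taylor = abs(exact - (1 - dx**2 / 2))

print(err_linear)   # roughly 1e-4
print(err_taylor)   # many orders of magnitude smaller
```

The linear error being as large as $dx$ itself is a first hint of where the trouble lies.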

Using Assumption 1: $$D=\dfrac{d(\sin x )}{d(\cos x)}=\dfrac{\sin (x+dx)-\sin x}{\cos (x+dx)-\cos x} = \dfrac{\sin x\cos dx+ \sin dx\cos x-\sin x}{\cos x\cos dx-\sin x\sin \,dx-\cos x}$$

Then using Assumption 2:

$$D=\dfrac{\sin x(1-dx)+ dx\cos x-\sin x}{\cos x(1-dx)-\sin x \, dx-\cos x} =\dfrac{dx\,(\cos x- \sin x)}{-dx\,(\cos x +\sin x)}=\dfrac{\sin x- \cos x}{\sin x +\cos x}$$

But this is not equal to $-\cot x$, which is what the other, simpler methods give.
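A quick numerical comparison (a Python sketch with an arbitrarily chosen sample point; not part of the original post) confirms the disagreement, and also estimates the true derivative directly as a difference quotient:

```python
import math

# Sample point (chosen arbitrarily).
x = 0.7

# The value produced by the derivation above.
claimed = (math.sin(x) - math.cos(x)) / (math.sin(x) + math.cos(x))

# The expected answer, -cot x.
correct = -math.cos(x) / math.sin(x)

# Approximate d(sin x)/d(cos x) directly with a small finite increment h.
h = 1e-7
numeric = (math.sin(x + h) - math.sin(x)) / (math.cos(x + h) - math.cos(x))

print(claimed, correct, numeric)  # numeric agrees with correct, not with claimed
```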

So where is the mistake? Thank you in advance.


Accepted Answer:

The problem is that, to first order, $\cos dx = 1$, not $1-dx$. These approximations follow from the Taylor series expansions: $$ \sin x = \sum_{n=0}^\infty \frac{(-1)^n}{(2n+1)!}x^{2n+1} = x + O(x^3), $$ $$ \cos x = \sum_{n=0}^\infty \frac{(-1)^n}{(2n)!}x^{2n} = 1 + O(x^2). $$

If $x$ is very small, the $O(x^2)$ and $O(x^3)$ terms become very small and can be neglected, leading to $\sin x \approx x$ and $\cos x \approx 1$.
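These error orders can be checked empirically (a Python sketch; the variable names are mine): shrinking $x$ tenfold should shrink the $\sin$ error about $1000\times$ and the $\cos$ error about $100\times$.

```python
import math

# Empirically check the error orders of the small-angle approximations:
# sin x ~ x has error ~ x**3 / 6, and cos x ~ 1 has error ~ x**2 / 2.
sin_errs = []
cos_errs = []
for x in (1e-1, 1e-2, 1e-3):
    sin_errs.append(abs(math.sin(x) - x))   # ~ x**3 / 6
    cos_errs.append(abs(math.cos(x) - 1))   # ~ x**2 / 2

print(sin_errs)
print(cos_errs)
```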

If you redo the derivation with $\cos dx = 1$, you will indeed get $-\cot x$.
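Explicitly, repeating the asker's computation with the corrected approximations $\sin dx \approx dx$ and $\cos dx \approx 1$:

$$D=\dfrac{\sin x\cdot 1+ dx\cos x-\sin x}{\cos x\cdot 1-\sin x\, dx-\cos x}=\dfrac{dx\cos x}{-dx\,\sin x}=-\cot x.$$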