Finding the derivative of a function at an indicated point


I am having some trouble with some questions I am solving. I know they are simple; I used to solve such problems without proof back in high school. But now that I am studying mathematics, I need to justify every step.

Question is: Use the definition of a derivative to calculate the derivative of $f(x)=x^2\cos x$ at $x=0$.

I know that $f'(x)=2x\cos x-x^2\sin x$, so the answer should be $f'(0)=0$; I just can't prove it.

I know I have to use $f'(0)=\lim_{x\to 0}\frac{f(x)-f(0)}{x-0}$, since that is the definition. Doing that, I get $f'(0)=\lim_{x\to 0}\frac{x^2\cos x-0}{x}$.

How do I go on with this question?

There are 5 answers below.

BEST ANSWER

You are on the right track. You just missed the limit part.

When you are calculating the derivative of a function $f(x)$, you don't just calculate $f'(x)|_{x=x_0}=\frac{f(x_0+h)-f(x_0)}{(x_0+h)-(x_0)}$. Rather, you have to compute the following limit:

$$f'(x)|_{x=x_0}=\color{red}{\lim_{h\to 0}}\frac{f(x_0+h)-f(x_0)}{(x_0+h)-(x_0)}$$

In this case, $f(x)=x^2 \cos x$

Hence, the derivative will be $$f'(x)|_{x=0}=\lim_{h\to 0}\frac{f(0+h)-f(0)}{(0+h)-0}$$ $$=\lim_{h\to 0}\frac{h^2 \cos h - 0}{h}$$ $$=\lim_{h\to 0}(h \cos h)$$ $$=\left(\lim_{h\to 0}h\right) \cdot \left(\lim_{h\to 0}\cos h \right)$$ $$=0 \times 1 = 0$$

This follows directly from the definition of the derivative.

Hope this helps.
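As a numerical sanity check (an illustrative sketch, not part of the original answer), the difference quotient $\frac{f(h)-f(0)}{h}=h\cos h$ can be evaluated for shrinking $h$ and watched as it approaches $0$:

```python
import math

def f(x):
    """f(x) = x^2 * cos(x), the function from the question."""
    return x**2 * math.cos(x)

# Difference quotient at 0: (f(0 + h) - f(0)) / h = h * cos(h).
for h in [0.1, 0.01, 0.001, 1e-6]:
    print(h, (f(h) - f(0)) / h)
```

The printed quotients shrink toward $0$, matching the limit computed above.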

ANSWER

This is $$\frac{f(x)-f(0)}{x-0}=\frac{f(x)}{x}=x\cos(x),$$ and note that $$|x\cos(x)|\le |x|,$$ which tends to zero as $x$ tends to zero. Hence $f'(0)=0$ by the squeeze theorem.

ANSWER

Note that at the point $x=0$

$$f'(0)=\lim_{x\to 0}\frac{x^2\cos x-0}{x-0}$$

and, more generally, at the point $x=x_0$

$$f'(x_0)=\lim_{x\to x_0}\frac{x^2\cos x-x_0^2\cos x_0}{x-x_0}$$
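To illustrate the general formula, here is a small numerical sketch (my own addition; the sample point $x_0=1$ and step $h=10^{-7}$ are arbitrary choices) comparing the difference quotient with the known derivative $2x_0\cos x_0 - x_0^2\sin x_0$:

```python
import math

def f(x):
    """f(x) = x^2 * cos(x), the function from the question."""
    return x**2 * math.cos(x)

x0 = 1.0   # arbitrary sample point (assumption for illustration)
h = 1e-7   # small step toward the limit x -> x0

# General difference quotient (f(x) - f(x0)) / (x - x0) with x = x0 + h.
approx = (f(x0 + h) - f(x0)) / h
# Derivative obtained by the product rule: 2x cos x - x^2 sin x.
exact = 2 * x0 * math.cos(x0) - x0**2 * math.sin(x0)
print(approx, exact)
```

The two printed values agree to several decimal places, as the limit formula predicts.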

ANSWER

Hints

We know the product rule $$\frac {d}{dx} \bigl( h(x)\cdot g(x)\bigr)=h'(x)\,g(x)+h(x)\,g'(x).$$ Hence, with $h(x)=x^2$ and $g(x)=\cos x$,

we get $$f'(x)=2x\cos x-x^2\sin x$$

Because $$h'(x)=2x$$ and $$g'(x)=-\sin x$$

Hence $f'(0)=0$.

ANSWER

If I interpret your question correctly, you want to prove that
for two functions $f$ and $g$, $(f\times g)' = [f\times(g')+g\times(f')].$
The following proof is taken verbatim from Calculus, 2nd Ed., vol 1, 1966, by Tom Apostol.

$$f'(x) = \lim_{h\rightarrow 0} \frac{f(x+h)-f(x)}{h}.$$ Therefore, $$[f(x)\times g(x)]' = \lim_{h\rightarrow 0} \frac{f(x+h)\times g(x+h) -f(x)\times g(x)}{h}$$ Adding and subtracting $f(x+h)\times g(x)$ in the numerator, this becomes

$$ =\;\;\lim_{h\rightarrow 0}\left\{ g(x)\times \frac{f(x+h) - f(x)}{h} + f(x+h)\times\frac{g(x+h) - g(x)}{h}\right\}.$$

Since $f'$ is presumed to exist at $x$, $f$ is continuous at $x$. Therefore, as $h\rightarrow 0$, $f(x+h)\rightarrow f(x)$. Hence the above limit may be re-expressed as

$$ =\;\;\lim_{h\rightarrow 0}\left\{ g(x)\times \frac{f(x+h) - f(x)}{h} + f(x)\times\frac{g(x+h) - g(x)}{h}\right\}$$ $$ = g(x)\times f'(x) + f(x)\times g'(x).$$
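As a quick numerical check of the product rule itself (an illustrative sketch; the choice of $f=\sin$, $g=\exp$ and the point $x=0.7$ are mine, not Apostol's):

```python
import math

x, h = 0.7, 1e-7  # arbitrary point and small step (assumptions)

# Forward-difference approximation of (f*g)'(x) with f = sin, g = exp.
numeric = (math.sin(x + h) * math.exp(x + h)
           - math.sin(x) * math.exp(x)) / h
# Product rule: f'(x)g(x) + f(x)g'(x) = cos(x)e^x + sin(x)e^x.
exact = math.cos(x) * math.exp(x) + math.sin(x) * math.exp(x)
print(numeric, exact)
```

The two values agree closely, consistent with the identity just proved.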