We all know this rule:
$\text{If } y = a^{f(x)} \text{ then } y' = a^{f(x)} \: f'(x)\ln a$
In my book there is the example:
Find $\frac{d}{dx}\left((x^{2} + 1)^{\sin x}\right)$
According to the rule, my answer is:
$(x^{2} + 1)^{\sin x} \cdot \cos x \cdot \ln(x^{2} + 1)$
But the book's answer was:
$$(x^{2} + 1)^{\sin x}\left(\frac{2x \sin x}{x^{2} + 1} + \cos x \cdot \ln(x^{2} + 1)\right)$$
So, where did $\frac{2x \sin x}{x^{2} + 1}$ come from?
I will assume that this is not homework and go ahead to give a full solution.
As others have said, your rule only works for constant $a$. Following the hint in danielson's answer and setting $y=f(x)$, you have $$\log y=\sin x\log\left(1+x^2\right)$$ and differentiating both sides with respect to $x$ and using the chain and product rules gives $$\frac1y\frac{dy}{dx}=\cos x\log\left(1+x^2\right)+\frac{2x\sin x}{1+x^2}\quad\text{so that}\quad \frac{dy}{dx}=y\left(\cos x\log\left(1+x^2\right)+\frac{2x\sin x}{1+x^2}\right).$$Substituting back for $y$ gives the result.
Tell me if you don't understand any of the applications of the chain rule.