I know the derivative of $x^x$ is $x^x(1+\log x)$. So that must mean we cannot differentiate it the way we do, say, $x^5$.
The explanation I read for why this is so is that the proof of the formula $\frac{d}{dx}x^n = nx^{n-1}$ (where $n$ is a constant) involves the binomial theorem.
But when I read the proof, I couldn't identify why there'd be any difference in applying that theorem to $x^x$.
So, I want to ask: why can't we use the binomial theorem to justify the same formula for $x^x$ or $a^x$?
EDIT: I am explicitly asking why the 'binomial theorem proof' doesn't act as a proof for $x^x$, $a^x$ and so on. I appreciate the points given in the existing answers, but I want a direct answer to my actual question. Thank you.
Learning formulas by heart is OK... when they work. When they do not, and we wonder why, we must go back to the formal definition: $$f'(a)=\lim_{h\to 0}\frac{f(a+h)-f(a)}{h}$$ So, when $f(x)=x^n$, we write: $$f'(a)=\lim_{h\to 0}\frac{(a+h)^n-a^n}{h}=\cdots$$ and some tricks allow us to simplify this. But note that in the subtraction, both terms have the same constant exponent $n$.
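To make the role of the binomial theorem explicit, here is a sketch of the standard computation for a constant positive integer $n$ (the general real-exponent case needs a different argument, but the same idea applies):

```latex
\frac{(a+h)^n-a^n}{h}
  = \frac{1}{h}\left(\sum_{k=0}^{n}\binom{n}{k}a^{\,n-k}h^{k}-a^{n}\right)
  = \binom{n}{1}a^{\,n-1}+\binom{n}{2}a^{\,n-2}h+\cdots+h^{\,n-1}
```

Every term after the first carries a factor of $h$, so the limit as $h\to 0$ is $n\,a^{n-1}$. Notice that the expansion is only possible because $n$ is a fixed number appearing identically in both $(a+h)^n$ and $a^n$.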
Now, if $f(x)=x^x$, write: $$f'(a)=\lim_{h\to 0}\frac{(a+h)^{a+h}-a^a}{h}$$ and there is no direct trick to compute this easily: the binomial theorem expands a power with a *fixed* exponent, while here the exponent $a+h$ itself varies with $h$ and differs from the exponent $a$ of the term being subtracted. So you must resort to the logarithm (which is indeed related to the definition of $x^x$) and the chain rule.
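For completeness, the logarithmic route goes like this: writing $x^x=e^{x\ln x}$ (which is how $x^x$ is defined for $x>0$) and applying the chain and product rules gives

```latex
f(x) = x^{x} = e^{x\ln x}, \qquad
f'(x) = e^{x\ln x}\,\frac{d}{dx}\bigl(x\ln x\bigr)
      = x^{x}\bigl(\ln x + 1\bigr),
```

which recovers the formula quoted in the question.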