I've been thinking about this for a while, and I'm pretty sure I've seen most of the textbook explanations for it that are out there. I've found that the explanations are pretty much all the same; they all rest on the same principle, the most basic form of which is:
$a^\frac{1}{2} \times a^\frac{1}{2}= a^{\frac{1}{2}+\frac{1}{2}}=a^1$
and then they go on to say, "since the number $a^\frac{1}{2}$, multiplied by itself, produces the number $a$, it must be a square root of $a$". But then, without further justification, they write:
$\sqrt{a}=a^\frac{1}{2}$
which, to me, doesn't make much sense. I agree with the conclusion that $a^\frac{1}{2}$ is a square root of $a$, but we don't really know which one. How can we be so sure that it's not actually $-\sqrt{a}$, which in its own right is one of $a$'s square roots? Is this simply a convention, a rule? Or is there something that I'm missing?
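To make the ambiguity concrete: for $a > 0$, both candidates pass exactly the test used in the textbook argument, since each one multiplied by itself gives $a$:

$\sqrt{a} \times \sqrt{a} = a, \qquad \left(-\sqrt{a}\right) \times \left(-\sqrt{a}\right) = (-1)^2 \left(\sqrt{a}\right)^2 = a$

So the exponent-law argument on its own only shows that $a^\frac{1}{2}$ is *some* square root of $a$; nothing in it singles out the positive one.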