I have been programming in SML, and while implementing functions for natural square roots and natural cube roots, I noticed that (real) cube roots seem to have "more" irrational numbers as values over a given interval of inputs than (real) square roots do.
For instance, given the input interval [0,20], we find more irrational cube roots than irrational square roots. I first thought this might be a coincidence, so I repeatedly widened the interval and still found that cube roots produce more irrational values than square roots. (If I could, I would try to prove it inductively, but I don't yet have a good grasp of proof techniques, so I can only rely on experiment here.)
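To make the experiment concrete, here is a minimal SML sketch of the kind of count I performed. The function names (`isPerfectPower`, `countIrrationalRoots`) are just illustrative names I chose, not part of any library:

```sml
(* Test whether n is a perfect k-th power by rounding n^(1/k) and
   checking the rounded value and its neighbours (to guard against
   floating-point error) raised back to the k-th power. *)
fun isPerfectPower k n =
    let
      val r = Real.round (Math.pow (Real.fromInt n, 1.0 / Real.fromInt k))
      fun pow b 0 = 1
        | pow b e = b * pow b (e - 1)
    in
      List.exists (fn c => pow c k = n) [r - 1, r, r + 1]
    end

(* Count the integers n in [lo, hi] whose k-th root is irrational,
   i.e. those that are not perfect k-th powers. *)
fun countIrrationalRoots k (lo, hi) =
    length (List.filter (fn n => not (isPerfectPower k n))
                        (List.tabulate (hi - lo + 1, fn i => lo + i)))
```

On [0,20] the perfect squares are 0, 1, 4, 9, 16 (leaving 16 irrational square roots), while the perfect cubes are only 0, 1, 8 (leaving 18 irrational cube roots), which matches what I observed.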
I was wondering whether there is a good explanation of why this is the case. I tried wrapping my head around it but couldn't really find one. My first (admittedly heuristic) thought was that perfect cubes are much rarer than perfect squares, so fewer inputs in a given interval have rational cube roots.