Why are the formulas for negative entropy and computational complexity so close?


The negative entropy of a vector $n = (n_1, \dots, n_k)$ with $k$ components is $\sum_{i=1}^{k} n_i \log n_i$, with $n_i \log n_i$ defined as $0$ for $n_i = 0$. Similarly, the computational complexity of algorithms that repeatedly divide an input of size $n$ by a constant, doing linear work at each level of recursion, is $\mathcal{O}(n\log n)$. Is there a relation, or is it a coincidence?
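To make the two quantities concrete, here is a small sketch (the function names `negative_entropy` and `divide_and_conquer_cost` are my own, and the cost recurrence $T(n) = 2T(n/2) + n$ is one standard model of an algorithm that halves its input with linear work per level):

```python
import math

def negative_entropy(p):
    """Sum of p_i * log(p_i) over the components of the vector p,
    using the convention that 0 * log(0) = 0."""
    return sum(x * math.log(x) for x in p if x > 0)

def divide_and_conquer_cost(n):
    """Solve the recurrence T(n) = 2*T(n/2) + n (e.g. merge sort's
    comparison-style cost), which works out to Theta(n log n)."""
    if n <= 1:
        return 0
    return 2 * divide_and_conquer_cost(n // 2) + n

# Uniform vector: entropy term is k * (1/k) * log(1/k) = -log(k)
print(negative_entropy([0.25, 0.25, 0.25, 0.25]))  # ≈ -1.386 = -log(4)

# Power of two: T(2^k) = k * 2^k, i.e. exactly n * log2(n)
print(divide_and_conquer_cost(1024))  # 10240 = 1024 * log2(1024)
```

Both expressions grow like $n \log n$, which is what the question is asking about.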