I'm new to this field. To my knowledge, entropy is defined so that the higher the value, the more information or uncertainty the distribution contains. But I also noticed that the uniform distribution maximizes entropy, which does not seem especially uncertain to me. I have also read, for example, in Ramanathan Gayathri & Enchakudiyil Ibrahim Abdul Sathar, 2021, "On past geometric vitality function of order statistics", that
"From Table 1, it can be observed that in some situations PGVF results in smaller values compared to the past entropy and DCPE measures. Hence one may conclude that PGVF holds more information than some of the existing reliability measures."
So could anyone give me an idea of why the author can say that when comparing different entropy measures?
A uniform distribution is maximally uncertain - you have no indication that any one outcome is more likely than any other.
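To make that concrete, here is a minimal Python sketch (the skewed probability vectors are invented purely for illustration) that computes the Shannon entropy of a uniform and two increasingly concentrated four-outcome distributions:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits; zero-probability outcomes are ignored."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.00 bits: the maximum for 4 outcomes
print(shannon_entropy([0.70, 0.10, 0.10, 0.10]))  # ~1.36 bits
print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))  # ~0.24 bits
```

The more concentrated the distribution, the lower the entropy; the uniform case sits at the maximum of log2(4) = 2 bits.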
As for thinking of it in terms of information: I don't think of a distribution as "containing" information; I think of entropy as the amount of information you gain, on average, upon being told one outcome.
As an example, the distribution of colors of polar bears is very non-uniform, and "the polar bear is white" doesn't tell you much. But being told "the car we're looking for is white" is more likely to help you find the right one.
You might reply, "Well yes, but if I found out the polar bear was brown, I'd have a whole lot more information." The reply to that is "Yes, but that doesn't happen very often": the information gain is averaged over all possibilities, weighted by how often they happen.
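A quick numeric sketch of that averaging (the 0.99/0.01 color split is made up for illustration, not real polar-bear data):

```python
import numpy as np

# Hypothetical polar-bear color distribution.
p = {"white": 0.99, "brown": 0.01}

# Surprisal of each individual outcome, in bits: -log2(p(x)).
for color, prob in p.items():
    print(color, -np.log2(prob))
# white: ~0.014 bits, so learning "the bear is white" tells you almost nothing
# brown: ~6.64 bits, so learning "the bear is brown" is a big surprise

# Entropy is the surprisal averaged over outcomes, weighted by probability:
H = sum(-prob * np.log2(prob) for prob in p.values())
print(H)  # ~0.081 bits on average
```

The rare brown outcome carries a lot of information, but its tiny weight keeps the average, i.e. the entropy, low.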