I am trying to understand the Shannon-Khinchin uniqueness theorem, which states that entropy is the unique function (up to a positive constant) that can measure uncertainty. What follows is my understanding, and it may not be correct; if you think it is wrong, please tell me, because that would help a lot :)
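To be concrete, the statement I have in mind is roughly the following (please correct me if I have the axioms wrong). If $H(p_1,\dots,p_n)$ is a function of a probability distribution that is (1) continuous in the $p_i$, (2) maximized by the uniform distribution $p_i = 1/n$, (3) unchanged when an outcome of probability $0$ is added, and (4) strongly additive, i.e. $H(AB) = H(A) + \sum_i p_i\, H(B \mid A = a_i)$, then it must be the Shannon entropy up to a positive constant:

$$H(p_1,\dots,p_n) = -k \sum_{i=1}^{n} p_i \log p_i, \qquad k > 0.$$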
However, I also understand that entropy was constructed in the first place to measure uncertainty, since it is the mean value of a set of individual uncertainties. So if it was created specifically to measure uncertainty, why do we need to prove that it is the unique function that measures it? If there were plenty of candidate functions to choose from, the existence of such a theorem would be understandable, but otherwise it is hard to see why it exists. I suspect I am missing something about why we need such a theorem, and I would really like to hear what the point of it is.