If the number of messages in the set is finite then this number or any monotonic function of this number can be regarded as a measure of the information produced when one message is chosen from the set, all choices being equally likely.
In the Korean translation I am reading, the sentence is rendered as follows:

만약 전체 집합 중 메시지의 수가 유한하고 모든 선택지들이 동등한 확률을 가지는 경우, 어떤 메시지가 집합에서 선택되었을 때의 숫자 혹은 이 숫자의 단조함수(monotonic function)를 산출된 정보 측정치라고 간주할 수 있다.

Translated back into English: if the number of messages in the whole set is finite and all choices are equally likely, then this number, or a monotonic function of this number, can be regarded as a measure of the information produced when a message is chosen from the set.
But I don't understand what "this number" refers to in this context.
At first, I took "this number" to be the cardinality of the set containing all possible messages.
But how can a function that takes the cardinality of a set as its free variable ($x$) serve as a measure of information?
Thank you.
Your original idea was correct. With $n$ equiprobable options, some increasing function of $n$ quantifies how much we learn from settling which option is correct. The common choice is $\log_2 n$, which counts the number of bits needed to specify the result, although it is sometimes more convenient to use natural logarithms instead.
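To make this concrete, here is a small sketch (the function name `information_bits` is just an illustrative choice) showing that with $n$ equiprobable messages, identifying one takes $\log_2 n$ bits:

```python
import math

def information_bits(n: int) -> float:
    """Information (in bits) gained by selecting one of n equally likely messages."""
    return math.log2(n)

# With 8 equiprobable messages, pinning down the chosen one takes
# log2(8) = 3 bits -- e.g. three yes/no questions, each halving the candidates.
print(information_bits(8))   # → 3.0
print(information_bits(16))  # → 4.0
```

Note that the function's argument is exactly the cardinality of the message set, which is why your reading of "this number" works.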
If outcomes aren't equiprobable, we can take the information learned from a probability-$p$ outcome as $-\log p$, generalising the original case $p=1/n$; but since $p$ varies by option, we'd average over such results viz. the Shannon entropy $-\sum_ip_i\log p_i$.