Calculate the information amount (entropy) of a sentence


How should I calculate the entropy of a sentence of symbols, for example "this is a string"? Should I use Shannon's formula, taking the probability of each symbol in the sentence from a table of English-language symbol frequencies (which is a rather long calculation), or should I instead compute the entropy of English with Hartley's formula, log2(27), and multiply the result by the length of the sentence?
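Both approaches from the question can be sketched in a few lines. This is a minimal illustration, not a definitive answer: `shannon_entropy` here uses the *empirical* symbol frequencies of the string itself (a third option implicit in the question), while the Hartley estimate assumes all 27 symbols (26 letters plus space) are equiprobable. Using a published English letter-frequency table instead would just mean swapping the `Counter`-based probabilities for values looked up in that table.

```python
import math
from collections import Counter

def shannon_entropy(text: str) -> float:
    """Shannon entropy in bits per symbol, using the string's own
    empirical symbol frequencies: H = -sum(p_i * log2(p_i))."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def hartley_entropy(alphabet_size: int = 27) -> float:
    """Hartley entropy in bits per symbol: log2(N), which assumes
    every one of the N symbols is equally likely."""
    return math.log2(alphabet_size)

s = "this is a string"
per_symbol = shannon_entropy(s)
print(f"Shannon (empirical): {per_symbol:.3f} bits/symbol, "
      f"{per_symbol * len(s):.1f} bits total")
print(f"Hartley upper bound: {hartley_entropy():.3f} bits/symbol, "
      f"{hartley_entropy() * len(s):.1f} bits total")
```

The Hartley figure is always an upper bound: log2(27) ≈ 4.75 bits/symbol, whereas any non-uniform symbol distribution gives a strictly smaller Shannon entropy.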