Given a string of random symbols whose distribution is a priori unknown, what are the known algorithms for computing its Shannon entropy?
$$H = - \sum_i \; p_i \log p_i$$
Is there an algorithm that computes it without first estimating the probabilities $p_i$? Having computed the entropy $H_n$ of the first $n$ symbols, can I find the entropy $H_{n+m}$ of all $n+m$ symbols, knowing only $H_n$ about the first $n$ symbols?
I doubt it, but I don't have a proof.
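For the first question, the usual approach is the plug-in (empirical) estimator: count symbol frequencies and substitute the empirical frequencies for the $p_i$ in the formula above. A minimal sketch in Python (the name `plugin_entropy` is mine, not from any standard library):

```python
from collections import Counter
from math import log2

def plugin_entropy(s: str) -> float:
    """Plug-in estimate of H: substitute empirical frequencies for p_i."""
    n = len(s)
    counts = Counter(s)  # symbol -> number of occurrences
    return -sum(c / n * log2(c / n) for c in counts.values())
```

Note that the counts themselves can be maintained incrementally as symbols arrive, but the entropy must then be recomputed from the counts; the scalar $H_n$ alone is not enough, as the answer below shows.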
No. Suppose $H_n = 0$, i.e. the first $n$ symbols are all identical, and the final $m$ symbols are $b\ldots b$. If the first $n$ symbols were also all $b$, then $H_{n+m} = 0$; if they were all some other symbol $a$, then $$H_{n+m} = -\sum_{i\in\{n,m\}} \frac{i}{n+m} \log \frac{i}{n+m}.$$ Knowing only $H_n$, you cannot tell which case holds.
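The ambiguity is easy to check numerically with a plug-in entropy estimate (a self-contained sketch; the helper `H` is mine):

```python
from collections import Counter
from math import log2

def H(s: str) -> float:
    """Empirical Shannon entropy (base 2) of the symbols in s."""
    n = len(s)
    return -sum(c / n * log2(c / n) for c in Counter(s).values())

# Both prefixes have H_2 == 0, yet after appending the same suffix "bb"
# the entropies diverge: H("aabb") == 1 while H("bbbb") == 0.
prefixes = ["aa", "bb"]
totals = [p + "bb" for p in prefixes]
```

So two strings with identical $H_n$ can yield different $H_{n+m}$ after the same suffix, which rules out any update rule that uses $H_n$ alone.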