As in the title, I was wondering whether the entropy of a system (it can be any entropy, from Boltzmann to Rényi etc., it is of no importance) is a function or a functional, and why? Since it is mostly defined as $$S(p)=\sum_{i}g(p_i)$$ for some $g$ that has to be continuous etc., it would seem to be a functional. But then I see that $S_{BG}$, for example, which is defined as $S_{BG}=-\sum_i p_i \log p_i$, just needs the value of each $p_i$ in order to be defined, right?
The way I see it, it has to be a functional, but it is not clear to me why. Also, many authors refer to the entropy as a function while others call it a functional.
Thank you!
A function is a mapping from a set of numbers to another set of numbers. A functional is a mapping from a set of functions to a set of numbers: it takes an entire function as input and returns a scalar. The entropy is defined as the Gibbs functional $$S(p)=-k\sum_jp_j\log(p_j)$$ where the input is the whole probability distribution $p$, i.e. the function $j\mapsto p_j$, not any single value $p_j$. So the correct way to define the entropy, following Gibbs, is as a functional.
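As an illustrative sketch (not part of the original answer), here is the point in code: the Gibbs formula consumes an entire distribution at once and returns one number, which is exactly the signature of a functional. The function name and the choice $k=1$ are mine.

```python
import math

def gibbs_entropy(p, k=1.0):
    """Map a whole probability distribution (the function i -> p_i)
    to a single number: S(p) = -k * sum_i p_i log p_i.

    The argument is the entire sequence of probabilities, not one
    value p_i, which is why S is a functional of p rather than an
    ordinary function of a number.
    """
    if abs(sum(p) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    # Terms with p_i == 0 contribute 0 (the limit p log p -> 0).
    return -k * sum(p_i * math.log(p_i) for p_i in p if p_i > 0)

# A uniform distribution over 4 states gives S = log 4:
uniform = [0.25, 0.25, 0.25, 0.25]
print(gibbs_entropy(uniform))  # ≈ 1.3863 = log 4
```

Changing any single $p_i$ changes the output, but no single $p_i$ determines it; the value depends on the distribution as a whole.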