Why is the letter "h" (or "H") used to denote entropy in information theory, ergodic theory, and physics (and possibly other places)?
Edit: I'm looking for an explanation of the original use of "H". As Ilmari Karonen points out, Shannon got "H" from Boltzmann's H-theorem. So (assuming Boltzmann actually used "H"), the original use is at least as early as that.
Wikipedia claims, citing "Gleick 2011", that Shannon got the letter $H$ from Boltzmann's H-theorem. Indeed, Shannon writes in his 1948 paper on page 393, after defining $H = -K \sum_{i=1}^n p_i \log p_i$:
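As an aside, Shannon's $H$ is easy to compute directly. Here's a minimal sketch (the function name and the defaults $K = 1$, base-2 logarithm are my own choices, matching the common "bits" convention; the `0 \log 0 = 0` convention handles zero probabilities):

```python
import math

def shannon_entropy(p, k=1.0, base=2):
    """Shannon's H = -K * sum(p_i * log p_i) over a probability distribution p.

    Terms with p_i == 0 contribute nothing, by the convention 0 log 0 = 0.
    """
    assert abs(sum(p) - 1.0) < 1e-9, "p must sum to 1"
    return -k * sum(pi * math.log(pi, base) for pi in p if pi > 0)

# A fair coin has one bit of entropy (with K = 1, log base 2):
print(shannon_entropy([0.5, 0.5]))
```

With $K = 1$ and a base-2 logarithm, a fair coin gives $H = 1$ bit, and any deterministic distribution gives $H = 0$.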
Of course, this just changes the question to "Why did Boltzmann choose the letter $H$, then?" In this letter to the editor, published in Nature in 1937, Sydney Chapman writes:
So apparently, you're far from the first person to wonder about this.
Indeed (thanks to t.b. for the links), 30 years later, in a letter to the American Journal of Physics, Stephen G. Brush repeated Chapman's plea, adding that "Professor Chapman informed me, a couple of years ago, that he never received any response to this letter." Ten years after that, in the same journal, Stig Hjalmars wrote in response to Brush's letter:
The cited "elsewhere" is "S. Hjalmars, TRITA-MEK-76-01, Technical Reports from the Royal Institute of Technology, Department of Mechanics, S-10044 Stockholm, Sweden," stated to be "Free of cost on request from the Department." Alas, I have so far been unable to locate a copy of this report.