Min-entropy ($H_{\infty}$) of a set is defined as $-\log_2(p_{max})$, where $p_{max}$ is the highest probability of occurrence of any individual element of the set. There is also the $\chi^2$ statistic of those elements; I know how to calculate that too.
My gut feeling is that these two metrics are closely related. Is there a formula for the standard deviation of the min-entropy? This seems comparable to calculating the standard error of the mean of a sample. If it helps, I'm specifically interested in discrete sets of bytes, i.e. sets that are uniformly and randomly distributed as $\mathcal{U}(0, 255)$.
Having simulated this calculation for sample sets of 512,000 random bytes, over 100,000 independent trials, I get the following distribution of min-entropy values:
Notice that this is not a Normal distribution. From the simulation, $E(H_{\infty}(X)) \approx 7.912$ bits/byte.
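For reference, here is a minimal sketch of such a simulation (my own code, not the exact one used above; the parameters are scaled down so it runs quickly, so increase `m` and `trials` to approach the 512,000-byte, 100,000-trial figures):

```python
import numpy as np

rng = np.random.default_rng(1)

def min_entropy(sample: np.ndarray, n_outcomes: int = 256) -> float:
    """Empirical min-entropy: -log2 of the highest observed relative frequency."""
    counts = np.bincount(sample, minlength=n_outcomes)
    p_max = counts.max() / sample.size
    return -np.log2(p_max)

# Smaller than the experiment described above, purely for speed.
m, trials = 51_200, 500
estimates = np.array(
    [min_entropy(rng.integers(0, 256, size=m)) for _ in range(trials)]
)
print(f"mean H_inf  ≈ {estimates.mean():.3f} bits/byte")
print(f"stdev H_inf ≈ {estimates.std(ddof=1):.5f} bits/byte")
```

The sample standard deviation of `estimates` is exactly the quantity the question asks about, estimated empirically.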

Consider the uniform distribution on $\{0,1,\ldots,n-1\}$ and let $X$ have that distribution. The variance of this distribution is $(n^2-1)/12$ and its standard deviation is $\sqrt{(n^2-1)/12}.$ However, what we want is the standard deviation of the min-entropy of $X,$ not that of $X$ itself.
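As a quick numerical check of those two expressions (illustrative only; these describe $X$ itself, not its min-entropy):

```python
import math

n = 256  # outcomes 0..255, as in the question
variance = (n**2 - 1) / 12          # variance of the discrete uniform on {0,...,n-1}
std_dev = math.sqrt(variance)       # ≈ 73.9 for n = 256
print(variance, std_dev)
```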
What happens when we sample? We obtain an empirical distribution, which is like a balls-in-bins process with each ball landing in each of $n$ bins independently and with equal likelihood.
If we throw $m$ balls then for $n$ large enough, and $m$ also growing with $n,$ there is a concentration phenomenon and the most loaded bin (i.e., the one with the highest empirical probability) has a load that is almost surely determined.
Note that your sample size (512,000) plays the role of $m$ here, while $n$ is the number of outcomes, i.e., $n=256$ for your question.
See the paper "Balls into Bins: A Simple and Tight Analysis" by Raab and Steger, specifically Theorem 1. If, for example, you have $m\leq c\, n \log n,$ then with high probability (and ignoring the fine dependence on $\alpha,$ since it is multiplied by a $\log \log$ term divided by a $\log$ term) the most loaded bin has no more than $$ \frac{\log n}{\log\!\left(\frac{n \log n}{m}\right)} $$ balls.
If, however, $m$ is much larger than $n$ (the last case of the theorem), then the most loaded bin has no more than $$ \frac{m}{n}+\sqrt{\frac{2m \log n}{n}} $$ balls.
If you divide these quantities by $m$ (the number of balls), you get an estimate of the highest empirical probability $p_{max}$; taking $-\log_2$ of that gives you an estimate of the sample min-entropy.
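The recipe above can be sketched in a few lines for the heavy-load case (a rough sketch under the stated asymptotic assumptions, keeping only the leading terms of the bound):

```python
import math

def min_entropy_estimate(m: int, n: int) -> float:
    """Estimate sample min-entropy from the heavy-load (m >> n) max-load bound:
    max load ≈ m/n + sqrt(2 m log n / n), with natural log as in the bound."""
    max_load = m / n + math.sqrt(2 * m * math.log(n) / n)
    p_max = max_load / m          # highest empirical probability
    return -math.log2(p_max)      # estimated sample min-entropy in bits

print(min_entropy_estimate(512_000, 256))
```

For the question's parameters ($m=512{,}000$, $n=256$) this gives roughly 7.9 bits/byte, in the same ballpark as the simulated $E(H_{\infty}(X)) \approx 7.912.$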
However, experiments are probably required to see how large $n$ and $m$ need to be for this analysis to hold.