For a random function from a set of $k$ values to a set of $k$ values, we can define the expected Shannon entropy of the output when the input is uniformly distributed (the expectation taken over all such random functions). Equivalently, this is the expected Shannon entropy of the empirical distribution of $k$ independent uniform draws from a set of $k$ values. It can be shown that this equals $(\log_2k)-\eta(k)$ with $\displaystyle\lim_{k\to\infty}\eta(k)=\eta=\frac 1{e\ln(2)}\sum_{i=1}^\infty\frac{\ln(i+1)}{i!}\approx0.827245389153$ bit.
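For concreteness, here is a quick numerical sketch (in Python, with hypothetical function names) that checks this: it evaluates the series for $\eta$, and estimates the expected entropy loss by Monte Carlo, drawing $k$ outputs uniformly among $k$ values and averaging $\log_2k$ minus the entropy of the resulting empirical distribution.

```python
import math
import random


def eta_series(terms=30):
    """Partial sum of the series for eta given above."""
    s = sum(math.log(i + 1) / math.factorial(i) for i in range(1, terms))
    return s / (math.e * math.log(2))


def estimated_entropy_loss(k, trials=2000, rng=random.Random(0)):
    """Monte Carlo estimate of log2(k) minus the expected Shannon entropy
    of the empirical distribution of k uniform draws among k values
    (equivalently, of f(X) for a random f: [k] -> [k] and uniform X)."""
    total = 0.0
    for _ in range(trials):
        counts = {}
        for _ in range(k):
            y = rng.randrange(k)
            counts[y] = counts.get(y, 0) + 1
        h = -sum((c / k) * math.log2(c / k) for c in counts.values())
        total += h
    return math.log2(k) - total / trials


print(round(eta_series(), 6))            # 0.827245
print(round(estimated_entropy_loss(256), 3))  # close to eta already at k=256
```

The estimate for $k=256$ lands within a few thousandths of $\eta$, consistent with $\eta(k)$ converging quickly.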
Question: when we instead consider a random function whose input set is larger than its output set of size $k$, together with a non-uniform distribution on the input set having exactly $\log_2k$ bits of Shannon entropy, does the expected entropy loss change from $\eta(k)$? If so, does it become larger or smaller? And does the entropy loss still converge to $\eta$ as $k\to\infty$, regardless of the input distribution?
The motivation is this question.