I am trying to understand the following asymptotic result from an article that gives no proof (and no reference either). I just cannot see why it should be trivial.
Given i.i.d. $h_i \sim \mathcal{CN}\left(\frac{\sqrt{\mu}}{\sqrt{1+\mu}}, \frac{1}{1+\mu}\right)$, i.e., complex Gaussian random variables with nonzero mean, and $|h|^2 := \max_{i=1,\cdots,K} |h_i|^2$ (so $2(1+\mu)\,|h_i|^2$ should follow a noncentral chi-square distribution with 2 degrees of freedom and noncentrality $2\mu$?), the claim is that asymptotically $$\lim_{K \rightarrow \infty}\frac{\mathbb{E}\left\{|h|^2\right\}}{\log_e K} = \frac{1}{1+\mu}.$$
Is this a well-known result? The authors build further results on it. Can someone prove it and/or provide an appropriate reference? Many thanks in advance.
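A Monte Carlo sketch (not a proof) can at least check the direction of the claim: estimate $\mathbb{E}\{\max_i |h_i|^2\}/\log_e K$ for growing $K$ and compare it with $1/(1+\mu)$. The value $\mu = 2$, the sample sizes, and the seed below are my own hypothetical choices, not from the article; note the convergence in $K$ appears quite slow.

```python
import numpy as np

rng = np.random.default_rng(0)
mu = 2.0                           # hypothetical choice of mu
m = np.sqrt(mu / (1 + mu))         # mean of each h_i (taken real)
s = np.sqrt(1 / (2 * (1 + mu)))    # std of the real and imaginary parts

def est_ratio(K, trials=200):
    """Estimate E{max_i |h_i|^2} / log K by averaging over independent trials."""
    tot = 0.0
    for _ in range(trials):
        h = (m + s * rng.standard_normal(K)) + 1j * s * rng.standard_normal(K)
        tot += np.max(np.abs(h) ** 2)
    return tot / trials / np.log(K)

ratios = [est_ratio(K) for K in (10**2, 10**4, 10**5)]
print(ratios, "target:", 1 / (1 + mu))
```

The estimated ratios decrease toward $1/(1+\mu)$ as $K$ grows, but they are still well above the limit even at $K = 10^5$, which is consistent with a $\log K$-rate correction term rather than a contradiction of the claim.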
---
Below is my attempt; I am not sure whether it makes sense. Please correct me if I am making a blunder.
Since the maximum is a convex function, Jensen's inequality for convex $f$ gives $\mathbb{E}\left\{f(x)\right\} \geq f\left(\mathbb{E}\left\{x\right\}\right)$, i.e., $\underbrace{\mathbb{E}\left\{|h|^2\right\}}_{\mathbb{E}\left\{f(x)\right\}} \geq \underbrace{\max_{i=1,\cdots,K} \mathbb{E}\left\{|h_i|^2\right\}}_{f\left(\mathbb{E}\left\{x\right\}\right)}$. So, using the lower bound, $f\left(\mathbb{E}\left\{x\right\}\right) := \max_{i=1,\cdots,K} \mathbb{E}\left\{|h_i|^2\right\} = \frac{1}{1+\mu}$, where I use the variance of $h_i$ directly. Does this make sense?
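The Jensen step itself can be sanity-checked numerically by estimating both sides of $\mathbb{E}\{\max_i |h_i|^2\} \geq \max_i \mathbb{E}\{|h_i|^2\}$ from the same samples. Again, $\mu = 2$, $K = 50$, the trial count, and the seed are hypothetical choices of mine:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, K, trials = 2.0, 50, 20000     # hypothetical parameters
m = np.sqrt(mu / (1 + mu))         # mean of each h_i (taken real)
s = np.sqrt(1 / (2 * (1 + mu)))    # std of the real and imaginary parts

h = (m + s * rng.standard_normal((trials, K))) \
    + 1j * s * rng.standard_normal((trials, K))
gains = np.abs(h) ** 2             # |h_i|^2, one row per trial

lhs = gains.max(axis=1).mean()     # estimate of E{ max_i |h_i|^2 }
rhs = gains.mean(axis=0).max()     # estimate of max_i E{ |h_i|^2 }
print(lhs, rhs)                    # Jensen: lhs >= rhs
```

The inequality holds, but note the lower bound is a constant in $K$, so by itself it cannot produce the $\log_e K$ growth in the limit above.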