Let the random variable $M$ be the largest number rolled after $k\in\mathbb{N}$ throws of a fair $n$-sided die. The task is to compute $E[M]$ and $\lim_{n\rightarrow\infty}\frac{1}{n}E[M]$. Since $M$ only takes values in $\mathbb{N}$, we first note that $M$ can be written as $\sum_{m=1}^{\infty}1_{M\ge m}$. It follows that $$ E[M]=\int_{\Omega}M\,dP=\int_{\Omega}\sum_{m=1}^{\infty}1_{M\ge m}\,dP=\sum_{m=1}^{\infty}P(M\ge m)=\sum_{m=1}^{n}P(M\ge m), $$ where $\Omega=\{1,\dots,n\}^k$ is the sample space. We have that $$ P(M\ge m)=1-P(M<m)=1-\left(1-\frac{1}{n}\right)^k $$ is the probability of hitting $m$ at least once in $k$ throws. Plugging that into the sum: $\sum_{m=1}^{n}P(M\ge m)=\sum_{m=1}^{n}\left(1-\left(1-\frac{1}{n}\right)^k\right)=n-n\left(1-\frac{1}{n}\right)^k$.
It looks strange to me that this gives $\lim_{n\rightarrow\infty}\frac{1}{n}E[M]=0$. I must be misunderstanding something, since I expected a more interesting result for the limit. I hope someone can clarify this for me!
Your reasoning is correct, but the step $P(M<m)=\left(1-\frac{1}{n}\right)^k$ is wrong.
Actually, if $X_i$ denotes the outcome of the $i$-th throw, then by independence $$P(M<m)= \prod_{i=1}^{k} P(X_i<m) = \left(\frac{m-1}{n}\right)^k \tag 1$$
$$E[M]=n-\sum_{m=1}^n \left(\frac{m-1}{n}\right)^k = n- \sum_{t=0}^{n-1} \left(\frac{t}{n}\right)^k \tag 2 $$
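As a sanity check, formula $(2)$ agrees with a brute-force computation of $E[M]$ over all $n^k$ equally likely outcomes (a quick Python sketch; the helper names and the small test values of $n$ and $k$ are my own choices):

```python
from itertools import product

def expected_max_exact(n, k):
    """E[M] by enumerating all n^k equally likely outcomes."""
    return sum(max(roll) for roll in product(range(1, n + 1), repeat=k)) / n**k

def expected_max_formula(n, k):
    """Formula (2): E[M] = n - sum_{t=0}^{n-1} (t/n)^k."""
    return n - sum((t / n) ** k for t in range(n))

# The two computations match for any small n, k.
for n, k in [(6, 2), (6, 3), (10, 2)]:
    print(n, k, expected_max_exact(n, k), expected_max_formula(n, k))
```

For $n=6$, $k=1$ both give the familiar $3.5$.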
The last sum, divided by $n$, is a Riemann sum for $\int_0^1 x^k\,dx = \frac{1}{k+1}$, so as $n \to \infty$ the sum itself behaves like $\frac{n}{k+1}$.
Hence, for large $n$,
$$E[M] \approx n - \frac{n}{k+1}= n \,\frac{k}{k+1} \tag 3$$
and $$\frac{E[M]}{n} \to \frac{k}{k+1} \tag 4$$
You could have guessed that your result $\frac{E[M]}{n} \to 0$ could not be right by considering the case $k=1$: there $E[M]=\frac{n+1}{2}$, so clearly $E[M]/n= \frac{n+1}{2n} \to \frac12$.
The correct result $(4)$ above could also be guessed by noting that, as $n$ grows, the normalized variable $M/n$ approximates the maximum of $k$ independent continuous uniform variables on $[0,1]$, and the expected maximum of $k$ such variables is $k/(k+1)$ (ref)
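That last fact is easy to confirm by Monte Carlo (a quick sketch using Python's standard `random` module; the trial count and seed are arbitrary):

```python
import random

def mean_max_uniform(k, trials=200_000, seed=0):
    """Estimate E[max of k iid Uniform(0,1)] by Monte Carlo."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        total += max(rng.random() for _ in range(k))
    return total / trials

for k in (1, 2, 5):
    # Estimates land close to k/(k+1): 1/2, 2/3, 5/6.
    print(k, mean_max_uniform(k), k / (k + 1))
```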