Let $(X_{n})_{n\in\mathbb{N}}$ be a sequence of random variables defined by $$X_{n}:=\mbox{maximum score obtained after } n \mbox{ throws of a fair die.}$$
I need to show that $(X_{n})_{n\in\mathbb{N}}$ is a Markov chain. That is, we have to show that $$P(X_{n+1}=i_{n+1}|X_{1}=i_{1},X_{2}=i_{2},\ldots,X_{n}=i_{n})=P(X_{n+1}=i_{n+1}|X_{n}=i_{n})\tag{1}$$ I know that if we let $Y_{k}$ be the result of the $k$th throw of our fair die, then $(Y_{k})_{k\in\mathbb{N}}$ is an i.i.d. sequence with uniform distribution over $\left\{1,2,3,4,5,6\right\}$, so we have $X_{n}=\max\left\{Y_{1},\ldots,Y_{n}\right\}$. Furthermore, the cumulative distribution function of $X_{n}$ is $$F_{X_{n}}(k)=\left\{\begin{array}{ll} 0 & \mathrm{if}\: k<1 \\ \frac{i^{n}}{6^{n}} & \mathrm{if}\: i\leq k <i+1 \:\: \mathrm{for}\: i=1,2,3,4,5\\ 1 & \mathrm{if}\: k\geq 6 \end{array}\right.$$
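For the record, this CDF follows directly from the independence of the throws:

```latex
F_{X_{n}}(k) = P(X_{n}\le k)
             = P(Y_{1}\le k,\ldots,Y_{n}\le k)
             = \prod_{j=1}^{n} P(Y_{j}\le k)
             = \frac{\lfloor k\rfloor^{\,n}}{6^{n}},
\qquad 1 \le k < 6.
```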
Therefore, (1) is equivalent to $$P(\max\left\{Y_{1},\ldots,Y_{n},Y_{n+1}\right\}=i_{n+1}|Y_{1}=i_{1},\max\left\{Y_{1},Y_{2}\right\}=i_{2},\ldots,\max\left\{Y_{1},\ldots,Y_{n}\right\}=i_{n})=P(\max\left\{Y_{1},\ldots,Y_{n},Y_{n+1}\right\}=i_{n+1}|\max\left\{Y_{1},\ldots,Y_{n}\right\}=i_{n})$$ I have not been able to prove this last equality; I need help with this.
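A sketch of the missing step (the standard argument, so do verify the details): since $X_{n+1}=\max\{X_{n},Y_{n+1}\}$ and $Y_{n+1}$ is independent of $(Y_{1},\ldots,Y_{n})$, conditioning on the whole history collapses to conditioning on $X_{n}$ alone. On the event $\{X_{n}=i_{n}\}$ we may replace $X_{n}$ by the constant $i_{n}$:

```latex
\begin{aligned}
P(X_{n+1}=i_{n+1}\mid X_{1}=i_{1},\ldots,X_{n}=i_{n})
  &= P(\max\{i_{n},Y_{n+1}\}=i_{n+1}\mid X_{1}=i_{1},\ldots,X_{n}=i_{n})\\
  &= P(\max\{i_{n},Y_{n+1}\}=i_{n+1})\\
  &= P(X_{n+1}=i_{n+1}\mid X_{n}=i_{n}).
\end{aligned}
```

The middle equality uses that the event $\{X_{1}=i_{1},\ldots,X_{n}=i_{n}\}$ is determined by $(Y_{1},\ldots,Y_{n})$, hence independent of $Y_{n+1}$; the last equality is the same computation run in reverse with only $\{X_{n}=i_{n}\}$ in the condition.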
Here is the Markov transition graph for your problem, where the states are labeled by the current maximum and $0$ represents the initial state:
All you need to do is calculate the probabilities for the transitions and observe that the final state $6$ (the sole absorbing state) is reached with probability $1$. Each transition depends solely on the current state, not on the sequence of throws that led to it.
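Concretely, the transition probabilities work out to $P(X_{n+1}=i\mid X_{n}=i)=i/6$, $P(X_{n+1}=j\mid X_{n}=i)=1/6$ for $j>i$, and $0$ for $j<i$. A quick simulation (plain Python; the variable names are my own) checks these values empirically:

```python
import random
from collections import Counter

random.seed(0)

# Empirical check of the transition probabilities of the running-maximum chain.
# Claimed values: P(X_{n+1}=i | X_n=i) = i/6, P(X_{n+1}=j | X_n=i) = 1/6 for j > i.
counts = Counter()   # (i, j) -> number of observed transitions i -> j
visits = Counter()   # i -> number of times the chain was in state i

for _ in range(100_000):
    x = random.randint(1, 6)          # X_1 is just the first throw
    for _ in range(10):               # run the chain a few steps
        y = random.randint(1, 6)      # Y_{n+1}, independent of the past
        x_next = max(x, y)            # X_{n+1} = max(X_n, Y_{n+1})
        counts[(x, x_next)] += 1
        visits[x] += 1
        x = x_next

def p_hat(i, j):
    """Empirical estimate of P(X_{n+1} = j | X_n = i)."""
    return counts[(i, j)] / visits[i]

print(p_hat(3, 3))  # should be close to 3/6
print(p_hat(3, 5))  # should be close to 1/6
print(p_hat(3, 2))  # the maximum never decreases, so exactly 0
```

Note that the update `x_next = max(x, y)` reads only the current state `x` and the fresh throw `y`, which is exactly the Markov property the question asks about.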