Let $X_n$ be the minimum of the first $n$ trials of rolling a fair die, so $X_n$ takes values in $\{1,2,3,4,5,6\}$. You can prove that $X_n$ is a Markov chain, and it's not too hard: if $Y_n$ is the value of the $n$th roll,
$$\begin{align}&\text{ }X_n = \min(Y_n, Y_{n-1},\cdots,Y_1)\\ \Leftrightarrow&\text{ }X_n = \min(Y_n,\min(Y_{n-1},\cdots,Y_1))\\ \Leftrightarrow&\text{ }X_n = \min(Y_n,X_{n-1})\end{align}$$
so, since $Y_n$ is independent of $Y_1,\cdots,Y_{n-1}$, the value of $X_n=\min(Y_n,X_{n-1})$ depends on the past only through $X_{n-1}$. Hence $\Pr(X_n=i_n\mid X_{n-1}=i_{n-1},\cdots,X_1=i_1)=\Pr(X_n=i_n\mid X_{n-1}=i_{n-1})$, which means $X_n$ satisfies the Markov condition.
My question is, how do you actually find the one-step transition probabilities $p_{ij}$? It's pretty easy to calculate them in my head, but I can't do it rigorously. We need to find $\Pr(X_n=i\mid X_{n-1}=j)$ for every pair $i,j$, but reasoning about minimums always trips me up.
We certainly know that $Y_n\sim \text{Unif}\{1,\cdots,6\}$, i.e. it has p.m.f. $p_{Y_n}(y)=\frac16$ for $y=1,\cdots,6$. But how do we use this fact to find the one-step transition probabilities?
A couple of observations to start: firstly, the Markov chain $X_1, X_2, \dots$ can never increase. Secondly, once we observe $X_n = 1$ we know $X_{n+1} = 1, X_{n+2} = 1, \dots$, because $X_n = 1$ means some roll $Y_k$ with $k \le n$ equaled $1$, and the running minimum can never rise above that. In other words, state $1$ is absorbing.
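A quick simulation makes both observations concrete (a sketch; the helper name `simulate_path` is mine, not from the problem):

```python
import random

random.seed(0)

# Simulate the running-minimum chain X_n = min(Y_n, X_{n-1}) and check
# the two observations: the path never increases, and state 1 is absorbing.
def simulate_path(n_steps):
    path = []
    x = 7  # sentinel above every die face, so min(y, 7) = y on the first roll
    for _ in range(n_steps):
        y = random.randint(1, 6)  # Y_n ~ Unif{1, ..., 6}
        x = min(y, x)
        path.append(x)
    return path

path = simulate_path(50)
assert all(a >= b for a, b in zip(path, path[1:]))  # never increases
if 1 in path:
    first = path.index(1)
    assert all(x == 1 for x in path[first:])        # 1 is absorbing
```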
In general, since $X_n = \min(Y_n, X_{n-1})$, we have $$\mathbb{P}(X_{n} = x_{n} \mid X_{n - 1} = x_{n - 1}) = \begin{cases} 0 & \text{if } x_{n} > x_{n - 1},\\ 1 - \frac{x_{n - 1} - 1}{6} & \text{if } x_{n} = x_{n - 1},\\ \frac{1}{6} & \text{if } x_{n} < x_{n - 1}. \end{cases}$$ If $x_{n} > x_{n - 1}$ then $\mathbb{P}(X_{n} = x_{n} \mid X_{n - 1} = x_{n - 1}) = 0$ because the Markov chain cannot increase. For $x_{n} < x_{n - 1}$, the event $\{X_n = x_n\}$ occurs exactly when $Y_n = x_n$, so the probability is $\frac{1}{6}$ because $Y_n \sim \text{Unif}\{1, \dots, 6\}$. Finally, $X_n = x_{n-1}$ occurs exactly when $Y_n \ge x_{n-1}$, which has probability $\frac{7 - x_{n-1}}{6} = 1 - \frac{x_{n-1} - 1}{6}$; as a sanity check, the three cases sum to $1$.
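The case formula can be assembled into the full $6 \times 6$ transition matrix and checked mechanically (a sketch, using exact fractions; the indexing convention `P[j-1][i-1]` is my own choice):

```python
from fractions import Fraction

# Build the 6x6 one-step transition matrix, where P[j-1][i-1] is
# P(X_n = i | X_{n-1} = j), following the three cases above.
def transition_matrix():
    P = [[Fraction(0)] * 6 for _ in range(6)]
    for j in range(1, 7):      # current state x_{n-1}
        for i in range(1, 7):  # next state x_n
            if i > j:
                P[j - 1][i - 1] = Fraction(0)          # chain cannot increase
            elif i == j:
                P[j - 1][i - 1] = 1 - Fraction(j - 1, 6)  # Y_n >= j
            else:  # i < j
                P[j - 1][i - 1] = Fraction(1, 6)          # Y_n = i
    return P

P = transition_matrix()
for row in P:
    assert sum(row) == 1  # each row is a probability distribution
```

Note that the row for state $1$ is $(1, 0, \dots, 0)$, confirming that $1$ is absorbing, while the row for state $6$ is uniform.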