The characterization of a Markov chain as finite or infinite refers to the state space $Ω$ in which the Markov chain takes its values. Thus, a finite (infinite) Markov chain is a process which moves among the elements of a finite (infinite) set $Ω$. In other words, if $X_n$ denotes the state of the process at time $n$, then $$X_n\inΩ.$$ If $Ω$ is finite, e.g. $$Ω=\{1, 2, \ldots, n-1, n\},$$ then $(X_n)_n$ is called a finite Markov chain, whereas if $Ω$ is infinite, e.g.
$$Ω=\{1, 2, \ldots, n-1, n, \ldots\}$$ then the Markov chain $(X_n)_n$ is called infinite.
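To make the finite case concrete, here is a minimal sketch of simulating a Markov chain $(X_n)_n$ on a finite state space $Ω=\{1,2,3\}$. The transition probabilities below are purely illustrative assumptions, not taken from any particular example:

```python
import random

# Illustrative transition probabilities on Omega = {1, 2, 3};
# each entry maps a state to a list of (next_state, probability) pairs.
P = {
    1: [(1, 0.5), (2, 0.5)],              # from 1: stay, or move to 2
    2: [(1, 0.25), (2, 0.25), (3, 0.5)],  # from 2: any of the three states
    3: [(3, 1.0)],                        # 3 is absorbing
}

def step(state):
    """Draw the next state X_{n+1} given X_n = state."""
    r = random.random()
    acc = 0.0
    for nxt, p in P[state]:
        acc += p
        if r < acc:
            return nxt
    return P[state][-1][0]  # guard against floating-point rounding

def simulate(x0, n):
    """Return the path (X_0, X_1, ..., X_n) started from X_0 = x0."""
    path = [x0]
    for _ in range(n):
        path.append(step(path[-1]))
    return path

print(simulate(1, 20))  # every X_n lies in the finite set {1, 2, 3}
```

An infinite chain differs only in that $Ω$ is countably infinite (e.g. a random walk on $\{1, 2, 3, \ldots\}$), so the transition law can no longer be stored as a finite table.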