Does longer input for a Markov chain cause longer output on average?


When I have a simple Markov chain with a fixed number of states and a fixed number of terminal states, does weighting the transitions from a training set of longer sequences cause the chain to produce longer output when it starts in the initial state and runs until it reaches a terminal state?

Real-world example: A Markov chain is built from a text corpus via a simple transition function $t: W \to W$, $t(w_1) = w_2$, with one special state $w_0$ for the start of a sentence and a set $S \subset W$ of terminal states, i.e. words that occurred at the end of a sentence.
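For concreteness, here is one minimal way such a weighted transition table could be built (the corpus, the `<s>`/`</s>` marker states, and the function name are made up for illustration; the real construction may differ):

```python
from collections import defaultdict

START, END = "<s>", "</s>"  # hypothetical stand-ins for w_0 and the terminal set S

def transition_counts(sentences):
    """Count word-to-word transitions; <s> marks sentence start, </s> sentence end."""
    counts = defaultdict(lambda: defaultdict(int))
    for sent in sentences:
        words = [START] + sent.split() + [END]
        for a, b in zip(words, words[1:]):
            counts[a][b] += 1
    return counts

counts = transition_counts(["the cat sat", "the cat ran"])
print(dict(counts["cat"]))  # each successor of "cat" seen once
```

Normalizing each inner dictionary by its row sum would turn the counts into the transition probabilities the question is about.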

Will such a Markov chain, trained on longer sentences, create longer sequences on average?

I am not sure, but it seems it does not necessarily have to, e.g. when, for a given word, the distribution of transitions into terminal states and the distribution of transitions into non-terminal states are similar.