I recently came across the theorem on large deviations for finite Markov chains, but I have some trouble applying it. To state the theorem we need some notation: let $\Sigma$ be a finite alphabet, $\pi$ an irreducible transition matrix on $\Sigma$, $\sigma$ an initial state, and let $Y_1,\dots,Y_n,\dots$ be the Markov chain with transition matrix $\pi$ started at $\sigma$.
Given a function $f:\Sigma\to\mathbb{R}^d$ we set $Z_n=\frac{1}{n}\sum_{i=1}^n f(Y_i)$, and for $\lambda\in \mathbb{R}^d$ we define the matrix $\pi_\lambda (i,j)=\pi(i,j)e^{\langle\lambda,f(j)\rangle}$ and let $\rho(\pi_\lambda)$ be its Perron-Frobenius eigenvalue.
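To fix ideas, here is a toy example (my own, just to make the definitions concrete): take $\Sigma=\{1,2\}$, $\pi(1,2)=\pi(2,1)=p\in(0,1)$, and $f=\mathbf 1_{\{2\}}$, so $d=1$. Then
$$\pi_\lambda=\begin{pmatrix}1-p & pe^{\lambda}\\ p & (1-p)e^{\lambda}\end{pmatrix},\qquad \rho(\pi_\lambda)=\frac{(1-p)(1+e^{\lambda})+\sqrt{(1-p)^{2}(1+e^{\lambda})^{2}-4e^{\lambda}(1-2p)}}{2},$$
solving the characteristic quadratic $\rho^2-\operatorname{tr}(\pi_\lambda)\rho+\det(\pi_\lambda)=0$; one checks $\rho(\pi_0)=1$, as it should be for a stochastic matrix.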
Now that we have all the definitions, the theorem states that for any $\Gamma \subseteq \mathbb{R}^d$ we have $$\limsup_{n \to \infty}\frac{1}{n} \log P_\sigma ^\pi (Z_n \in \Gamma)\leq - \inf_{z\in \bar\Gamma} I(z),$$ where $\bar\Gamma$ is the closure of $\Gamma$ and $$I(z)=\sup_{\lambda \in \mathbb{R}^d} \{ \langle\lambda,z\rangle-\log \rho(\pi_\lambda)\}.$$
The main goal, of course, is to show that for a suitable choice of $\Gamma$ the quantity $\inf_{z\in \bar\Gamma} I(z)$ is strictly positive, which then yields an exponential bound on the probability. Unfortunately, I have no idea what this rate function looks like, beyond the fact that it is nonnegative.
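To get some feel for $I$ in the toy example above, here is a minimal numerical sketch (the bracket for $\lambda$ and the identification of the Perron-Frobenius eigenvalue with the spectral radius are my own shortcuts, not part of the theorem):

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy example from above (my own choice): the symmetric two-state chain
# with flip probability p, and f = 1_{{2}}, so d = 1.
p = 0.3
P = np.array([[1 - p, p],
              [p, 1 - p]])
f = np.array([0.0, 1.0])

def log_rho(lam):
    """log rho(pi_lambda), where pi_lambda(i,j) = pi(i,j) * exp(lam * f(j)).
    For an irreducible nonnegative matrix the Perron-Frobenius eigenvalue
    equals the spectral radius, so we take the largest |eigenvalue|."""
    P_lam = P * np.exp(lam * f)[None, :]  # multiply column j by exp(lam * f(j))
    return np.log(np.max(np.abs(np.linalg.eigvals(P_lam))))

def rate(z):
    """I(z) = sup_lambda { lam * z - log rho(pi_lambda) }.
    The bracket [-30, 30] is an ad hoc safeguard against overflow."""
    res = minimize_scalar(lambda lam: -(lam * z - log_rho(lam)),
                          bounds=(-30, 30), method="bounded")
    return -res.fun

for z in np.linspace(0.1, 0.9, 9):
    print(f"z = {z:.1f}   I(z) = {rate(z):.4f}")
```

For this toy chain, $I$ comes out convex (it is a Legendre transform, so that much is automatic) and vanishes only at $z=1/2$, the stationary mean of $f$; it is statements of this kind that I would like to understand in general.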
In the i.i.d. case there is a similar bound in which the rate function is a relative entropy, measuring how far $\Gamma$ is from the correct limit (namely the expectation of $f$). What I would like to know is whether a similar phenomenon occurs here as well, and if so, how to show it. Specifically, I am interested in the case where $f$ is the characteristic (indicator) function $\mathbf 1_A$ of some subset $A\subseteq\Sigma$.
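For comparison, here is the i.i.d. computation I have in mind (standard Cramér, so hopefully I am stating it correctly): if the $Y_i$ are i.i.d., $f=\mathbf 1_A$ and $q=P(Y_1\in A)$, then
$$I(z)=\sup_{\lambda\in\mathbb{R}}\{\lambda z-\log(1-q+qe^{\lambda})\}=z\log\frac{z}{q}+(1-z)\log\frac{1-z}{1-q},\qquad z\in[0,1],$$
which is exactly the relative entropy $H(\mathrm{Ber}(z)\,\|\,\mathrm{Ber}(q))$ and vanishes only at $z=q=E[f(Y_1)]$. What I am hoping for is an analogous entropy interpretation (or lower bound) for $\sup_\lambda\{\langle\lambda,z\rangle-\log\rho(\pi_\lambda)\}$ in the Markov setting.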