Let $(X_n)_{n\in\mathbb{N}_0}$ be a Markov chain with state space $E$. The hitting time of a set $A\subseteq E$ is the random variable $$ H(A)\colon\Omega\to\mathbb{N}_0\cup\left\{\infty\right\},~~\omega\mapsto\inf\left\{n\in\mathbb{N}_0 \mid X_n(\omega)\in A\right\}. $$ The probability, starting from $i\in E$, that $(X_n)_{n\in\mathbb{N}_0}$ ever hits $A$ is $$ h_i(A):=\mathbb{P}_i(H(A)<\infty). $$ For $i,j\in E$ we write $h_i(j):=h_i(\left\{j\right\})$. Prove that, for all $i,j,k\in E$, $$ h_i(k)\geqslant h_i(j)\cdot h_j(k). $$
I tried to prove this using the one-step relations $$ \begin{cases}h_i(A)=1, & i\in A\\h_i(A)=\sum_{j\in E}p_{ij}h_j(A), & i\notin A\end{cases} $$
Although I do not know whether I can use that here, or how to continue, I at least have $$ h_i(k)=\sum_{s\in E}p_{is}h_s(k) $$ (valid for $i\neq k$) and $$ h_i(j)\cdot h_j(k)=\sum_{s\in E}\sum_{t\in E}p_{is}h_s(j)p_{jt}h_t(k) $$ (valid for $i\neq j$ and $j\neq k$).
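As a sanity check, the one-step system above can be solved numerically and the inequality verified on a toy example. The chain below is a hypothetical 3-state example chosen only for illustration (state $2$ is absorbing, so not all hitting probabilities equal $1$); the minimal non-negative solution is obtained by monotone fixed-point iteration starting from the indicator of the target state.

```python
import numpy as np

# A small hypothetical chain: states {0, 1, 2}, state 2 absorbing.
# The transition probabilities are arbitrary, chosen for illustration.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.0, 0.0, 1.0],
])

def hitting_probs(P, target, n_iter=500):
    """Minimal non-negative solution of h_target = 1 and
    h_i = sum_s P[i, s] h_s for i != target, computed by monotone
    fixed-point iteration starting from the indicator of the target."""
    h = np.zeros(P.shape[0])
    h[target] = 1.0
    for _ in range(n_iter):
        h = P @ h
        h[target] = 1.0
    return h

# h[k][i] plays the role of h_i(k); check h_i(k) >= h_i(j) * h_j(k).
h = {k: hitting_probs(P, k) for k in range(3)}
for i in range(3):
    for j in range(3):
        for k in range(3):
            assert h[k][i] >= h[j][i] * h[k][j] - 1e-12

print(h[0])  # hitting probabilities of state 0; here h_1(0) ≈ 0.25
```

For this chain, $h_1(0)$ solves $h_1(0)=0.1+0.6\,h_1(0)$ (since $h_2(0)=0$), i.e. $h_1(0)=0.25$, which the iteration reproduces.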
Note that $h_i(k)\geqslant \mathbb{P}_i(B)$, where $B$ denotes the event that the Markov chain hits $j$ and, after it hits $j$, hits $k$. Conditionally on the path of the Markov chain up to the hitting time $\tau$ of $j$, the probability of hitting $k$ after $\tau$ is, by the strong Markov property, the probability that the Markov chain hits $k$ starting from $j$, that is, $h_j(k)$. Thus, $\mathbb{P}_i(B)=h_i(j)\cdot h_j(k)$, and the claimed inequality follows.
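For completeness, the conditioning step can be written out as a strong Markov computation (with $\tau:=H(\left\{j\right\})$ and using $X_\tau=j$ on $\left\{\tau<\infty\right\}$): $$ \mathbb{P}_i(B)=\mathbb{E}_i\left[\mathbf{1}_{\left\{\tau<\infty\right\}}\,\mathbb{P}_{X_\tau}\!\left(H(\left\{k\right\})<\infty\right)\right]=\mathbb{E}_i\left[\mathbf{1}_{\left\{\tau<\infty\right\}}\right]h_j(k)=h_i(j)\cdot h_j(k). $$ Since $B\subseteq\left\{H(\left\{k\right\})<\infty\right\}$, monotonicity of measures gives $h_i(k)=\mathbb{P}_i(H(\left\{k\right\})<\infty)\geqslant\mathbb{P}_i(B)$.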