The random variable $T_i$, the "hitting time of $i$," is defined to be the first $n$ such that $X_n = i$, given that $X_0 = i$.
By the mean recurrence time of $i$, I mean the expected value of this random variable.
I wish to show that if $i$ is transient, then this expectation is not finite. While this makes sense intuitively, I do not know how to prove it formally, and any help is appreciated.
I don't think you have the definition quite right. The first passage time $T_i$ is the minimum $n \ge 1$ such that $X_n = i$ given $X_0 = i$, and is defined to be $\infty$ if no such $n$ exists.
In that light, we just need to know that a state $i$ of a finite-state, discrete-time Markov chain is transient if and only if there is some other state $j$ such that $i \rightarrow j$ but $j \not\rightarrow i$. If the chain, started at $i$, ever reaches $j$, it can never return to $i$, since $j \not\rightarrow i$. Because $i \rightarrow j$, this happens with positive probability, so $T_i = \infty$ with positive probability. This implies $\mathbb{E}[T_i] = \infty$, since any nonnegative random variable that is infinite with positive probability has infinite expectation.
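You can see the argument in action with a quick simulation. The toy chain below (my own example, not from the question) has $i = 0$ with $0 \rightarrow 1$ but $1 \not\rightarrow 0$, so state $0$ is transient: with probability $1/2$ the chain returns to $0$ immediately, and with probability $1/2$ it is absorbed at $1$ and $T_0 = \infty$.

```python
import random

# Toy chain where state 0 is transient: 0 can reach the absorbing
# state 1 (0 -> 1), but 1 cannot reach 0 (1 -/-> 0).
# P[s] is a list of (next_state, probability) pairs for state s.
P = {
    0: [(0, 0.5), (1, 0.5)],  # from 0: return to 0 or jump to 1
    1: [(1, 1.0)],            # state 1 is absorbing
}

def step(s, rng):
    """Sample the next state from P[s]."""
    u = rng.random()
    for t, p in P[s]:
        u -= p
        if u < 0:
            return t
    return t  # numerical fallback

def first_return_time(cap, rng):
    """First n >= 1 with X_n = 0 given X_0 = 0, or None if no return
    occurs within `cap` steps (a stand-in for T_0 = infinity)."""
    s = 0
    for n in range(1, cap + 1):
        s = step(s, rng)
        if s == 0:
            return n
    return None

rng = random.Random(0)
times = [first_return_time(cap=100, rng=rng) for _ in range(10_000)]
frac_never = sum(t is None for t in times) / len(times)
print(frac_never)  # ≈ 0.5: T_0 = ∞ with positive probability
```

Since roughly half of the sample paths never return, any empirical average of the return times diverges as the cap grows, matching $\mathbb{E}[T_0] = \infty$.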
One reference you may find useful is these lecture notes.