I think this is a very trivial question, but nonetheless: how can I show that $ \hat\theta_n = \bar x $ is a consistent estimator of $ \theta_0 $?
Since $ \theta_0 $ is $ \mu $ for the logistic distribution, will this logic work?
Consistency requires that, for every $ \epsilon > 0 $, $ \lim_{n \to \infty} P[ \lvert \hat\theta_n - \theta_0 \rvert \le \epsilon ] = 1 $.
Since $ \theta_0 = \mu $, the Weak Law of Large Numbers gives $ \lim_{n \to \infty} P[ \lvert \bar x - \mu \rvert \le \epsilon ] = 1 $ for every $ \epsilon > 0 $.
Thus $ \bar x $ is a consistent estimator of $ \theta_0 $.
Note: for the logistic distribution, the log-likelihood $ l(\theta) $ exists, but its maximizer is not available in closed form.
Weak Law of Large Numbers: If $X_1, X_2, \ldots$ is a sequence of independent and identically distributed random variables with $\mathbb{E}\lvert X_1 \rvert < \infty$, then $$\bar{X}_n := \frac{X_1 + \cdots + X_n}{n} \xrightarrow{P} \mathbb{E}[X_1]$$ as $n \rightarrow \infty$.
If the $X_i$ are logistic with location parameter $\mu$, then $\mathbb{E}[X_1] = \mu$. Therefore, the consistency of $\bar{X}_n$ as an estimator of $\mu$ follows immediately from the weak law of large numbers.
Note that if $\mathbb{E}[X_1]$ were different from $\mu$ then $\bar{X}_n$ would not be consistent for $\mu$.
You remark at the end about the log-likelihood, but this is not needed, since we are not using any property of maximum-likelihood estimators. Although MLEs are strongly consistent under regularity conditions, you do not need to show that the MLE for $\mu$ is $\bar{X}_n$ and then invoke that property; you can appeal directly to the weak law of large numbers.
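If it helps to see the WLLN at work, here is a small simulation sketch (using NumPy, with an arbitrarily chosen location $\mu = 2$ and scale $s = 1.5$ as hypothetical values): the absolute deviation of the sample mean from $\mu$ shrinks as $n$ grows.

```python
import numpy as np

def mean_deviation(n, mu=2.0, scale=1.5, seed=0):
    """Absolute deviation |x̄_n - μ| for n i.i.d. Logistic(μ, s) draws."""
    rng = np.random.default_rng(seed)
    x = rng.logistic(loc=mu, scale=scale, size=n)
    return abs(x.mean() - mu)

# The deviation shrinks as n grows, as the WLLN predicts:
for n in (100, 10_000, 1_000_000):
    print(n, mean_deviation(n))
```

Since the logistic distribution has finite variance $s^2 \pi^2 / 3$, the standard error of $\bar{X}_n$ is of order $1/\sqrt{n}$, so the printed deviations should be roughly $0.27$, $0.027$, and $0.0027$ in magnitude.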