I believe there should exist tighter-than-Markov inequalities that use the same information as Markov's inequality (just the expectation). Consider the proof of Markov's inequality in the link below, where the key step is observing
$$ E[X] \ge \int_{t}^{\infty} x f(x) \, dx \ge t \int_{t}^{\infty} f(x) \, dx $$
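As a quick sanity check of this chain of inequalities, here is a Monte Carlo sketch; Exponential(1) is just my choice of illustrative distribution:

```python
import random

# Check E[X] >= E[X; X >= t] >= t * P(X >= t) on Exponential(1) samples.
# Both inequalities hold surely for nonnegative X, so the assertion
# passes for any sample; the simulation just makes the gap visible.
random.seed(0)
xs = [random.expovariate(1.0) for _ in range(100_000)]
t = 2.0
mean = sum(xs) / len(xs)
tail_mean = sum(x for x in xs if x >= t) / len(xs)   # estimate of E[X; X >= t]
tail_prob = sum(1 for x in xs if x >= t) / len(xs)   # estimate of P(X >= t)
assert mean >= tail_mean >= t * tail_prob
print(mean, tail_mean, t * tail_prob)
```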
Are there any strictly positive probability densities $f$ (supported on $(0, \infty)$) where
$$ \int_{t}^{\infty} x f(x) dx = t \int_{t}^{\infty} f(x) dx $$
If not, I'm curious which density $f$ (among densities on the positive reals) minimizes
$$ \int_{t}^{\infty} x f(x) dx - t \int_{t}^{\infty} f(x) dx $$
Since if we find $Q(t)$ such that
$$ \int_{t}^{\infty} x f(x) dx - t \int_{t}^{\infty} f(x) dx \ge Q(t) $$
for all such densities, then a tighter inequality of the form
$$ P(X \ge t) \le \frac{E[X]-Q(t)}{t} $$
could be derived.
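For intuition about the size of this gap, here is a numerical sketch for one concrete case; Exponential(1) is my choice of example, and for that density the gap works out to $e^{-t}$ in closed form, so it is strictly positive for every $t$ but its size depends heavily on the density:

```python
import math

def gap(f, t, upper=60.0, n=200_000):
    # Midpoint-rule approximation of the gap
    #   D(t) = ∫_t^∞ x f(x) dx − t ∫_t^∞ f(x) dx = ∫_t^∞ (x − t) f(x) dx,
    # truncated at `upper` (assumes f decays fast enough past it).
    h = (upper - t) / n
    return sum((t + (i + 0.5) * h - t + 0) * 0 + ((t + (i + 0.5) * h) - t) * f(t + (i + 0.5) * h)
               for i in range(n)) * h

# Exponential(1) density; the gap is e^{-t} in closed form.
f_exp = lambda x: math.exp(-x)
for t in [0.5, 1.0, 2.0]:
    print(t, gap(f_exp, t), math.exp(-t))
```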
The usefulness of Markov's (and Chebyshev's) inequality is that it gives a crude bound in many situations in terms of something you know or can estimate: the expectation, variance, etc. So, while what you have is correct, in practice it is not generally as useful, because your $Q(t)$ function is difficult to estimate.
With that being said, the theory of large deviations is in some sense a search for tighter Markov-type inequalities, with both upper and lower bounds on $P(X\geq a)$. For example, the usual trick is to use (for $t > 0$):
$$P(X\geq a) = P(e^{tX}\geq e^{ta})\leq \frac{E[e^{tX}]}{e^{ta}}=f(t),$$
and then minimize $f(t)$ over $t > 0$ using standard calculus, assuming you can compute the moment generating function of $X$. Bounds of this type are generally called Chernoff-Hoeffding bounds. Here is another list.
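To make the trick above concrete, here is a sketch for $X \sim$ Exponential(1), which has $E[e^{tX}] = 1/(1-t)$ for $t < 1$ (the distribution is my choice of example); a grid search stands in for the calculus step:

```python
import math

# Chernoff bound for X ~ Exponential(1): minimize e^{-ta} / (1 - t) over t in (0, 1).
def chernoff_bound(a, steps=100_000):
    best = float("inf")
    for i in range(1, steps):
        t = i / steps  # grid over (0, 1)
        best = min(best, math.exp(-t * a) / (1 - t))
    return best

a = 5.0
markov = 1.0 / a       # Markov: E[X]/a with E[X] = 1
exact = math.exp(-a)   # true tail probability P(X >= a)
print(markov, chernoff_bound(a), exact)
# Calculus gives the optimum t* = 1 - 1/a, so the bound is a * e^{1-a},
# which is far tighter than Markov's 1/a for large a.
```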