In a proof of the quasi-left continuity of Lévy processes, there is the following step:
$$E(f(X_{T})g(X_{T}))=E(f(X_{T^{-}})g(X_{T}))$$ for all $f,g\in C_{0},$ where $C_{0}$ is the set of all continuous functions vanishing at infinity and $T$ is a stopping time.
Because of the above we conclude that $X_{T}=X_{T^{-}}$ a.s.
I was thinking of the following:
From the equality between the expectations above, we would have $$f(X_{T})g(X_{T})=f(X_{T^{-}})g(X_{T}) \quad \text{a.s.}$$ Then $$f(X_{T})=f(X_{T^{-}}) \quad \text{a.s.}$$ This would hold for every $f\in C_{0},$ so $X_{T}=X_{T^{-}}$ a.s., but I have doubts about this argument. I don't think this is immediately true, is it?
How can we conclude this correctly?
Thanks in advance for any help.
Note that $$\mathbb{E}Y = \mathbb{E}Z$$ does not imply $Y=Z$ almost surely, and therefore the very first step of your reasoning doesn't work.
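To see this concretely, here is a minimal numerical sketch (my own illustration, not part of the proof): take $Y$ a fair $\pm 1$ coin and $Z \equiv 0$. Then $\mathbb{E}Y = \mathbb{E}Z = 0$, yet $\mathbb{P}(Y = Z) = 0$.

```python
import random

random.seed(0)

# Y is a fair +/-1 coin flip, Z is identically 0.
# Both have expectation 0, but Y never equals Z.
n = 100_000
ys = [random.choice([-1, 1]) for _ in range(n)]

sample_mean_Y = sum(ys) / n            # close to E[Y] = 0 = E[Z]
never_equal = all(y != 0 for y in ys)  # Y differs from Z on every draw

print(sample_mean_Y, never_equal)
```

So equality of expectations says nothing about pathwise equality; one really needs the distributional argument below.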
The idea is, essentially, to show that
$$\mathbb{E}(f(X_{T}) g(X_T)) = \mathbb{E}(f(X_{T-}) g(X_T)) \tag{1}$$
implies that the random vectors $U:=(X_T,X_T)$ and $V:= (X_{T-},X_T)$ have the same distribution. Once we know this, we can conclude that $X_T-X_T$ and $X_T-X_{T-}$ have the same distribution, and so $X_T-X_{T-}=0$ in distribution. This, however, already implies $X_T-X_{T-}=0$ almost surely.
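The final step, that equality in distribution with the constant $0$ forces almost sure equality, is standard; spelled out for completeness:

```latex
\text{Let } Z := X_T - X_{T-}. \text{ If } Z \overset{d}{=} 0, \text{ then for every } \varepsilon > 0,
\[
  \mathbb{P}(|Z| > \varepsilon) = \mathbb{P}(|0| > \varepsilon) = 0,
\]
\text{and therefore}
\[
  \mathbb{P}(Z \neq 0)
  = \mathbb{P}\Big( \bigcup_{n \ge 1} \{ |Z| > 1/n \} \Big)
  \le \sum_{n \ge 1} \mathbb{P}(|Z| > 1/n) = 0,
\]
\text{i.e. } Z = 0 \text{ almost surely.}
```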
There are several ways to prove that $U$ and $V$ have the same distribution; one possibility is to use a monotone class (or approximation) argument to show that $(1)$ implies
$$\mathbb{E}(h(X_T,X_T)) = \mathbb{E}(h(X_{T-},X_T))$$
for a suitably large class of functions $h$. Alternatively, we can observe that $(1)$ holds for bounded continuous functions $f,g$ (this follows from a simple truncation argument), and so, in particular,
$$\mathbb{E}\exp \big( i \xi X_T+ i \eta X_T \big)=\mathbb{E}\exp \big( i \xi X_{T-}+ i \eta X_T \big)$$
for any $\xi,\eta \in \mathbb{R}$. This means that $U$ and $V$ have the same characteristic function, and so $U=V$ in distribution.
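As a purely numerical aside (my own sketch, not part of the argument): characteristic functions are easy to estimate from samples, and the empirical version recovers the known characteristic function up to Monte Carlo error. For a standard normal sample the target is $e^{-\xi^2/2}$.

```python
import cmath
import math
import random

random.seed(1)

def ecf(sample, xi):
    """Empirical characteristic function E[exp(i*xi*X)] of a sample."""
    return sum(cmath.exp(1j * xi * x) for x in sample) / len(sample)

# Standard normal sample; its characteristic function is exp(-xi^2 / 2).
n = 100_000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]

for xi in (0.5, 1.0, 2.0):
    exact = math.exp(-xi * xi / 2.0)
    err = abs(ecf(xs, xi) - exact)
    print(f"xi={xi}: |ECF - exact| = {err:.4f}")  # small Monte Carlo error
```

The same recipe with pairs $(u,v)$ and $\exp(i\xi u + i\eta v)$ estimates the joint characteristic functions of $U$ and $V$ above.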