I'm asking if there are some relations/inequalities between those two:
$$ \mathbb{E}[f(X)(X - \mathbb{E}[X])] \tag{1}, $$ and
$$ \mathrm{Var}[X] = \mathbb{E}[(X - \mathbb{E}[X])^2] \tag{2}. $$
Let us assume $f$ is some regular bounded continuous function.
My guess is that (1) should be upper bounded in terms of the variance (2), and that (1) $\to 0$ whenever (2) $\to 0$.
Intuitively, when the variance (2) approaches $0$, $X$ concentrates at its mean $\mathbb{E}[X]$, so $X - \mathbb{E}[X] \to 0$ in probability, and hence (1) $\to 0$.
In the extreme case $\mathrm{Var}[X]=0$, $X$ is degenerate, so (1) $=0$.
So how to formally prove this, and how to find such upper bound if possible?
Using the Cauchy-Schwarz inequality $|\mathbb{E}(XY)|^2 \leqslant \mathbb{E}(X^2)\,\mathbb{E}(Y^2)$:
$|\mathbb{E}[f(X)(X-\mathbb{E}[X])]|^2 \leqslant \mathbb{E}[f(X)^2]\,\mathbb{E}[(X-\mathbb{E}[X])^2]$
so $|\mathbb{E}[f(X)(X-\mathbb{E}[X])]|^2 \leqslant \mathbb{E}[f(X)^2] \,\text{Var}[X]$.
Since $f$ is bounded, $\mathbb{E}[f(X)^2] \leqslant \sup_x |f(x)|^2$, which gives the explicit bound $|\mathbb{E}[f(X)(X-\mathbb{E}[X])]| \leqslant \sup_x |f(x)| \sqrt{\text{Var}[X]}$. Hence if $\text{Var}[X] \rightarrow 0$ then $\mathbb{E}[f(X)(X-\mathbb{E}[X])] \rightarrow 0$.
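A quick Monte Carlo sanity check of the Cauchy-Schwarz bound (the choices $f = \sin$ and $X \sim \mathcal{N}(1, \sigma^2)$ are arbitrary illustrations, not part of the argument):

```python
import math
import random

# Check |E[f(X)(X - E[X])]|^2 <= E[f(X)^2] * Var[X] on samples,
# and watch (1) shrink as Var[X] -> 0.
# f(x) = sin(x) is an arbitrary bounded continuous choice;
# X ~ Normal(1, sigma) with shrinking sigma.

random.seed(0)
f = math.sin

def check(sigma, n=200_000):
    xs = [random.gauss(1.0, sigma) for _ in range(n)]
    mean = sum(xs) / n
    lhs = (sum(f(x) * (x - mean) for x in xs) / n) ** 2   # |(1)|^2
    ef2 = sum(f(x) ** 2 for x in xs) / n                  # E[f(X)^2]
    var = sum((x - mean) ** 2 for x in xs) / n            # Var[X]
    return lhs, ef2 * var

for sigma in (1.0, 0.1, 0.01):
    lhs, rhs = check(sigma)
    print(f"sigma={sigma}: |(1)|^2 = {lhs:.3e} <= bound = {rhs:.3e}")
```

Note that the inequality holds exactly on the empirical distribution of any sample, not just in the limit, since Cauchy-Schwarz applies to the empirical measure as well.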
Additionally, if $x \mapsto f(x)^2$ is a concave function then Jensen's inequality could also be used: $\mathbb{E}[f(X)^2] \leqslant f(\mathbb{E}[X])^2$. The bound then becomes $f(\mathbb{E}[X])^2 \,\text{Var}[X]$, which only requires evaluating $f$ at the mean rather than knowing the full distribution of $f(X)$.
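A small numerical illustration of the Jensen step (the choices $f(x) = x^{1/4}$, so that $f(x)^2 = \sqrt{x}$ is concave, and $X \sim \mathrm{Exp}(1)$ are assumptions made for the example only):

```python
import random

# If g(x) = f(x)^2 is concave, Jensen gives E[f(X)^2] <= f(E[X])^2.
# Here f(x) = x**0.25 on X > 0, so f(x)^2 = sqrt(x) is concave.
# X ~ Exp(1) keeps X positive so f is well defined.

random.seed(1)
xs = [random.expovariate(1.0) for _ in range(200_000)]
n = len(xs)

lhs = sum(x ** 0.5 for x in xs) / n        # E[f(X)^2] = E[sqrt(X)]
rhs = (sum(xs) / n) ** 0.5                 # f(E[X])^2 = sqrt(E[X])
print(lhs, "<=", rhs)
```

For $\mathrm{Exp}(1)$ the two sides can be computed in closed form: $\mathbb{E}[\sqrt{X}] = \Gamma(3/2) = \sqrt{\pi}/2 \approx 0.886$, while $\sqrt{\mathbb{E}[X]} = 1$, so the gap is visible numerically.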