Let $X \geq 0$ be a non-negative random variable, and $\mathcal{F}$ be any $\sigma$-algebra.
Does it hold that
$$X+\mathbb{E}[X]\geq \mathbb{E}[X|\mathcal{F}]$$
almost surely?
The intuition is clear to me: more information gives a better estimate of $X$, so the value of $\mathbb{E}[X|\mathcal{F}]$ should lie between $X$ and $\mathbb{E}[X]$. However, I am having trouble proving this rigorously. Any suggestions as to which theorems I could use?
The intuition is that $\mathbb{E}[X \mid \mathcal{F}]$ will be at least as close to $X$ as $\mathbb{E}[X]$ is, because it uses more information about $X$. This is not true almost surely. Consider a case where $X=a>0$ with probability $1-2\epsilon$, $X=2a$ with probability $\epsilon$, and $X=c \gg a$ with probability $\epsilon$. Then for $0<\epsilon \ll a/c$, we have $\mathbb{E}[X] \approx a$. Take $\mathcal{F}$ to be generated by $\{ X=a \}$. If $\omega$ is such that $X(\omega)=2a$, then $\mathbb{E}[X \mid \mathcal{F}](\omega) = a + c/2 \approx c/2$, while $X(\omega)+\mathbb{E}[X] \approx 3a \ll c/2$, so the proposed inequality fails at $\omega$. In the same spirit, $c/2 - a \gg 2a - a$: at $\omega$ the conditional expectation is in fact *farther* from $X$ than $\mathbb{E}[X]$ is.
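As a numeric sanity check of the counterexample (the concrete values $a=1$, $c=1000$, $\epsilon=10^{-4}$ are my own illustrative choices, satisfying $\epsilon \ll a/c$):

```python
from fractions import Fraction

# Illustrative parameters: a = 1, c = 1000, eps = 1/10000 (so eps << a/c).
a, c, eps = Fraction(1), Fraction(1000), Fraction(1, 10000)

# Distribution of X: P(X=a) = 1-2eps, P(X=2a) = eps, P(X=c) = eps.
E_X = a * (1 - 2 * eps) + 2 * a * eps + c * eps  # = a + c*eps, roughly a

# F is generated by {X = a}. On the complement (probability 2*eps),
# E[X|F] averages X over {X=2a} and {X=c}, which are equally likely:
E_X_given_F = (2 * a * eps + c * eps) / (2 * eps)  # = a + c/2

# At an outcome with X = 2a, test the claimed inequality X + E[X] >= E[X|F]:
lhs = 2 * a + E_X   # roughly 3a
rhs = E_X_given_F   # roughly c/2
holds = lhs >= rhs  # False: the inequality fails on this event
```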
This is, however, true in most reasonable metrics that weigh events by their probabilities. For example, it is true in $L^2$, as follows from the law of total variance: $\operatorname{Var}(X)=\mathbb{E}[\operatorname{Var}(X \mid \mathcal{F})]+\operatorname{Var}(\mathbb{E}[X \mid \mathcal{F}])$. Since $\mathbb{E}[\operatorname{Var}(X \mid \mathcal{F})]=\mathbb{E}[(X-\mathbb{E}[X \mid \mathcal{F}])^2]$ and $\operatorname{Var}(X)=\mathbb{E}[(X-\mathbb{E}[X])^2]$, this gives $\mathbb{E}[(X-\mathbb{E}[X \mid \mathcal{F}])^2] \le \mathbb{E}[(X-\mathbb{E}[X])^2]$.
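For the same three-atom distribution as above, the law of total variance and the resulting $L^2$ inequality can be verified exactly (again with my illustrative parameters $a=1$, $c=1000$, $\epsilon=10^{-4}$):

```python
from fractions import Fraction

a, c, eps = Fraction(1), Fraction(1000), Fraction(1, 10000)

# Atoms of X with their probabilities; F is generated by {X = a}.
atoms = [(a, 1 - 2 * eps), (2 * a, eps), (c, eps)]
E_X = sum(x * p for x, p in atoms)

# E[X|F]: equals a on {X=a}, and a + c/2 on the complement.
cond = {a: a, 2 * a: a + c / 2, c: a + c / 2}

# E[(X - E[X|F])^2] = E[Var(X|F)], versus Var(X) = E[(X - E[X])^2].
err_cond = sum((x - cond[x]) ** 2 * p for x, p in atoms)
var_X = sum((x - E_X) ** 2 * p for x, p in atoms)
var_condexp = sum((cond[x] - E_X) ** 2 * p for x, p in atoms)

# Law of total variance, exactly:
assert var_X == err_cond + var_condexp
# Hence the conditional expectation is at least as close in L^2:
assert err_cond <= var_X
```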