I'm looking for a lower bound on $\mathbb{E}f(X_t)$, where $f$ is a smooth, non-negative, increasing function with $f(0)=0$ and $X_t$ solves the stochastic differential equation $X_t = x + \int_0^t b(X_s)\,ds + \int_0^t \sigma(X_s)\,dw_s$ with $x>0$.
I'm aware of many upper bounds based on linear growth and Lipschitz constants, e.g., $\mathbb{E}|X_t|^p \le Ce^{\alpha t}$ or $\mathbb{E}|X_t-X_s|^p \le Cg(|t-s|)$, etc.
For a lower bound I've played around with the second moment method and the reverse Markov inequality, and flipped through Oksendal, Karatzas & Shreve, Revuz & Yor, and Mao, but I'm stumped. From Markov's inequality and a Girsanov argument I can show that for any $t>0$, $\mathbb{E}f(X_t) \ge P[f(X_t)>1]>0$. However, I'm not aware of any results based on linear growth or Lipschitz constants analogous to the upper bounds above — something like, say, $\mathbb{E}|X_t|^p \ge Cg(t)$ for some decreasing function $g(t)$. Is anyone aware of a result like this?
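For concreteness, the crude bound $\mathbb{E}f(X_t) \ge P[f(X_t)>1]$ is easy to check numerically. The sketch below uses an Euler–Maruyama discretization with illustrative coefficients chosen by me (mean-reverting drift, constant diffusion, $f(x)=x^2$), not anything from a particular reference:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative (hypothetical) coefficients: mean-reverting drift toward 2,
# constant diffusion, and f(x) = x^2, which is smooth, increasing on
# [0, infinity), and satisfies f(0) = 0.
b = lambda x: -x + 2.0
sigma = 0.5
f = lambda x: x**2

# Euler-Maruyama simulation of X_t = x + int b(X) ds + int sigma dW.
x0, T, n_steps, n_paths = 1.0, 1.0, 500, 50_000
dt = T / n_steps
X = np.full(n_paths, x0)
for _ in range(n_steps):
    X += b(X) * dt + sigma * rng.normal(0.0, np.sqrt(dt), size=n_paths)

lhs = f(X).mean()          # Monte Carlo estimate of E f(X_T)
rhs = (f(X) > 1.0).mean()  # Monte Carlo estimate of P[f(X_T) > 1]

# Markov's inequality: E f(X_T) >= 1 * P[f(X_T) >= 1] >= P[f(X_T) > 1].
assert lhs >= rhs
```

The bound holds by construction, but the simulation also shows how loose it is: the right-hand side is capped at $1$ no matter how large $\mathbb{E}f(X_t)$ grows.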
A suggestion: all the non-decreasing functions you mention, like $|x|^p$, also happen to be convex. If you can bound the drift below, say by $b(x)\geq \beta$, then Jensen's inequality gives $$ \mathbb{E}f(X_t) \geq f(\mathbb{E}X_t) \geq f(x + \beta t), $$ where the first inequality is convexity and the second is monotonicity of $f$ together with $\mathbb{E}X_t \geq x+\beta t$; the latter holds because the stochastic integral is a mean-zero martingale, so $\mathbb{E}X_t = x + \int_0^t \mathbb{E}\,b(X_s)\,ds \geq x + \beta t$.
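Here is a quick Monte Carlo sanity check of this Jensen bound. The coefficients are illustrative assumptions of mine: a drift bounded below by $\beta = 0.5$, constant diffusion, and the convex choice $f(x)=x^2$ (increasing on $[0,\infty)$ with $f(0)=0$):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical coefficients satisfying the assumptions:
# drift bounded below by beta, constant diffusion.
beta = 0.5
b = lambda x: 1.0 + 0.5 * np.sin(x)  # b(x) >= 1 - 0.5 = beta for all x
sigma = 0.3                          # constant diffusion coefficient
f = lambda x: x**2                   # convex, increasing on [0, inf), f(0) = 0

# Euler-Maruyama simulation started from x = 1.
x0, T, n_steps, n_paths = 1.0, 1.0, 1_000, 100_000
dt = T / n_steps
X = np.full(n_paths, x0)
for _ in range(n_steps):
    X += b(X) * dt + sigma * rng.normal(0.0, np.sqrt(dt), size=n_paths)

mc_estimate = f(X).mean()        # Monte Carlo estimate of E f(X_T)
jensen_bound = f(x0 + beta * T)  # the lower bound f(x + beta * t) = 2.25

# The estimate should sit comfortably above the Jensen bound.
assert mc_estimate >= jensen_bound
```

Note that the bound only uses the drift; it is deterministic in $t$ and increasing, which matches the direction of lower bound asked for (a decreasing $g(t)$ would be needed only if the drift could push $X_t$ toward zero).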