Let $V(x,y)\in C^\infty(\mathbb R^{d_1}\times\mathbb R^{d_2})$ be a uniformly convex function with polynomial growth, i.e. there exist $\alpha>0$, $n\in\mathbb N$, $k>0$ such that $$\operatorname{Hess} V(x,y) \,\geq\, \alpha\, I \quad\textrm{for all }(x,y)\in\mathbb R^{d_1}\times\mathbb R^{d_2}\,,$$ $$V(x,y) \,\leq\, k\,(|x|^{2n}+|y|^{2n}) \quad\textrm{for }|x|^2+|y|^2\textrm{ large enough}\;.$$
We may assume w.l.o.g. $\int\!\int e^{-V(x,y)} dx\,dy=1$, so that $e^{-V(x,y)}dx\,dy\,$ is a log-concave probability measure.
I am interested in the conditional probability measure. In particular I wonder whether (perhaps under additional assumptions on $V$ and its derivatives) we have $$ \frac{\int |x|^{2p}\,e^{-V(x,y)}\,dx}{\int e^{-V(x,y)} dx} \,-\, \left(\frac{\int |x|^{p}\,e^{-V(x,y)}\,dx}{\int e^{-V(x,y)} dx}\right)^2 \,\leq\, h_1\,|y|^{2p} + h_0$$ for every $p \in \mathbb N$ and suitable constants $h_1,h_0\geq0$ (depending on $p$ and $V$).
In other words: is the variance of $|x|^{p}$ given $y$ bounded by a polynomial in $|y|$?
Example. The answer is yes in the case of a Gaussian measure. Indeed, if $e^{-V(x,y)}dx\,dy\,$ is the Gaussian measure with mean $0$ and positive-definite covariance matrix $A$ (partitioned into blocks $A_{11},A_{12},A_{21},A_{22}$), then the conditional measure given $y$ turns out to be the Gaussian measure with mean $A_{12}A_{22}^{-1}y$ and covariance matrix $A_{11}-A_{12}A_{22}^{-1}A_{12}^T$. A simple change of variables $x\mapsto x+A_{12}A_{22}^{-1}y$ then allows one to conclude.
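The two conditional-Gaussian formulas above can be sanity-checked numerically. The sketch below (with a covariance matrix chosen arbitrarily for illustration, $d_1=d_2=1$) recovers $A_{12}A_{22}^{-1}$ as the least-squares regression slope of $x$ on $y$, and the Schur complement $A_{11}-A_{12}A_{22}^{-1}A_{12}^T$ as the residual variance:

```python
import numpy as np

# If (x, y) ~ N(0, A), then E[x | y] = A12 A22^{-1} y, so the least-squares
# regression coefficient of x on y should recover A12 A22^{-1}, and the
# residual variance should recover the Schur complement A11 - A12 A22^{-1} A12^T.
rng = np.random.default_rng(0)
A = np.array([[2.0, 0.8],
              [0.8, 1.0]])                   # positive definite, d1 = d2 = 1
samples = rng.multivariate_normal(np.zeros(2), A, size=200_000)
x, y = samples[:, 0], samples[:, 1]

slope_mc = (x * y).mean() / (y * y).mean()   # least-squares slope through 0
slope_th = A[0, 1] / A[1, 1]                 # A12 A22^{-1}

resid_var = np.var(x - slope_th * y)         # empirical conditional variance
schur = A[0, 0] - A[0, 1]**2 / A[1, 1]       # A11 - A12 A22^{-1} A12^T

print(slope_mc, slope_th)                    # agree up to Monte Carlo error
print(resid_var, schur)
```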
Edit. After computations in the case of a Gaussian measure, and after some simple numerics for a non-Gaussian measure ($2n=4$), I suspect the same inequality also holds for expectations. Namely: $$ \frac{\int |x|^{p}\,e^{-V(x,y)}\,dx}{\int e^{-V(x,y)} dx} \,\leq\, h_1\,|y|^p + h_0$$ for every $p\in\mathbb N$ and suitable constants $h_0,h_1\geq0$ depending on $p$ and $V$. Of course this would imply the previous inequality for variances: drop the subtracted square and apply the bound with $2p$ in place of $p$.
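For what it's worth, the kind of numerics alluded to above can be reproduced with a one-dimensional quadrature sketch. The potential below is my own illustrative choice (not the one from the original experiment): $V(x,y)=x^4+y^4+x^2+y^2+xy$ is uniformly convex of degree $2n=4$, and the ratio of the conditional moment to $1+|y|^p$ stays bounded over a range of $y$:

```python
import numpy as np

# Hypothetical quartic example: V(x,y) = x^4 + y^4 + x^2 + y^2 + x*y.
# Hess V = [[12x^2+2, 1], [1, 12y^2+2]] >= I, so V is uniformly convex.
def V(x, y):
    return x**4 + y**4 + x**2 + y**2 + x*y

def conditional_moment(p, y, x_max=10.0, n_grid=4001):
    """E[|x|^p | y], computed by Riemann sums over the conditional density."""
    x = np.linspace(-x_max, x_max, n_grid)
    w = np.exp(-(V(x, y) - V(0.0, y)))   # shift by V(0,y) for stability;
    num = (np.abs(x)**p * w).sum()       # the shift cancels in the ratio,
    den = w.sum()                        # as does the grid spacing dx
    return num / den

p = 3
ys = np.linspace(-20, 20, 41)
moments = np.array([conditional_moment(p, y) for y in ys])
# Conjectured polynomial bound E[|x|^p | y] <= h1 |y|^p + h0 would make
# this ratio bounded uniformly in y:
ratios = moments / (1.0 + np.abs(ys)**p)
print(ratios.max())
```

For this particular $V$ the conditional mode behaves like $|y|^{1/3}$ for large $|y|$, so the moments grow much more slowly than $|y|^p$ and the ratio is comfortably bounded.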
Edit 2. By continuity of the functions involved, an equivalent formulation of the previous inequality is: $$ L\equiv \limsup_{|y|\to\infty} \frac{\int |x|^{p}\,e^{-V(x,y)}\,dx}{|y|^p \int e^{-V(x,y)} dx} <\infty \;.$$
Edit 3. Conjecture: if $V(x,y)$ is a uniformly convex polynomial of degree $2n\geq 2$ and $p\in\mathbb N$, there exists a positive-definite covariance matrix $A$ such that $$ \frac{\int |x|^{2p} e^{-V(x,y)} dx}{\int e^{-V(x,y)} dx} \,\leq\, \frac{\int |x|^{2p} e^{-G(x,y)} dx}{\int e^{-G(x,y)} dx}$$ where $G(x,y) = \frac{1}{2}\langle A^{-1}(x-x_0,y-y_0) , (x-x_0,y-y_0)\rangle $ and $(x_0,y_0)$ is the unique minimum point of $V$. Now in the Gaussian case we can perform the computations explicitly, finding $$ \frac{\int |x|^{2p} e^{-G(x,y)} dx}{\int e^{-G(x,y)} dx} \leq h_1\,|y|^{2p} + h_0 $$ for suitable constants $h_0,h_1$. Thus if the conjecture holds true, it would prove the desired bound for conditional moments, at least when $V$ is a polynomial.
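For the record, here is one way the Gaussian bound can be made explicit in the one-dimensional case $d_1=d_2=1$, writing $m(y)$ for the conditional mean (affine in $y$) and $\sigma^2$ for the conditional variance of $e^{-G}$ given $y$: $$ \frac{\int |x|^{2p} e^{-G(x,y)}\,dx}{\int e^{-G(x,y)}\,dx} \,=\, \mathbb E\,|m(y)+\sigma Z|^{2p} \,\leq\, 2^{2p-1}\left(|m(y)|^{2p} + \sigma^{2p}\,\mathbb E|Z|^{2p}\right), \qquad Z\sim N(0,1)\,,$$ using the elementary inequality $(a+b)^{2p}\leq 2^{2p-1}(a^{2p}+b^{2p})$. Since $m(y)$ is affine in $y$, the right-hand side is indeed of the form $h_1|y|^{2p}+h_0$.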