Consider the PDF of a bivariate normal distribution as follows:
$f(x,y)=\phi(x-z)\,\beta\, \phi(\beta(x+y))$,
where $\phi$ is the PDF of the standard normal distribution.
How can we show that the truncated variance $Var(X\mid X<v,\ Y>2v)$ is increasing in $v$?
Numerically, I can verify that the variance is increasing in $v$.
Define \begin{align*} F=\int_0^{\infty}\int_{0}^\infty \beta \phi(\beta(x+y+v))\,\phi(x-v+z)\, dy d x \end{align*} where $\beta>0$ and $z$ are some parameters.
Let $(X,Y)$ be random variables supported on the positive quadrant with probability density function $f(x,y)=\frac{1}{F}\beta \phi(\beta(x+y+v))\,\phi(x-v+z)$. Throughout this proof, $E$ denotes the expectation operator with respect to the distribution of $(X,Y)$.
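As a sanity check, $F$, $\mu_x$, and $\sigma_x^2$ can be approximated by brute-force grid integration. This is only a sketch: the parameter values $\beta=1$, $z=0$, $v=0$ are arbitrary sample choices, and the infinite domain $(0,\infty)^2$ is truncated at $L=8$, where the Gaussian factors are negligible.

```python
import numpy as np

def phi(u):
    """Standard normal PDF."""
    return np.exp(-u * u / 2) / np.sqrt(2 * np.pi)

def moments(beta, z, v, L=8.0, n=1200):
    """Midpoint-rule approximation of F, mu_x, sigma_x^2 on (0, L)^2.

    The weight is the unnormalized density
    beta*phi(beta*(x+y+v))*phi(x-v+z); L truncates the (0, inf)^2 domain.
    """
    h = L / n
    t = (np.arange(n) + 0.5) * h          # midpoints of the grid cells
    X, Y = np.meshgrid(t, t, indexing="ij")
    W = beta * phi(beta * (X + Y + v)) * phi(X - v + z)
    F = W.sum() * h * h
    mu_x = (X * W).sum() * h * h / F
    var_x = ((X - mu_x) ** 2 * W).sum() * h * h / F
    return F, mu_x, var_x

F, mu_x, var_x = moments(beta=1.0, z=0.0, v=0.0)
print(F, mu_x, var_x)
```

For $\beta=1$, $z=v=0$ there is an exact reference value: $F=\int_0^\infty \phi(x)\Phi(-x)\,dx=P(Z_1>0,\ Z_1+Z_2<0)=1/8$ for iid standard normals $Z_1,Z_2$ (a $45^\circ$ wedge of the plane), so the grid value should be close to $0.125$.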
It is easy to verify that
\begin{align*} &\partial_z \log(F)=v-z-\mu_x\\ &\partial_{z,z}\log(F)=-1+\sigma_x^2 \end{align*} where $\mu_x$ and $\sigma^2_x$ are the mean and variance of the random variable $X$; both depend on the parameters $v$ and $z$. Specifically, $\mu_x=\frac{1}{F}\int_0^{\infty}\int_{0}^\infty x \beta \phi(\beta(x+y+v))\,\phi(x-v+z)\, dy\, dx$ and $\sigma^2_x=\frac{1}{F}\int_0^{\infty}\int_{0}^\infty (x-\mu_x)^2 \beta \phi(\beta(x+y+v))\,\phi(x-v+z)\, dy\, dx$.
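These two identities can be spot-checked numerically by comparing central finite differences of $\log F$ in $z$ against grid-based $\mu_x$ and $\sigma_x^2$. A sketch, with arbitrary sample parameters $\beta=1$, $z=0.2$, $v=0.3$ and the domain truncated at $L=8$:

```python
import numpy as np

def phi(u):
    """Standard normal PDF."""
    return np.exp(-u * u / 2) / np.sqrt(2 * np.pi)

def stats(beta, z, v, L=8.0, n=1200):
    """Return (log F, mu_x, sigma_x^2) by midpoint integration on (0, L)^2."""
    h = L / n
    t = (np.arange(n) + 0.5) * h
    X, Y = np.meshgrid(t, t, indexing="ij")
    W = beta * phi(beta * (X + Y + v)) * phi(X - v + z)
    F = W.sum() * h * h
    mu = (X * W).sum() * h * h / F
    var = ((X - mu) ** 2 * W).sum() * h * h / F
    return np.log(F), mu, var

beta, z, v, dz = 1.0, 0.2, 0.3, 1e-3
lF_m, _, _ = stats(beta, z - dz, v)
lF_0, mu, var = stats(beta, z, v)
lF_p, _, _ = stats(beta, z + dz, v)

d1 = (lF_p - lF_m) / (2 * dz)            # finite-difference d/dz log F
d2 = (lF_p - 2 * lF_0 + lF_m) / dz**2    # finite-difference d^2/dz^2 log F
print(d1, v - z - mu)   # should agree
print(d2, -1 + var)     # should agree
```

The check is clean because the identities hold exactly even for the discretized sums: differentiating the weight in $z$ multiplies it by $-(x-v+z)$ pointwise, so the only error comes from the finite-difference step and the domain truncation.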
We can verify that the main problem reduces to showing $\partial_{z,z,v} \log(F)>0$.
Define $\mu_t$ as the expected value of $X+Y$. We can evaluate $\mu_t$ as follows: \begin{align*} \mu_t&=\frac{1}{F}\int_0^{\infty}\int_{0}^\infty (x+y) \beta \phi(\beta(x+y+v))\,\phi(x-v+z)\, dy\, dx \\ &=-v-\frac{1}{F}\int_0^{\infty}\int_{0}^\infty \phi'(\beta(x+y+v))\,\phi(x-v+z)\, dy\, dx\\ &=-v+\frac{1}{\beta F} \int_{0}^\infty \phi(\beta(x+v))\,\phi(x-v+z)\, dx, \end{align*} using $u\phi(u)=-\phi'(u)$ with $u=\beta(x+y+v)$ in the second step and integrating out $y$ in the third. Hence, $\beta^2(v+\mu_t)=\frac{1}{ F} \int_{0}^\infty \beta \phi(\beta(x+v))\,\phi(x-v+z)\, dx$, which we use below.
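The resulting identity $\beta^2(v+\mu_t)=\frac{1}{F}\int_0^\infty \beta\phi(\beta(x+v))\,\phi(x-v+z)\,dx$ can be spot-checked by grid integration. A sketch with arbitrary sample parameters ($\beta=1.5$, $z=0.2$, $v=0.3$) and the domain truncated at $L=8$:

```python
import numpy as np

def phi(u):
    """Standard normal PDF."""
    return np.exp(-u * u / 2) / np.sqrt(2 * np.pi)

beta, z, v = 1.5, 0.2, 0.3
L, n = 8.0, 1200
h = L / n
t = (np.arange(n) + 0.5) * h
X, Y = np.meshgrid(t, t, indexing="ij")
W = beta * phi(beta * (X + Y + v)) * phi(X - v + z)
F = W.sum() * h * h

# mu_t = E[X + Y] under the normalized density W / F
mu_t = ((X + Y) * W).sum() * h * h / F

lhs = beta**2 * (v + mu_t)
rhs = (beta * phi(beta * (t + v)) * phi(t - v + z)).sum() * h / F
print(lhs, rhs)  # should agree
```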
Since $\partial_{z,z}\log(F)=-1+\sigma_x^2$, from the definition of $\sigma^2_x$ (differentiating under the integral sign gives $\partial_v \sigma_x^2=\operatorname{Cov}\!\big((X-\mu_x)^2,\,(X-v+z)-\beta^2(X+Y+v)\big)$), \begin{align*} \partial_{z,z,v}\log(F)&=\partial_v \sigma_x^2=E\left[(X-\mu_x)^3-\beta^2 (X-\mu_x)^2(X+Y-\mu_t) \right]\\ &=E\left[(X-\mu_x)^3-\beta^2 (X-\mu_x)^2(X+Y+v-v-\mu_t) \right]. \end{align*} We expand parts of this expectation as follows: \begin{align*} &E\left[\beta^2 (X-\mu_x)^2(v+\mu_t)\right]=\sigma_x^2\,\beta^2(v+\mu_t)=\sigma_x^2\,\frac{1}{F} \int_{0}^\infty \beta \phi(\beta(x+v))\,\phi(x-v+z)\, dx,\\ &E[\beta^2 (X-\mu_x)^2(X+Y+v)]=-\frac{\beta^2}{F}\int_0^{\infty}\int_{0}^\infty (x-\mu_x)^2 \phi'(\beta(x+y+v))\,\phi(x-v+z)\, dy\, dx\\ &\qquad\qquad=\frac{1}{F}\int_0^{\infty} (x-\mu_x)^2 \beta \phi(\beta(x+v))\,\phi(x-v+z)\, dx. \end{align*}
Let $f_y(y)$ be the marginal density of $Y$. Notice that $f_y(0)=\frac{1}{F}\int_0^{\infty} \beta \phi(\beta(x+v))\,\phi(x-v+z)\, dx$. Also notice that $\frac{\beta \phi(\beta(x+v))\,\phi(x-v+z)}{\int_0^{\infty} \beta \phi(\beta(x+v))\,\phi(x-v+z)\, dx}$ is the conditional density of $X$ given $Y=0$. Hence, \begin{align*} E\left[-\beta^2 (X-\mu_x)^2(X+Y+v-v-\mu_t) \right]=f_y(0)\left(E[(X-\mu_x)^2]-E[(X-\mu_x)^2\mid Y=0]\right). \end{align*}
Hence, \begin{align*} \partial_{z,z,v}\log(F)=E\left[(X-\mu_x)^3\right]+f_y(0)\left(E[(X-\mu_x)^2]-E[(X-\mu_x)^2\mid Y=0]\right). \end{align*}
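This decomposition can be spot-checked numerically: a central finite difference of $\sigma_x^2$ in $v$ should match $E[(X-\mu_x)^3]+f_y(0)\big(E[(X-\mu_x)^2]-E[(X-\mu_x)^2\mid Y=0]\big)$. A sketch at the arbitrary parameter point $\beta=1$, $z=0$, $v=0$, with the domain truncated at $L=8$:

```python
import numpy as np

def phi(u):
    """Standard normal PDF."""
    return np.exp(-u * u / 2) / np.sqrt(2 * np.pi)

def var_x(beta, z, v, L=8.0, n=1200):
    """sigma_x^2 by midpoint integration on (0, L)^2."""
    h = L / n
    t = (np.arange(n) + 0.5) * h
    X, Y = np.meshgrid(t, t, indexing="ij")
    W = beta * phi(beta * (X + Y + v)) * phi(X - v + z)
    F = W.sum() * h * h
    mu = (X * W).sum() * h * h / F
    return ((X - mu) ** 2 * W).sum() * h * h / F

beta, z, v, dv = 1.0, 0.0, 0.0, 1e-3
L, n = 8.0, 1200
h = L / n
t = (np.arange(n) + 0.5) * h
X, Y = np.meshgrid(t, t, indexing="ij")
W = beta * phi(beta * (X + Y + v)) * phi(X - v + z)
F = W.sum() * h * h
mu = (X * W).sum() * h * h / F
m2 = ((X - mu) ** 2 * W).sum() * h * h / F    # E[(X - mu_x)^2]
m3 = ((X - mu) ** 3 * W).sum() * h * h / F    # E[(X - mu_x)^3]

# quantities along the slice y = 0
w0 = beta * phi(beta * (t + v)) * phi(t - v + z)
fy0 = w0.sum() * h / F                         # marginal density of Y at 0
m2_cond = ((t - mu) ** 2 * w0).sum() / w0.sum()  # E[(X - mu_x)^2 | Y = 0]

lhs = (var_x(beta, z, v + dv) - var_x(beta, z, v - dv)) / (2 * dv)
rhs = m3 + fy0 * (m2 - m2_cond)
print(lhs, rhs)  # should agree; both positive at this parameter point
```

This only verifies the identity and the sign at one parameter point; it is not a substitute for the monotonicity proof itself.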
Notice that $(X,Y)$ follows a truncated bivariate normal distribution with negative correlation. Hence the conditional distribution of $X$ given $Y=0$ second-order stochastically dominates the marginal distribution of $X$. Then, since $(x-\mu_x)^2$ is convex in $x$, we must have $E[(X-\mu_x)^2]>E[(X-\mu_x)^2\mid Y=0]$. Also, since $(X,Y)$ is truncated from below, we must have $E\left[(X-\mu_x)^3\right]>0$. In conclusion, $\partial_{z,z,v}\log(F)>0$.
(The last two claims can be written more rigorously!)