Suppose $X$ is a continuous random variable that can take any value on the real line. Define $A = X\cdot\mathbf{1}\{X<0\}$ and $B = X\cdot\mathbf{1}\{X>0\}$, i.e., $A$ equals $X$ on the event $\{X<0\}$ and is $0$ otherwise, and $B$ equals $X$ on $\{X>0\}$ and is $0$ otherwise.
Is there a general relationship between variance(X), variance(A), and variance(B)?
UPDATE: Siong and Canardini provide the same answer. Unfortunately, my simulation does not agree, so I wonder where the mistake is.
In the following I draw 10 numbers (MATLAB, `randn`) and try to implement the given answers. What am I doing wrong?

$$X=A+B$$
$$Var(X)=Var(A)+Var(B)+2Cov(A,B)$$
$$Cov(A,B)=\mathbb{E}[AB]-\mathbb{E}[A]\mathbb{E}[B]=-\mathbb{E}[A]\mathbb{E}[B]\geq 0,$$
since $AB=0$ almost surely (at most one of $A$, $B$ is nonzero) and $\mathbb{E}[A]\leq 0\leq\mathbb{E}[B]$.
$$Var(X)=Var(A)+Var(B)-2\mathbb{E}[A]\mathbb{E}[B]$$
$$Var(A)+Var(B) \leq Var(X)$$
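The identity can be checked numerically. Below is a sketch in Python/NumPy rather than MATLAB (the original simulation code is not shown, so the setup is an assumption); note that $A$ and $B$ must keep their zeros, i.e., $A$ is $X$ on $\{X<0\}$ and $0$ elsewhere, not just the sub-vector of negative draws. With population-style (`ddof=0`) moments the identity holds exactly even for a sample of 10:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(10)          # 10 draws, as in the question

# A = X * 1{X < 0} and B = X * 1{X > 0}: zeros are kept, so a + b == x.
a = np.where(x < 0, x, 0.0)
b = np.where(x > 0, x, 0.0)

# Population-style (ddof=0) moments make the decomposition exact in-sample:
# var(X) = var(A) + var(B) + 2*cov(A, B), and cov(A, B) = -mean(A)*mean(B)
# because A*B = 0 elementwise.
var_x = x.var()
var_a, var_b = a.var(), b.var()
cov_ab = np.cov(a, b, ddof=0)[0, 1]

print(var_x, var_a + var_b + 2 * cov_ab)   # equal up to rounding
print(cov_ab, -a.mean() * b.mean())        # equal up to rounding
print(var_a + var_b <= var_x)              # True, since cov_ab >= 0
```

If the sub-vectors of negative and positive draws are used instead (dropping the zeros), the sample variances no longer refer to the same random variables $A$ and $B$, and the inequality can appear to fail.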