Let $X=AB$, where $A$ and $B$ are random variables that are NOT independent. I know that $A>0$ and $B\geq b >0$, where $b$ is a deterministic constant.
Then for any constant $a$, the probability $P(X>a) > P(Ab>a)$, because $P(X>Ab)=1$.
Now suppose I know $p=P(B>b')$ for some $b'>b$. Is there any relation (equality or inequality) among $P(X>a)$, $P(Ab'>a)$, and $p$?
Not quite an answer, but too long for a comment.
I think we need to be careful about equality. In the setup you said $A>0, B\ge b> 0$. In that case you can only conclude $X = AB \ge Ab$, but you cannot conclude $P(X > Ab)=1$. In fact, $P(X>Ab) = P(B>b)$ which can be any value from $0$ to $1$.
Also, you said $P(X > a) > P(Ab > a)$, but that is again untrue in general. What you can conclude is $P(X > a) \ge P(Ab > a)$. Strict inequality can fail for two reasons: first, it is possible that $B \equiv b$, in which case $X=Ab$ and the two probabilities are equal; second, for certain values of $a$ (e.g. $a < 0$) we have $P(Ab> a) =1$ already, and $P(X > a)$ cannot possibly exceed $1$.
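To make this concrete, here is a quick Monte Carlo sketch. The specific construction (driving both $A$ and $B$ by one exponential variable $Z$, with $b=1$) is my own illustrative choice, not from the question; it just produces a dependent pair satisfying $A>0$ and $B\ge b$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
b = 1.0

# Dependent pair: A and B are both functions of the same Z, so they
# are NOT independent.  A > 0 and B >= b > 0 hold by construction.
Z = rng.exponential(size=n)
A = Z + 0.5
B = b + np.maximum(Z - 1.0, 0.0)
X = A * B

a = 2.0
print(np.mean(X > a), np.mean(A * b > a))  # P(X>a) >= P(Ab>a), not strict in general
print(np.mean(X > A * b), np.mean(B > b))  # P(X>Ab) equals P(B>b), here about 0.37, not 1
```

The last line is the point: since $A>0$, the events $\{X>Ab\}$ and $\{B>b\}$ coincide, and in this construction $P(B>b)=P(Z>1)\approx 0.37$, well below $1$.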
Anyway, on to your actual question: in general, $P(Ab' > a)$ depends only on the marginal distribution of $A$, and $P(B > b')$ only on the marginal distribution of $B$, but $P(X > a)$ depends on the joint distribution. Since $A$ and $B$ are dependent, you can make the $3$ quantities look like a lot of different things, so I don't think you can prove anything in general.
E.g. one natural way to proceed is:
$$ \begin{align} P(X > a) &= P(X > a | B > b') P(B > b') + P(X > a | B \le b') P(B \le b') \\ & \ge P(Ab' > a | B > b') P(B > b') + 0 \end{align}$$
So this has $2$ of the $3$ quantities you want. However, since $A$ and $B$ are dependent, we cannot even say $P(Ab' > a \mid B> b') = P(Ab' > a)$, so I don't see a way to proceed further. I think I can make up examples where $P(Ab' > a \mid B>b')$ is either $>$ or $<$ $P(Ab' > a)$.
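Here is a sketch of two such examples, again Monte Carlo with constructions of my own (both pairs driven by one exponential $Z$, with $b=1$, $b'=2$); they show the conditional probability landing on either side of the marginal one:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
Z = rng.exponential(size=n)
b, bprime = 1.0, 2.0

# Positive dependence: large B goes with large A, so conditioning
# on B > b' pushes P(Ab' > a) up.
A1, B1 = Z, b + Z          # B1 > b' iff Z > 1
a = 3.0
cond1 = np.mean(A1[B1 > bprime] * bprime > a)
marg1 = np.mean(A1 * bprime > a)
print(cond1, marg1)        # conditional > marginal

# Negative dependence: large B goes with small A, so conditioning
# on B > b' pushes P(Ab' > a) down.
A2, B2 = 1.0 / (1.0 + Z), b + Z
a = 1.0
cond2 = np.mean(A2[B2 > bprime] * bprime > a)
marg2 = np.mean(A2 * bprime > a)
print(cond2, marg2)        # conditional < marginal
```

In the first pair the conditional probability is $P(Z > 1.5 \mid Z > 1) = e^{-0.5} \approx 0.61$ versus the marginal $e^{-1.5} \approx 0.22$; in the second it is exactly $0$ versus roughly $0.63$, so the inequality can genuinely go either way.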