Suppose $X$ and $Y$ are independent random variables with $X \sim \mathrm{Gamma}(1,b)$ and $Y \sim \mathrm{Gamma}(n-1,b)$. Hence $X$ and $Y$ are positive, and $\frac{X}{X+Y} \sim \mathrm{Beta}(1, n-1)$.
Let $g(X+Y)=E\left[\mathbf{1}_{X>1} \mid X+Y\right]$, and let $c$ be a value taken by $X+Y$, i.e. $X+Y=c$.
$$g(c)=E\left[\mathbf{1}_{X>1} \mid X+Y=c\right]=E\left[\mathbf{1}_{\frac{X}{X+Y}>\frac{1}{c}} \mid X+Y=c\right]=E\left[\mathbf{1}_{\frac{X}{X+Y}>\frac{1}{c}}\right]$$
Why is there independence? Is there a general rule?
Edit: adapted after the question was rewritten to introduce the Gamma distributions.
There is no general rule. In your setting, the identity crucially relies on the properties of the Gamma distribution. Specifically (see, e.g., the Wikipedia article on the Gamma distribution): if $X \sim \mathrm{Gamma}(a,\theta)$ and $Y \sim \mathrm{Gamma}(b,\theta)$ are independent, then $\frac{X}{X+Y} \sim \mathrm{Beta}(a,b)$ and, moreover, $\frac{X}{X+Y}$ is independent of $X+Y$ (this is Lukacs' theorem). That independence is exactly what justifies dropping the conditioning in the last equality.
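If you want to convince yourself numerically, here is a minimal Monte Carlo sketch of that independence property (the particular values of $n$ and $b$ are arbitrary choices for illustration):

```python
import numpy as np

# Sketch: check numerically that R = X/(X+Y) is independent of S = X+Y
# when X ~ Gamma(1, b) and Y ~ Gamma(n-1, b) are independent
# (shape/scale parametrization; n = 5, b = 2 chosen arbitrarily).
rng = np.random.default_rng(0)
n, b, N = 5, 2.0, 200_000
X = rng.gamma(shape=1.0, scale=b, size=N)
Y = rng.gamma(shape=n - 1.0, scale=b, size=N)

R = X / (X + Y)   # should be Beta(1, n-1), with mean 1/n
S = X + Y         # should be Gamma(n, b)

# Correlation between the ratio and the sum is numerically zero.
corr = np.corrcoef(R, S)[0, 1]
print(f"corr(R, S) = {corr:.4f}")

# The distribution of R barely moves across slices of S.
low = R[S < np.quantile(S, 0.25)]
high = R[S > np.quantile(S, 0.75)]
print(f"mean of R | small S: {low.mean():.3f}")
print(f"mean of R | large S: {high.mean():.3f}")
```

Both conditional means should agree with the unconditional mean $1/n = 0.2$, which is what licenses replacing the conditional expectation by the unconditional one.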
The identity, however, is false in general, and really relies on the particular Gamma distribution. For a very simple counterexample, consider $X$ uniform on $\{\frac{1}{4},\frac{3}{4}\}$, $Y$ uniform on $\{\frac{1}{10},\frac{7}{4}\}$ and independent of $X$, and $C=2$.
We first have, since $X<1$ a.s., $$ \mathbb{E}[\mathbf{1}_{\frac{X}{X+Y} > \frac{1}{C}} \mid X+Y=C] = \mathbb{E}[\mathbf{1}_{X>1} \mid X+Y=C] =0\,. $$ (Note that $X+Y=2$ has non-zero probability: namely, probability $1/4$).
However, without the conditioning, $$ \mathbb{E}[\mathbf{1}_{\frac{X}{X+Y} > \frac{1}{C}}] = \mathbb{E}[\mathbf{1}_{X>Y}] = \frac{1}{2}\,, $$ since $X>Y$ whenever $Y=\frac{1}{10}$ (and $X<Y$ whenever $Y=\frac{7}{4}$).
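Since the counterexample is discrete, both expectations can be checked exactly, e.g. with Python's `fractions` module:

```python
from fractions import Fraction as F
from itertools import product

# Exact check of the counterexample above:
# X uniform on {1/4, 3/4}, Y uniform on {1/10, 7/4}, independent; C = 2.
xs = [F(1, 4), F(3, 4)]
ys = [F(1, 10), F(7, 4)]
C = F(2)

# All four (x, y) pairs are equally likely, each with probability 1/4.
pairs = list(product(xs, ys))

# Conditional expectation E[1_{X/(X+Y) > 1/C} | X + Y = C]: restrict to
# the pairs with x + y = C, then average the indicator over them.
cond = [(x, y) for x, y in pairs if x + y == C]
p_sum = F(len(cond), len(pairs))
cond_exp = F(sum(1 for x, y in cond if x / (x + y) > 1 / C), len(cond))
print("P(X+Y=C) =", p_sum)       # 1/4
print("conditional =", cond_exp)  # 0

# Unconditional expectation E[1_{X/(X+Y) > 1/C}]: average over all pairs.
uncond = F(sum(1 for x, y in pairs if x / (x + y) > 1 / C), len(pairs))
print("unconditional =", uncond)  # 1/2
```

The two values ($0$ versus $\frac{1}{2}$) confirm that dropping the conditioning changes the answer here, so the Gamma-specific independence really is doing the work.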