I hope you don't mind that this is a little long; the derivation itself is not very complicated.
Suppose $X\sim B(n, p)$, $Y \sim B(m, q)$, and $0 < p, q < 1$. Both $X$ and $Y$ take values in $\{0, 1\}$.
$p = \mathrm{P}(X=1|Y=1)$ and $q = \mathrm{P}(X=1|Y=0)$
By Bayes' theorem, I derive the equation below, which I label as equation (1):
\begin{equation} \begin{split} \log \frac{\mathrm{P}(Y=1|X)}{\mathrm{P}(Y=0|X)} &= \log \frac{\mathrm{P}(X|Y=1)\mathrm{P}(Y=1)}{\mathrm{P}(X|Y=0)\mathrm{P}(Y=0)} \\&=\log \frac{\mathrm{P}(X|Y=1)}{\mathrm{P}(X|Y=0)}+\log\frac{\mathrm{P}(Y=1)}{\mathrm{P}(Y=0)} \end{split} \end{equation}
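As a sanity check on this decomposition, here is a quick numerical verification. The joint distribution below is a hypothetical example I made up; the check confirms that the posterior log-odds equal the log-likelihood ratio plus the prior log-odds for each value of $X$:

```python
import math

# Hypothetical toy joint distribution over (X, Y), with X, Y in {0, 1}.
# Entries are P(X=x, Y=y) and sum to 1.
joint = {(0, 0): 0.3, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.4}

p_y = {y: joint[(0, y)] + joint[(1, y)] for y in (0, 1)}  # marginal P(Y=y)

for x in (0, 1):
    p_x = joint[(x, 0)] + joint[(x, 1)]                   # marginal P(X=x)
    # Left-hand side: posterior log-odds log P(Y=1|X=x)/P(Y=0|X=x).
    lhs = math.log((joint[(x, 1)] / p_x) / (joint[(x, 0)] / p_x))
    # Right-hand side: log-likelihood ratio plus prior log-odds.
    rhs = (math.log((joint[(x, 1)] / p_y[1]) / (joint[(x, 0)] / p_y[0]))
           + math.log(p_y[1] / p_y[0]))
    assert abs(lhs - rhs) < 1e-12
```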
I represent $\mathrm{P}(X|Y=1)$ as $\mathrm{P}(X|Y=1) = X(\mathrm{P}(X=1|Y=1)) + (1-X)\left(1-\mathrm{P}\left(X=1|Y=1\right)\right)$ (formula 1)
Reasoning 1:
My reasoning behind formula 1 is that, since $X$ is either 0 or 1: when $X=1$, $\mathrm{P}(X|Y=1) = X(\mathrm{P}(X=1|Y=1))$, and when $X=0$, $\mathrm{P}(X|Y=1) = (1-X)(1-\mathrm{P}(X=1|Y=1))$. Combining these two situations, I obtain $\mathrm{P}(X|Y=1) = X(\mathrm{P}(X=1|Y=1)) + (1-X)\left(1-\mathrm{P}\left(X=1|Y=1\right)\right)$. **Is my reasoning in this step problematic or not?**
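To convince myself, I checked the two cases numerically ($p$ below is a hypothetical value of $\mathrm{P}(X=1|Y=1)$):

```python
p = 0.7  # hypothetical value of P(X=1|Y=1)

for x in (0, 1):
    direct = p if x == 1 else 1 - p        # P(X=x|Y=1) written case by case
    formula = x * p + (1 - x) * (1 - p)    # formula 1 evaluated at X=x
    assert direct == formula
```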
If this step is correct, then by the same reasoning I obtain $\mathrm{P}(X|Y=0) = X(\mathrm{P}(X=1|Y=0)) + (1-X)\left(1-\mathrm{P}\left(X=1|Y=0\right)\right)$ (formula 2).
Substituting formulas 1 and 2 into equation (1) gives
\begin{equation} \begin{split} \log \frac{\mathrm{P}(Y=1|X)}{\mathrm{P}(Y=0|X)} &= \log \frac{\mathrm{P}(X|Y=1)\mathrm{P}(Y=1)}{\mathrm{P}(X|Y=0)\mathrm{P}(Y=0)} \\&=\log \frac{\mathrm{P}(X|Y=1)}{\mathrm{P}(X|Y=0)}+\log\frac{\mathrm{P}(Y=1)}{\mathrm{P}(Y=0)} \\&= \log\frac{X(\mathrm{P}(X=1|Y=1)) + (1-X)\left(1-\mathrm{P}\left(X=1|Y=1\right)\right)}{X(\mathrm{P}(X=1|Y=0)) + (1-X)\left(1-\mathrm{P}\left(X=1|Y=0\right)\right)}+\log\frac{\mathrm{P}(Y=1)}{\mathrm{P}(Y=0)} \end{split} \end{equation}
Recalling $p = \mathrm{P}(X=1|Y=1)$ and $q = \mathrm{P}(X=1|Y=0)$, equation (1) can be rewritten as \begin{equation} \begin{split} \log \frac{\mathrm{P}(Y=1|X)}{\mathrm{P}(Y=0|X)} &= \log\frac{Xp + (1-X)\left(1-p\right)}{Xq + (1-X)\left(1-q\right)}+\log\frac{\mathrm{P}(Y=1)}{\mathrm{P}(Y=0)} \end{split} \end{equation}
Reasoning 2:
For the expression $\log\frac{Xp + (1-X)\left(1-p\right)}{Xq + (1-X)\left(1-q\right)}$: if $X=1$, it equals $\log\frac{p}{q}$; if $X=0$, it equals $\log\frac{1-p}{1-q}$. So the expression can be represented as $\log\frac{Xp + (1-X)\left(1-p\right)}{Xq + (1-X)\left(1-q\right)} = X \log\frac{p}{q} + (1-X)\log\frac{1-p}{1-q} = X \left(\log\frac{p}{q} -\log\frac{1-p}{1-q}\right) + \log\frac{1-p}{1-q}$. **Is my reasoning in this step correct? Can I derive the expression in this way?**
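Again as a numerical sanity check (with hypothetical values of $p$ and $q$), the two sides of this identity agree for both values of $X$:

```python
import math

p, q = 0.7, 0.3  # hypothetical values of P(X=1|Y=1) and P(X=1|Y=0)

for x in (0, 1):
    # Left-hand side: log of the ratio of the two case-combining formulas.
    lhs = math.log((x * p + (1 - x) * (1 - p)) / (x * q + (1 - x) * (1 - q)))
    # Right-hand side: the linear-in-X rearrangement.
    rhs = (x * (math.log(p / q) - math.log((1 - p) / (1 - q)))
           + math.log((1 - p) / (1 - q)))
    assert abs(lhs - rhs) < 1e-12
```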
Finally, if the above derivation is correct, equation (1) can be rewritten as: \begin{equation} \begin{split} \log \frac{\mathrm{P}(Y=1|X)}{\mathrm{P}(Y=0|X)} &= X \left(\log\frac{p}{q} -\log\frac{1-p}{1-q}\right) + \log\frac{1-p}{1-q}+\log\frac{\mathrm{P}(Y=1)}{\mathrm{P}(Y=0)} \\&= X \log\frac{p(1-q)}{(1-p)q} + \log\frac{(1-p)\mathrm{P}(Y=1)}{(1-q)\mathrm{P}(Y=0)} \end{split} \end{equation}
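To check the final formula end to end, I compared it against the posterior log-odds computed directly from Bayes' theorem (the values of $p$, $q$, and the prior $\mathrm{P}(Y=1)$ below are hypothetical):

```python
import math

p, q, prior = 0.7, 0.2, 0.4  # hypothetical P(X=1|Y=1), P(X=1|Y=0), P(Y=1)

for x in (0, 1):
    # Direct posterior log-odds via Bayes' theorem.
    px_y1 = p if x == 1 else 1 - p   # P(X=x|Y=1)
    px_y0 = q if x == 1 else 1 - q   # P(X=x|Y=0)
    direct = math.log((px_y1 * prior) / (px_y0 * (1 - prior)))
    # Closed-form expression from the final equation.
    closed = (x * math.log(p * (1 - q) / ((1 - p) * q))
              + math.log((1 - p) * prior / ((1 - q) * (1 - prior))))
    assert abs(direct - closed) < 1e-12
```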
I am a little uncertain about these two parts of the reasoning. They look reasonable but not very rigorous, so I would like some advice. My questions have been labelled in boldface. Could somebody check them for me? Can I derive in this way when the variables are binary?
Your reasoning is correct as long as $X,Y$ are Bernoulli random variables, and not generally Binomial as alluded to in the opening line.
To be strict, you should be using indicator random variables, $\mathbf 1_{X=1}$, $\mathbf 1_{X=0}$, rather than $X$ itself. However, that is a technicality when indeed $~X=\mathbf 1_{X=1}~$ and $~(1-X) = \mathbf 1_{X=0}~$.