Suppose $X \sim Bin(n,p)$ and $Y \sim Bin(n,1-p)$. How is $X+Y$ distributed? I know that for independent variables one can proceed as in:
Sum of two independent binomial variables
Furthermore, I have seen this post:
Addition of two Binomial Distribution
However, $X$ and $Y$ are not necessarily independent of each other.
Backstory: I am trying to calculate the entropy of some $Z = X_1 + X_2 + \dots + X_n$, where each $X_i$ is either $Bin(n,p)$ or $Bin(n,1-p)$ depending on its parent node.
For example, let's take a graph that has only outgoing edges of degree $2$, beginning from some source $X_0 \sim (\frac{1}{2},\frac{1}{2})$. If we compare layer $2$ and layer $3$, we go from $2^3$ nodes to $2^4$ nodes. The states of a set of child nodes follow $Bin(n,p)$ when the parent has state $1$ and $Bin(n,1-p)$ when the parent has state $-1$. We proceed like this until we reach some threshold layer $d$.
What is the probability distribution of $Z = \sum X_i^{(d)}$?
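For concreteness, the process described above can be simulated directly. Here is a minimal sketch under my reading of the setup (states are $\pm 1$, the root is uniform, each node has two children, and a child copies its parent's state with probability $p$; the function name `simulate_leaf_sum` is my own):

```python
import random

def simulate_leaf_sum(depth, p, seed=None):
    """Simulate one binary broadcast tree and return the number of
    leaves in state 1 at the given depth (the root is depth 0)."""
    rng = random.Random(seed)
    # Source X_0 ~ (1/2, 1/2) on {1, -1}.
    layer = [1 if rng.random() < 0.5 else -1]
    for _ in range(depth):
        nxt = []
        for parent in layer:
            # Child is 1 with probability p if the parent is 1,
            # and with probability 1 - p if the parent is -1.
            q = p if parent == 1 else 1 - p
            for _ in range(2):  # binary offspring
                nxt.append(1 if rng.random() < q else -1)
        layer = nxt
    return sum(1 for x in layer if x == 1)
```

Averaging many runs gives a Monte Carlo estimate of the distribution of the leaf count (and hence of $Z$, after mapping counts of $1$s back to $\pm 1$ sums).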
Assume a tree with binary offspring, and let the root be generation $0$. Let $X_{n,1},\ldots,X_{n,2^n}$ be the Bernoulli variables of the last generation, let $N_n = \sum_i X_{n,i}$, let $\phi_p$ be the moment generating function of $Bin(2,p)$, and $\phi_q$ the moment generating function of $Bin(2,q)$. Then (as can be seen by conditioning on generation $n-1$) $$ E(\exp(sN_n)) = E\left(\phi_p(s)^{X_{n-1,1}}\phi_q(s)^{1-X_{n-1,1}}\cdots\right) = E\left(\phi_q(s)^{2^{n-1}}\left ( \frac{\phi_p(s)}{\phi_q(s)} \right ) ^{\sum_i X_{n-1,i}}\right). $$ This can be rewritten as $$ C\,E\exp(s_{n-1}N_{n-1}) $$ with $$ C = \phi_q(s)^{2^{n-1}} \quad\text{and}\quad s_{n-1} = \ln \left (\frac{\phi_p(s)}{\phi_q(s)} \right ). $$ Now use induction. If one is after probabilities, consider using a probability-generating function (https://en.wikipedia.org/wiki/Probability-generating_function) and note $$ Ez^N = E(\exp(\ln(z)\,N)). $$
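The same conditioning step can also be carried out directly on probabilities rather than on generating functions. A minimal sketch, assuming the binary broadcast tree from the question with a uniform root and counting leaves in state $1$ (the names `binom_pmf` and `leaf_count_distribution` are my own): given $N_{k-1} = m$ ones among the $2^{k-1}$ nodes of generation $k-1$, the next count is the independent sum $Bin(2m, p) + Bin(2(2^{k-1}-m), 1-p)$.

```python
from math import comb

def binom_pmf(n, p):
    """PMF of Bin(n, p) as a list indexed by k = 0..n."""
    return [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

def leaf_count_distribution(depth, p):
    """Exact distribution of N_d, the number of leaves in state 1
    at generation `depth`, for a uniform root at generation 0."""
    dist = [0.5, 0.5]  # dist[m] = P(N_0 = m)
    for k in range(1, depth + 1):
        n_prev = 2 ** (k - 1)
        new = [0.0] * (2 * n_prev + 1)
        for m, pm in enumerate(dist):
            if pm == 0.0:
                continue
            # Children of the m one-parents vs. the (n_prev - m) others.
            a = binom_pmf(2 * m, p)
            b = binom_pmf(2 * (n_prev - m), 1 - p)
            # Convolve the two independent binomial counts.
            for i, pa in enumerate(a):
                for j, pb in enumerate(b):
                    new[i + j] += pm * pa * pb
        dist = new
    return dist
```

Since the root is uniform and swapping the root state swaps $p \leftrightarrow 1-p$, the resulting distribution is symmetric about $2^{d-1}$, which is a useful sanity check; the entropy of $Z$ then follows from this PMF.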