Question from an old exam:
In a full binary tree of height $n$, each vertex is randomly colored either white or black. A path is considered good if it goes from any vertex to a leaf and all vertices on this path are of the same color. Let $X$ denote the number of good paths. For example, for a tree of height $n = 3$ in the figure, we have $X=8$ ($4$ good paths consisting of $1$ vertex, $3$ good paths consisting of $2$ vertices, and $1$ good path consisting of $3$ vertices).
a) Calculate the expected value of $X$.
b) Calculate the variance of $X$. It is enough to provide a formula that is the sum of a polynomial number of terms in compact form (you do not have to calculate this sum).
My solution to a)
Let $f(n)$ denote the expected number of good paths in a tree of height $n$.
Then $f(n) = f(n-1) + f(n-1) + \frac{1}{4}(2+1+1+0) = 2f(n-1) + 1$, because
the expected number of good paths in a tree of height $n$ is the expected number in each of the two subtrees of height $n-1$, plus the expected number of new paths starting at the added root: the root matches both children, only the left child, only the right child, or neither, each with probability $\frac14$, and each matching child contributes one new root-to-leaf path in expectation (one can check that the expected number of good paths starting at the root is $1$ for every height).
We have $f(1)=1$ (since a leaf is also a path) and after simplification, it turns out $f(n)=2^n-1$.
Alternatively, I tried to define indicator random variables (so they can hopefully be used to find the variance): for each downward path $\pi$ ending at a leaf, let $$ X_\pi = \begin{cases} 1, & \text{if } \pi \text{ is monochromatic}, \\ 0, & \text{otherwise}. \end{cases} $$
Then $X = \sum_{\pi} X_\pi$. For each length $i = 1, \dots, n$ there are exactly $2^{n-1}$ paths with $i$ vertices (one ending at each leaf), and each is monochromatic with probability $\left(\frac12\right)^{i-1}$.
Calculating the expected value this way, we get the same result as before: $$\mathbb{E}[X] = \sum_{i=1}^{n} \left(\frac{1}{2}\right)^{i-1} \cdot 2^{n-1} = 2^n - 1.$$
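As a sanity check on part (a) (not part of the exam), $\mathbb E[X] = 2^n - 1$ can be verified by enumerating all colorings for small $n$; the heap-style vertex numbering below is my own convention:

```python
from fractions import Fraction
from itertools import product

def count_good_paths(colors, n):
    """Count good paths for one coloring of a full binary tree of height n.
    Vertices are numbered heap-style: 1 is the root, v has children 2v, 2v+1;
    colors is 1-indexed. good[v] counts monochromatic paths from v to a leaf."""
    size = 2 ** n
    good = [0] * size
    for v in range(size - 1, 0, -1):
        if 2 * v >= size:                      # v is a leaf: the 1-vertex path
            good[v] = 1
        else:                                  # extend paths of same-colored children
            good[v] = sum(good[c] for c in (2 * v, 2 * v + 1)
                          if colors[c] == colors[v])
    return sum(good[1:])                       # a good path may start at any vertex

def expected_X(n):
    """E[X] by averaging over all 2^(2^n - 1) colorings (small n only)."""
    m = 2 ** n - 1                             # number of vertices
    total = sum(count_good_paths((None,) + bits, n)
                for bits in product((0, 1), repeat=m))
    return Fraction(total, 2 ** m)

# expected_X(n) agrees with 2**n - 1 for small n
```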
However, I don't know how to find the variance.
Is part (a) correct? How can one solve part (b)?

Let $X_n$ be the number of good paths and $Y_n$ the number of good paths starting at the tree's root; superscripts $(\ell)$ and $(r)$ denote the same quantities for the left and right subtrees, which are independent copies of the height-$(n-1)$ tree. Let $L_n$ and $R_n$ be the events that the root has the same color as its left and right child, respectively. For $n \ge 2$:
\begin{align} X_n &= X^{(\ell)}_{n-1}+X^{(r)}_{n-1}+Y_n\\ Y_n &= \mathbf 1_{L_n}Y^{(\ell)}_{n-1}+ \mathbf 1_{R_n}Y^{(r)}_{n-1} \end{align}
Computing $\mathbb E\left[Y_n\right]$: in the second equation, $\mathbf 1_{L_n}$ is independent of $Y^{(\ell)}_{n-1}$ and $\mathbb E\left[\mathbf 1_{L_n}\right] = \frac12$ (likewise on the right), so $$\mathbb E\left[Y_n\right] = \tfrac12\mathbb E\left[Y^{(\ell)}_{n-1}\right] + \tfrac12\mathbb E\left[Y^{(r)}_{n-1}\right] = \mathbb E\left[Y_{n-1}\right] = \dots = \mathbb E\left[Y_1\right] = 1$$ (for $n = 1$ the root is itself a leaf, so $Y_1 = 1$).
Computing $\mathbb E\left[X_n\right]$: using the first equation, $$\mathbb E\left[X_n\right] = 2\mathbb E\left[X_{n-1}\right] + \mathbb E\left[Y_n\right] = 2\mathbb E\left[X_{n-1}\right] + 1$$ a simple induction proves that $$\mathbb E\left[X_n\right] = 2^n - 1$$
Computing $\mathbb E\left[Y_n^2\right]$: squaring the second equation (the indicators are idempotent), $$Y_n^2 = \mathbf 1_{L_n}\left(Y^{(\ell)}_{n-1}\right)^2 + \mathbf 1_{R_n}\left(Y^{(r)}_{n-1}\right)^2 + 2\times\mathbf 1_{L_n}\mathbf 1_{R_n}\left(Y^{(\ell)}_{n-1}\right)\left(Y^{(r)}_{n-1}\right)$$ and since $\mathbb E\left[\mathbf 1_{L_n}\mathbf 1_{R_n}\right] = \frac14$ and the four factors in the cross term are independent, $$\mathbb E\left[Y_n^2\right] = \mathbb E\left[Y^2_{n-1}\right] + 2\cdot\frac14\cdot\mathbb E\left[Y_{n-1}\right]^2 = \mathbb E\left[Y^2_{n-1}\right] + \frac12$$ a simple induction proves that $$\mathbb E\left[Y_n^2\right] = \frac12 (n+1)$$
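This intermediate claim can also be checked by brute force for small $n$ (a sketch with my own heap-style vertex numbering; only the count $Y_n$ of good paths starting at the root is needed):

```python
from fractions import Fraction
from itertools import product

def second_moment_Y(n):
    """E[Y_n^2] by enumerating all colorings of a height-n full binary tree.
    Vertices are numbered heap-style: 1 is the root, v has children 2v, 2v+1."""
    size = 2 ** n
    total = 0
    for bits in product((0, 1), repeat=size - 1):
        colors = (None,) + bits                # 1-indexed colors
        good = [0] * size                      # good[v] = monochromatic v-to-leaf paths
        for v in range(size - 1, 0, -1):
            if 2 * v >= size:                  # leaf
                good[v] = 1
            else:
                good[v] = sum(good[c] for c in (2 * v, 2 * v + 1)
                              if colors[c] == colors[v])
        total += good[1] ** 2                  # Y_n = good paths from the root
    return Fraction(total, 2 ** (size - 1))

# second_moment_Y(n) should equal (n + 1) / 2
```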
Computing $\mathbb E\left[X_nY_n\right]$: $$X_n Y_n = \mathbf 1_{L_n} X^{(\ell)}_{n-1}Y^{(\ell)}_{n-1} + \mathbf 1_{R_n} X^{(\ell)}_{n-1}Y^{(r)}_{n-1} + \mathbf 1_{L_n} X^{(r)}_{n-1}Y^{(\ell)}_{n-1} + \mathbf 1_{R_n} X^{(r)}_{n-1}Y^{(r)}_{n-1} + Y_n^2$$ then $$\mathbb E\left[X_nY_n\right] = \mathbb E\left[X_{n-1}Y_{n-1}\right] + \mathbb E\left[X_{n-1}\right]\mathbb E\left[Y_{n-1}\right] + \mathbb E\left[Y^2_n\right] = \mathbb E\left[X_{n-1}Y_{n-1}\right] + 2^{n-1} - 1 + \frac12 (n+1)= \mathbb E\left[X_{n-1}Y_{n-1}\right] + 2^{n-1} + \frac12n - \frac12$$ summing these increments from height $2$ to $n$ (with $\mathbb E\left[X_1Y_1\right] = 1$, and using $\sum_{k=2}^{n}\frac{k-1}{2} = \frac{n(n-1)}{4}$) gives $$\mathbb E\left[X_nY_n\right] = 2^n - 1 + \frac{n(n-1)}{4}$$
Computing $\mathbb E\left[X^2_n\right]$: $$X_n^2 = \left(X^{(\ell)}_{n-1}\right)^2 + \left(X^{(r)}_{n-1}\right)^2 + 2 \left(X^{(\ell)}_{n-1}\right)\left(X^{(r)}_{n-1}\right) + 2X_nY_n - Y_n^2$$ then $$\mathbb E\left[X_n^2\right] = 2\mathbb E\left[X_{n-1}^2\right] + 2\mathbb E\left[X_{n-1}\right]^2 + 2\mathbb E\left[X_nY_n\right] - \mathbb E\left[Y_n^2\right]$$ Unrolling this recurrence (every quantity on the right is known in closed form) expresses $\mathbb E\left[X_n^2\right]$ as a sum of $O(n)$ explicit terms, and then $\operatorname{Var}(X_n) = \mathbb E\left[X_n^2\right] - \left(2^n-1\right)^2$, which is exactly the kind of compact formula part (b) asks for.
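Putting the recurrences together, $\operatorname{Var}(X_n)$ can be computed exactly for any $n$ in $O(n)$ arithmetic steps; a sketch in exact rational arithmetic (function name is mine):

```python
from fractions import Fraction

def moments_X(n):
    """E[X_n] and Var(X_n) from the moment recurrences.
    State at height 1: X_1 = Y_1 = 1 with certainty, so all moments are 1."""
    EX = EY2 = EXY = EX2 = Fraction(1)
    for k in range(2, n + 1):
        EY2 += Fraction(1, 2)                      # E[Y_k^2]   = E[Y_{k-1}^2] + 1/2
        EXY = EXY + EX + EY2                       # E[X_k Y_k] = E[X_{k-1}Y_{k-1}] + E[X_{k-1}]*1 + E[Y_k^2]
        EX2 = 2 * EX2 + 2 * EX**2 + 2 * EXY - EY2  # E[X_k^2]
        EX = 2 * EX + 1                            # E[X_k]     = 2 E[X_{k-1}] + 1
    return EX, EX2 - EX**2

# e.g. moments_X(2) == (3, Fraction(1, 2))
```

For instance, this gives $\operatorname{Var}(X_1)=0$ and $\operatorname{Var}(X_2)=\frac12$, which can be checked by hand: for height $2$, $X_2 = 2 + Y_2$, so $\operatorname{Var}(X_2)=\operatorname{Var}(Y_2)=\mathbb E\left[Y_2^2\right]-1=\frac12$.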