Suppose you start with a root node and decide, with 50% probability, whether or not to add two child nodes to it. If you do, repeat this process for each child. How can you find the average length of this random binary tree?
I'm thinking along the lines of
$\displaystyle\lim_{n\to\infty}\sum\limits_{i=1}^n (\frac{n}{2})^2$.
because $\frac{n}{2}$ represents the 50% probability, and I'm squaring it because the tree grows exponentially. However, I feel like I've done something terribly wrong (probably because I have). Can anyone give me some help?
Hint: Problems like this can be done by what is known in probability as first-step analysis. Assume that the expected length is $L$. Take one step, try to come up with a relation involving $L$, and solve for $L$.
If you find first-step analysis difficult, think instead about the probability that the length of the tree equals a given value, and then use this to compute the expectation. For instance, the probability the length is $0$ is $\frac{1}{2}$ (the root has no children), the probability the length is $1$ is $\frac{1}{2} \times \frac{1}{4}$ (the root branches and both children stop), and so on. Try to find a pattern and sum it up.
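The pattern in the hint can be checked numerically. Below is a small sketch, assuming "length" means the height of the tree (edges on the longest root-to-leaf path); the name `height_cdf` is mine. Writing $q_k = P(\text{height} \le k)$, the first step gives $q_0 = \frac{1}{2}$ and $q_{k+1} = \frac{1}{2} + \frac{1}{2} q_k^2$, since a tree has height at most $k+1$ exactly when the root stops, or it branches and both independent subtrees have height at most $k$:

```python
from fractions import Fraction

def height_cdf(n):
    """Return [q_0, ..., q_n] where q_k = P(height <= k).

    First-step recursion: the root either stops (prob 1/2, height 0),
    or branches (prob 1/2) and then both independent subtrees must
    have height <= k for the whole tree to have height <= k+1:
        q_0     = 1/2
        q_{k+1} = 1/2 + 1/2 * q_k**2
    """
    q = [Fraction(1, 2)]
    for _ in range(n):
        q.append(Fraction(1, 2) + Fraction(1, 2) * q[-1] ** 2)
    return q

q = height_cdf(2)
print(q[0])          # P(height = 0) = 1/2, as in the hint
print(q[1] - q[0])   # P(height = 1) = 1/2 * 1/4 = 1/8
```

Successive differences $q_k - q_{k-1}$ give the probabilities $P(\text{height} = k)$ whose pattern the hint asks you to find.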