I recently came up with this question when playing around with binary numbers and tried coding up a solution to it:
Let $b$ be a string of 1s and 0s. Split the string into separate parts such that the string is split only where a 0 is next to a 1 (or vice versa, of course). For example, 11100110 is split into 111, 00, 11, and 0. Calculate the number of digits in each part (in our example this would be 3, 2, 2, and 1, respectively), and take the sum of squares of these counts (in our example this would be $3^2+2^2+2^2+1^2=18$). Let this procedure be represented by a function $\omega$, e.g. $\omega(11100110)=18$. Furthermore, $b$ is a string such that the $k$th digit is 1 with probability $x_k$ (and 0 otherwise). Also let the length of $b$ be $L$. I am trying to find the expected value of $\omega(b)$ given $L$ and the set of $x_k$ for $1 \leq k \leq L$, but can't find an algorithm faster than brute force right now. What are some optimizations I can make when programmatically finding the expected value of $\omega(b)$? Also, I would like to know what the efficiency of the algorithm would be (preferably a lower bound on efficiency).
EDIT: The brute force algorithm refers to the $O(2^L \cdot L)$ one.
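For reference, here is a minimal sketch of that $O(2^L \cdot L)$ brute force in Python (the function names `omega` and `expected_omega_bruteforce` are just illustrative choices): it enumerates all $2^L$ strings, weights each by its probability under the $x_k$, and averages $\omega$.

```python
from itertools import product

def omega(bits):
    """Sum of squared run lengths of a 0/1 sequence."""
    total, run, prev = 0, 0, None
    for b in bits:
        if b == prev:
            run += 1
        else:
            total += run * run  # close the previous run
            run, prev = 1, b
    return total + run * run    # close the final run

def expected_omega_bruteforce(x):
    """E[omega(b)] by enumerating all 2^L strings: O(2^L * L)."""
    total = 0.0
    for bits in product((0, 1), repeat=len(x)):
        p = 1.0
        for k, b in enumerate(bits):
            p *= x[k] if b == 1 else 1.0 - x[k]
        total += p * omega(bits)
    return total
```

With deterministic probabilities this reproduces the worked example: `expected_omega_bruteforce([1,1,1,0,0,1,1,0])` gives $18$.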
Write the string as $b=b_1,b_2,\dots,b_L$. For each $i\in \{1,\dots,L\}$, and each $j\in \{1,\dots,L\}$, define a random variable $A_{i,j}$ as follows: $$ A_{i,j}=\begin{cases} 1 & b_i\text{ and $b_j$ are in the same block}\\ 0& \text{otherwise} \end{cases} $$ You can show that $$ \omega(b)=\sum_{i=1}^L\sum_{j=1}^L A_{i,j}\tag1 $$ Think about it; for each block of length $k$, there is a contribution of $k^2$, and $k^2$ is the number of ordered pairs $(i,j)$ where $b_i,b_j$ are in that block.
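Identity $(1)$ is easy to sanity-check numerically. The sketch below (names are my own) labels each position with its block index and counts ordered pairs $(i,j)$ lying in the same block; the result matches the sum of squared block lengths.

```python
def omega_via_pairs(bits):
    """omega(b) computed as the number of ordered pairs (i, j)
    with b_i and b_j in the same block, per identity (1)."""
    L = len(bits)
    block = [0] * L
    for k in range(1, L):
        # new block starts whenever the digit changes
        block[k] = block[k - 1] + (bits[k] != bits[k - 1])
    return sum(1 for i in range(L) for j in range(L)
               if block[i] == block[j])
```

For example, `omega_via_pairs([1,1,1,0,0,1,1,0])` returns $18$, agreeing with $3^2+2^2+2^2+1^2$.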
By linearity of expectation, this means that $$ \mathbb E[\omega(b)]=\sum_{i=1}^L\sum_{j=1}^L\mathbb P(A_{i,j}=1)\tag2 $$ Furthermore, when $i<j$, we have $$ \mathbb P(A_{i,j}=1)=x_ix_{i+1}\cdots x_j+(1-x_i)(1-x_{i+1})\cdots(1-x_j)\tag3 $$ since $b_i$ and $b_j$ are in the same block if and only if $b_i$, $b_j$, and all digits in between are all $1$ or all $0$. (The diagonal terms are trivial: $\mathbb P(A_{i,i}=1)=1$, and the $i>j$ terms equal the $i<j$ terms by symmetry.) You can compute $(2)$ in $O(L^2)$ time as long as you are careful. The pitfall is that $(2)$ is a sum of $L^2$ terms, and some terms are a product of up to $O(L)$ factors $x_i$, which would be $O(L^3)$ naively. That is why it is important to cache previous results and use them in the future; once you compute $x_2x_3x_4$, you should save it, then use it to quickly compute $(x_2x_3x_4)x_5$.