I'm a software dev, not a math whiz, so bear with me. I'm trying to find an algorithm that expresses a sequence of ones and zeroes as a single decimal number between 0 and 1, based on how the ones are distributed.
The higher the concentration of 1s at the end of the sequence, the closer the result is to 1. If there are only 0s, the result is 0; if there are only 1s, the result is 1.
For example:
001011011111 would be something like 0.7, because most of the 1s are at the end.
111111000 should be less, say 0.4, because the 1s tend to be at the beginning of the sequence.
I don't know whether such a clustering (?) or tendency (?) measure already exists. It would be even better if it could accept other numbers in the sequence, not just 1s and 0s.
Think of it as a winning or losing spree. If you have been winning recently (more 1s towards the end), the number should be higher.
Assign every position a weight that grows from left to right, such as w_k = k/n, where k is the 1-based position and n the total length. Then compute the weighted average of the values: sum(w_k * x_k) / sum(w_k). With only 0s this is 0, with only 1s it is 1, and the more the 1s cluster toward the end, the closer the result gets to 1.
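A minimal sketch of that weighted average in Python (the function name `recency_score` is my own). Note that the 1/n factor in the weights cancels in the ratio, so we can use the raw position k directly:

```python
def recency_score(seq):
    """Weighted average of seq with weight k for the k-th element (1-based).

    Returns a value in [0, 1] for sequences of 0s and 1s:
    0.0 for all zeros, 1.0 for all ones, and higher values the more
    the ones cluster toward the end. Also accepts other non-negative
    numbers, in which case larger values late in the sequence raise
    the score relative to the same values early on.
    """
    n = len(seq)
    if n == 0:
        return 0.0
    # sum(w_k * x_k) with w_k = k; the denominator is sum of weights 1..n
    weighted_sum = sum(k * x for k, x in enumerate(seq, start=1))
    weight_total = n * (n + 1) / 2
    return weighted_sum / weight_total


print(recency_score([0, 0, 1, 0, 1, 1, 0, 1, 1, 1, 1, 1]))  # ~0.82: 1s near the end
print(recency_score([1, 1, 1, 1, 1, 1, 0, 0, 0]))           # ~0.47: 1s near the start
```

On the question's own examples this gives roughly 0.82 for 001011011111 and roughly 0.47 for 111111000, matching the intended ordering (the exact values differ from the 0.7 and 0.4 guessed above, which were only illustrative).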