I have a list of customers. Every six months a customer can either renew with the company or choose to leave. I have four years of data, and I have the probability that a customer renews at each of the eight 6-month periods. For example, the probability that a customer renews after period 1 is $p_1 = .79$, after period 2 it is $.81$, period 3 is $.86$, period 4 is $.92$, period 5 is $.91$, period 6 is $.89$, and period 7 is $.91$.
A customer is worth 50 for each period in which they choose to buy. Once they leave, they cannot return.
I know that for any single period the expected value is
$50\cdot p + 0\cdot(1-p),$
but how do I take all eight periods into account? Would it be
$E(X) = 50 p_1 + 0(1-p_1) + 50 p_2 + 0(1-p_2) + \ldots + 50 p_8 + 0(1-p_8)?$
This is actually a simplified case; in reality there are more than two states. For instance, a third state might be that the customer becomes a client of a rival company. Is a Markov chain applicable to this problem?
This could be modelled with a discrete-time Markov chain.
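To make this concrete, here is a minimal sketch of the three-state chain you describe (active customer, with a rival, left the market), where the last two states are absorbing since a customer who leaves cannot return. The transition probabilities used below are hypothetical placeholders, not values from your data:

```python
# States: 0 = active customer, 1 = with rival, 2 = left the market.
# The transition probabilities below are hypothetical placeholders.

def step(dist, T):
    """One period: multiply the state distribution by the transition matrix."""
    n = len(T)
    return [sum(dist[i] * T[i][j] for i in range(n)) for j in range(n)]

def expected_revenue(T, periods=8, revenue=50):
    """Expected revenue per customer: 50 for every period the customer renews."""
    dist = [1.0, 0.0, 0.0]          # everyone starts as an active customer
    total = 0.0
    for _ in range(periods):
        dist = step(dist, T)        # renewal decision for this period
        total += revenue * dist[0]  # paid only if still in the 'active' state
    return total

# Hypothetical example: renew with prob 0.8, defect to rival with 0.1,
# leave with 0.1; 'rival' and 'left' are absorbing states.
T = [[0.8, 0.1, 0.1],
     [0.0, 1.0, 0.0],
     [0.0, 0.0, 1.0]]

print(expected_revenue(T))
```

With period-dependent probabilities you would simply use a different transition matrix at each step.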
I think your formula for the expected value is oversimplified: a customer can only pay in period $k$ if they renewed in every period up to $k$, so the terms must be products of probabilities, not single probabilities. Suppose you start with $n$ clients at the beginning. For period one, your expected value is $$ 50np_1.$$ For periods one and two, since you only have $n\cdot p_1$ clients left at the beginning of period two, the expected value is $$ 50n p_1 + 50n p_1 p_2,$$ and so on. The final value is $$ 50n\Bigl[ p_1 + p_1p_2 + p_1p_2p_3 + \ldots + p_1p_2p_3p_4p_5p_6p_7p_8 \Bigr]. $$
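This sum of survival products is easy to compute. A short sketch with $n = 1$, using the seven renewal probabilities given in the question (the eighth period's probability was not given, so it is simply omitted here; append it to the list when known):

```python
# Expected value per customer: 50 * (p1 + p1*p2 + ... + p1*...*pk),
# i.e. revenue weighted by the probability of surviving to each period.

def expected_value(probs, revenue=50):
    total, survival = 0.0, 1.0
    for p in probs:
        survival *= p               # probability of having renewed every period so far
        total += revenue * survival
    return total

p = [0.79, 0.81, 0.86, 0.92, 0.91, 0.89, 0.91]  # from the question; p8 not given
print(round(expected_value(p), 2))
```

Multiplying the result by the starting number of clients $n$ gives the total expected revenue.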