A 'bouncy' stock is one that rises or falls by $10\%$ in value every day. If you purchase it at $100$ USD you must hold it for $100$ days. Buy or not?


Suppose there is a stock worth $\$100$, and every day the price fluctuates by $10\%$, up or down (with probability $0.5$ each). If I were to buy it today I would be required to hold it for $100$ days. Should I buy it or not?

Let $X$ denote the price after $100$ days. Clearly I need to compute $\mathbb{E}[X]$.

I have three answers to this, but all feel hand-wavy, and I am looking for something more concrete:

Intuition-based answer: We expect $50$ increases and $50$ decreases over these $100$ days, which would mean the price on average would be $\mathbb{E}[X] = 100\cdot(0.9)^{50}(1.1)^{50} = 100\cdot(0.99)^{50} < 100$, so the intuition would be to not invest.

I think the answer (not invest) is right, but I'm pretty sure my exact 'calculation' for $\mathbb{E}[X]$ is wrong.
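The gap between this 'typical' value and the true mean can be checked against the exact binomial distribution of the final price. A minimal sketch (the variable names are my own):

```python
from math import comb

# Exact distribution after 100 days: k up-moves occur with probability
# comb(100, k) / 2**100 and give price 100 * 1.1**k * 0.9**(100 - k).
days = 100
prices = [100 * 1.1**k * 0.9**(days - k) for k in range(days + 1)]
probs = [comb(days, k) / 2**days for k in range(days + 1)]

mean = sum(p * q for p, q in zip(probs, prices))
typical = 100 * 0.99**50  # the "50 ups, 50 downs" value

print(round(mean, 6))     # 100.0 -- the true expectation
print(round(typical, 2))  # 60.5  -- the single most likely outcome
```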

Martingale-based answer: If $(X_t)_t$ is a stochastic process where $X_t$ is the price on day $t$, this is clearly a martingale, since $\mathbb{E}[X_{t+1}\mid X_t] = 0.5\cdot 1.1\cdot X_t + 0.5\cdot 0.9\cdot X_t = X_t$.

At this point I wish to vaguely appeal to Doob's optional stopping theorem/the general fairness of martingales, which says that you can expect no ultimate loss or reward. That is, I want to say that $\mathbb{E}[X] = \mathbb{E}[X_{100}] = 100$. I am sure (based on gut feeling) that $\mathbb{E}[X] = 100$, but I'm not convinced by my proof of this. The vague appeal to the stopping theorem doesn't satisfy me, since the stopping time here is just the constant $\tau = 100$, which I'm not sure is allowed.
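For a deterministic horizon no stopping theorem seems necessary: taking expectations of the one-step identity gives $\mathbb{E}[X_{t+1}] = \mathbb{E}[X_t]$, so $\mathbb{E}[X_{100}] = \mathbb{E}[X_0] = 100$ by induction. A quick Monte Carlo sketch agrees (the seed and sample size are arbitrary choices of mine):

```python
import random

# Simulate many independent 100-day price paths and average the outcomes.
random.seed(0)
n_paths, days = 100_000, 100
total = 0.0
for _ in range(n_paths):
    price = 100.0
    for _ in range(days):
        price *= 1.1 if random.random() < 0.5 else 0.9
    total += price

est = total / n_paths
print(est)  # close to 100, though convergence is slow: a few lucky paths
            # in the heavy right tail carry much of the average
```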

Markov chain/stationary distribution: This can also be modelled as a Markov chain with transition matrix $\begin{pmatrix}0.5 & 0.5\\ 0.5 & 0.5\end{pmatrix}$. The stationary distribution $[\pi_U, \pi_D]$ will then satisfy $0.5\pi_U + 0.5\pi_D = \pi_U$ and $\pi_U + \pi_D = 1$, which gives $\pi_U = \pi_D = 0.5$, i.e., $\mathbb{E}[X] = 100$ again.

All cases of course suggest that investing is pointless.

Are any of my solutions rigorous enough? If not, where can I make them rigorous or is there a simpler way to see this?


There are 3 best solutions below

---

The second way is rigorous. The third is a bit strange, because it is unclear what the states in this process are (we can never return to a state we were in previously, since $0.9^n 1.1^m = 1$ has no solutions in natural numbers).

The problem with the first way is that it evaluates the "typical" situation, but in this case the non-typical situation of having significantly more than half wins contributes a lot to the average.

This can be amplified to help intuition: suppose that on a win you triple your money, but on a loss you lose everything. Then the expected final value is a tremendous $100 \cdot 1.5^{100}$, but you are almost certain to simply lose everything.
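The numbers in this amplified game can be made concrete (a small sketch; the variable names are mine):

```python
# Triple-or-nothing variant: each day the stake triples (prob 0.5)
# or is wiped out (prob 0.5), starting from 100.
p_survive = 0.5**100              # must win all 100 days to keep anything
value_if_survive = 100 * 3**100   # the single surviving path's value
expectation = p_survive * value_if_survive  # equals 100 * 1.5**100

print(expectation)  # astronomically large expected value...
print(p_survive)    # ...but ~7.9e-31 chance of not losing everything
```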

Whether it is worth investing depends on whether or not you want to play a lottery with a small chance of winning a very large sum.

---

Your three arguments are reasonable assuming independent steps, though your first and third are not quite what they claim to be.

Your first actually calculates the median of the distribution, about $60.5$. This is not the expectation, because the benefit of winning more than half the time outweighs the cost of losing more than half the time.

Your second is reasonable. You have the Martingale property and each step is "fair".

Your third is not quite right: you have the Markov property in that what happens next depends on where you are and what happened earlier does not add to this, but you do not have a Markov chain on two states, which is what you would need for that transition matrix.

In risk-minimisation terms, you are correct that it is best not to invest. But people gamble and buy lottery tickets for entertainment, and this is not a particularly bad bet: the expected loss is $0$, and while you are more likely to lose ($69\%$) than win ($31\%$), if you do win then you are quite likely to win a substantial amount.
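The $69\%$/$31\%$ split can be reproduced from the binomial distribution: the final price is below $100$ exactly when the number of up-days $k$ satisfies $1.1^k\, 0.9^{100-k} < 1$, i.e. $k \le 52$. A sketch:

```python
from math import comb

# P(final price < 100) = P(at most 52 of the 100 days are up-moves).
days = 100
p_lose = sum(comb(days, k) for k in range(53)) / 2**days

print(round(p_lose, 2))      # 0.69
print(round(1 - p_lose, 2))  # 0.31
```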

---

No need for heavy machinery here: you can do this by induction on $E_n(X)$, the expected value of the stock at the end of day $n$.

Suppose we list the stock's value along each of the $k = 2^n$ equally likely paths through day $n$ as $a_1, a_2, \ldots, a_k$ (with repetitions, since different paths can end at the same price). The expected value at the end of day $n$ is then $E_n(X) = \frac{a_1 + \ldots + a_k}{k}.$

Then the values along the $2k$ equally likely paths through day $(n+1)$ are $\ 0.9 a_1,\ 1.1 a_1,\ \ldots,\ 0.9 a_k,\ 1.1 a_k.\ $ Their average is $\frac{(0.9 + 1.1)(a_1 + \ldots + a_k)}{2k} = \frac{a_1 + \ldots + a_k}{k} = E_n(X),$ so $E_{n+1}(X) = E_n(X)$ and, by induction, $E_{100}(X) = E_0(X) = 100.$
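This induction can also be checked computationally with exact rational arithmetic, tracking the distribution of the number of up-moves day by day (a sketch; the names are mine):

```python
from fractions import Fraction

# dist[k] = probability of exactly k up-moves so far (exact rationals).
dist = {0: Fraction(1)}
for day in range(100):
    new = {}
    for k, p in dist.items():
        new[k] = new.get(k, Fraction(0)) + p / 2          # down-move
        new[k + 1] = new.get(k + 1, Fraction(0)) + p / 2  # up-move
    dist = new

# E[X_100] = sum over k of P(k ups) * 100 * (11/10)^k * (9/10)^(100-k).
mean = sum(p * 100 * Fraction(11, 10)**k * Fraction(9, 10)**(100 - k)
           for k, p in dist.items())
print(mean)  # 100 exactly
```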