Markov Chain: Optimal stopping to determine the price at which stock is traded


The stock price starts at \$100. At each time step, there is a 50% probability that the price increases by \$1 and a 50% probability that it falls back to \$100. You pay \$1 to trade. At what price should you trade this stock once it starts moving upwards?

I tried modeling this problem as a Markov chain but soon got lost due to lack of experience. Any help would be appreciated.


There are 2 best solutions below


You should never trade. Each time you pay \$1, and the price either goes up by \$1 (in which case you have gained nothing net) or falls back to \$100 from above (no change if it is already at \$100, otherwise a loss). So every trade yields either no benefit or a loss.
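The argument above can be checked directly from the model: the expected one-step change in price is $0.5(p+1) + 0.5 \cdot 100 - p = 50.5 - 0.5p$, which is non-positive for every price at or above \$101, i.e. as soon as the stock "starts moving upwards". A minimal sketch (the helper name `expected_one_step_change` is mine, not from the question):

```python
def expected_one_step_change(p):
    # Per the model: next price is p + 1 with prob. 0.5, or resets
    # to 100 with prob. 0.5. Expected change = 50.5 - 0.5 * p.
    return 0.5 * (p + 1) + 0.5 * 100 - p

for p in [100, 101, 102, 105, 110]:
    print(p, expected_one_step_change(p))
# drift is +0.5 at 100, zero at 101, and negative above that
```

Since the drift is at best zero once the price is above \$100, and each trade costs \$1, no trading strategy has positive expected profit under this reading of the problem.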


I assume the "\$1 to trade" means you only pay when you actually trade (i.e., sell the stock at its current price), that you pay nothing to simply hold the stock, and that there is no depreciation or net-present-value consideration.

Under this model, I think the question is ill-defined. If you wait long enough, the stock price will reach any arbitrarily high value. (Equivalently, if you keep flipping a fair coin, you will eventually see arbitrarily long streaks of heads.) So if you want to sell at, say, \$1 billion, simply wait really, REALLY long until the price reaches \$1 billion, then sell. It will eventually happen, and the model does not penalize such waiting. Alas, that is still not optimal, since the strategy of waiting until \$2 billion will beat it (after waiting even longer). So there is no finite optimal selling price.
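To illustrate: reaching a target of $100 + n$ requires $n$ consecutive up-moves, which is exactly the classic "waiting time for $n$ consecutive heads" problem with expected wait $2^{n+1} - 2$ flips. So any target is reached with probability 1, but the expected wait doubles with each extra dollar. A simulation sketch (function names are mine, for illustration):

```python
import random

def steps_to_reach(target, start=100, seed=None):
    # Simulate the price walk until it first hits `target`:
    # each step, price goes up $1 w.p. 0.5, else resets to $100.
    rng = random.Random(seed)
    p, steps = start, 0
    while p < target:
        p = p + 1 if rng.random() < 0.5 else 100
        steps += 1
    return steps

def expected_steps(target, start=100):
    # Reaching `target` needs target-start consecutive up-moves;
    # expected wait for n consecutive heads is 2^(n+1) - 2.
    n = target - start
    return 2 ** (n + 1) - 2

trials = [steps_to_reach(102, seed=i) for i in range(2000)]
print(sum(trials) / len(trials), expected_steps(102))  # both near 6
```

This makes the ill-posedness concrete: every selling threshold is eventually hit, so a "better" threshold always exists, at exponentially growing cost in waiting time that the model does not charge for.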