Action plan for a Markov decision process problem


I have a task on Markov decision models. I have no idea how to solve this. Perhaps there are some similar typical tasks or approaches.

Text:

An electrical goods store, aiming to meet demand immediately, can place an order for refrigerators daily. Each order incurs a cost of 100 dollars. Storing one refrigerator for a day costs 5 dollars. The store's loss from unsatisfied demand is estimated at 150 dollars for each refrigerator. The probabilities of a daily demand of 0, 1, and 2 refrigerators are 0.2, 0.5, and 0.3, respectively. The store's floor space cannot hold more than two refrigerators.

Determine the optimal strategy for placing orders for refrigerators for the next three months.

The task must be solved using Markov decision processes. Can anyone suggest a plan of action or show something similar?




The key here is in the definition of the state variable. For problems like this the state variable is usually defined as the inventory level (number of items in inventory).

Since this seems like an assignment for a class, I won't write out the MDP. But there are lots of similar models out there; try searching for "inventory finite-horizon MDP" (or dynamic program). Here's one example, starting on page 5. Or here, starting on page 327.
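To make the state/transition structure concrete without solving the specific assignment for you, here is a minimal sketch of a finite-horizon inventory MDP solved by backward induction. The parameters are taken from the problem statement; the three-period horizon and the cost timing (holding cost charged on post-order stock, a fixed cost whenever an order is placed) are my assumptions, not something the problem pins down:

```python
# Finite-horizon inventory MDP solved by backward induction.
# State s = refrigerators on hand; action a = refrigerators ordered.
CAP = 2                             # at most two refrigerators fit in the store
ORDER_COST = 100                    # fixed cost of placing an order (assumed fixed, not per unit)
HOLD_COST = 5                       # cost of storing one refrigerator for a day
SHORTAGE_COST = 150                 # penalty per unit of unmet demand
DEMAND = {0: 0.2, 1: 0.5, 2: 0.3}   # demand distribution for one period

def solve(horizon=3):
    # V[t][s]: minimal expected cost from period t onward with s units on hand
    V = {horizon: {s: 0.0 for s in range(CAP + 1)}}
    policy = {}
    for t in range(horizon - 1, -1, -1):
        V[t], policy[t] = {}, {}
        for s in range(CAP + 1):
            best_a, best_cost = 0, float("inf")
            for a in range(CAP - s + 1):          # order cannot exceed remaining capacity
                cost = (ORDER_COST if a > 0 else 0) + HOLD_COST * (s + a)
                for d, p in DEMAND.items():
                    unmet = max(d - (s + a), 0)   # demand lost this period
                    nxt = max(s + a - d, 0)       # next period's inventory
                    cost += p * (SHORTAGE_COST * unmet + V[t + 1][nxt])
                if cost < best_cost:
                    best_a, best_cost = a, cost
            V[t][s], policy[t][s] = best_cost, best_a
    return V, policy

V, policy = solve()
print(policy[2])   # last-period order quantity, keyed by stock level
```

The point is the shape of the recursion, not the numbers: the value function is indexed by (period, inventory level), the action set is constrained by capacity, and the expectation is taken over the demand distribution. Adjust the cost timing and horizon to match however your course defines them.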

Also, note that a Markov decision process is different from a Markov process; you might be getting these confused.