I am having a bit of trouble wrapping my head around this simple game concept I made up. It goes like this:
You draw a card, lay it face up. I flip a coin. If it's heads, I draw a card and lay it face up. If it's tails, I draw a card and lay it face down, without looking at it.
If the coin came up heads and my card is higher than yours, I win; if mine is lower than or equal to yours, it's a draw. If it came up tails, I can choose to flip my card face up: if it's higher than yours, I win; if it's lower, I lose; and if it's the same, it's a draw. If I choose not to flip my card face up, the game is a draw.
Suits don't matter, and card values go from Ace being the lowest to King being the highest. What are the probabilities that I win, lose, and draw, assuming I play optimally?
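Before doing anything exact, a quick Monte Carlo sketch gives a feel for the numbers. This is just a baseline under a naive strategy (always flip on tails), not the optimal play; `play` is a made-up helper name and the deck model is my assumption (four cards of each rank 1 through 13, suits ignored).

```python
import random

# Monte Carlo baseline: the naive "always flip on tails" strategy.
# Ranks run 1 (Ace) .. 13 (King); the deck holds four cards of each rank.
def play(rng):
    deck = [rank for rank in range(1, 14) for _ in range(4)]
    rng.shuffle(deck)
    yours, mine = deck[0], deck[1]       # your card first, then mine
    if rng.random() < 0.5:               # heads: my card is laid face up
        return "win" if mine > yours else "draw"
    # tails, naive strategy: always flip the face-down card
    if mine > yours:
        return "win"
    if mine < yours:
        return "lose"
    return "draw"

rng = random.Random(0)
trials = 200_000
counts = {"win": 0, "lose": 0, "draw": 0}
for _ in range(trials):
    counts[play(rng)] += 1
for k, v in counts.items():
    print(k, v / trials)
```

With always-flip, the exact values work out to win 8/17, lose 4/17, draw 5/17, and the simulated frequencies land close to those.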
So far, I am pretty sure that if I get heads, I have a 4/17 chance of winning, no chance of losing, and a 9/34 chance of drawing (these already account for the 1/2 chance of getting heads: given heads, I win with probability 8/17 and draw with probability 9/17). However, what's confusing me is the part about getting tails and being able to choose whether to flip my card face up, since I would base that decision on what your card is, right (assuming I want to win, and to draw if I think I can't)? How does that play into the probability if I make the optimal decision? Thanks.
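The heads branch can be checked by exact enumeration, assuming my model is right (your card drawn first, mine from the remaining 51, ranks 1 through 13):

```python
from fractions import Fraction

# Exact enumeration of the heads branch: your card is drawn first,
# then mine comes from the 51 cards left. Ranks 1 (Ace) .. 13 (King).
win = draw = Fraction(0)
for yours in range(1, 14):
    p_yours = Fraction(4, 52)                  # four suits of each rank
    for mine in range(1, 14):
        left = 3 if mine == yours else 4       # copies of that rank left
        p = Fraction(1, 2) * p_yours * Fraction(left, 51)  # 1/2 for heads
        if mine > yours:
            win += p
        else:                                  # lower OR equal draws on heads
            draw += p

print(win, draw)   # 4/17 9/34
```

This confirms the 4/17 win probability, and gives 9/34 for the draw (the two sum to 1/2, the probability of heads, as they must since I can't lose on heads).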
UPDATE: Thanks to the lovely comments, I understood that I needed to fix an optimal strategy before I could pin down the probabilities. To do this, I created functions that take an argument x, representing the threshold: if I got tails, I flip my card only when your card's value is lower than or equal to x. I worked out what the functions would be for this game and plotted them on Desmos:
The red function is the probability of winning, blue is losing, and green is a draw. Keep in mind that the sum of those three functions is 1 for any x (I used this to check that my functions were correct). Now, assuming a win is worth 1 point, a draw 0, and a loss −1, we want to maximize the difference between the winning and losing probabilities, and don't care about draws, as they have no effect. The purple function represents exactly that difference. It turns out the optimal strategy is to flip if your card is lower than or equal to 6 or 7 (the vertex is at 6.5, which obviously isn't a card value, and since the parabola is symmetric, 6 and 7 give the same margin). That matches my initial intuition that the critical card value would be around 6 or 7. So when played optimally, the probability of winning is 90/221 (or 94/221), losing is 10/221 (or 14/221), and drawing is 121/221 (or 113/221); either threshold gives the same expected score of 80/221. Super cool! Thanks everyone, I learned a lot during this.
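To double-check the algebra, here is an exact enumeration in Python of the three probabilities for every threshold x, under the same assumptions as before (ranks 1 through 13, both cards from one 52-card deck, and the threshold rule described above); `outcome_probs` is just a name I made up:

```python
from fractions import Fraction

def outcome_probs(x):
    """Exact win/lose/draw probabilities for the threshold strategy:
    on tails, flip only if your face-up card's rank is <= x."""
    win = lose = draw = Fraction(0)
    for yours in range(1, 14):
        p_yours = Fraction(4, 52)
        for mine in range(1, 14):
            p = p_yours * Fraction(3 if mine == yours else 4, 51)
            # Heads branch (probability 1/2): higher wins, otherwise a draw.
            if mine > yours:
                win += p / 2
            else:
                draw += p / 2
            # Tails branch (probability 1/2): flip only when yours <= x.
            if yours <= x:
                if mine > yours:
                    win += p / 2
                elif mine < yours:
                    lose += p / 2
                else:
                    draw += p / 2
            else:
                draw += p / 2          # leave the card face down
    return win, lose, draw

for x in range(0, 14):
    w, l, d = outcome_probs(x)
    print(x, w, l, d, w - l)   # x = 6 and x = 7 both give margin 80/221
```

The win-minus-lose margin peaks at x = 6 and x = 7, confirming the vertex at 6.5 and the probabilities above.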