I ran across an apparent paradox, which I then found stated in the paper The Box Problem: To Switch or Not to Switch, as follows:
Imagine that you are shown two identical boxes. You know that one of them contains \$b and the other \$2b. Picking one at random and opening it, you must decide whether to keep it (and its contents) or exchange it for the other box.
In short, when you find \$x in a box, the expected value of the other box is 0.5(0.5x) + 0.5(2x) = 1.25x, so it seems always better to switch. This appears to violate the symmetry of the problem and the fact that you still know nothing meaningful about either box.
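To make the computation concrete, here is the naive reasoning written out in a few lines of Python (the function name is just my own label, not something from the paper):

```python
# Naive two-box reasoning: given that the opened box holds x dollars,
# the other box is assumed to hold x/2 or 2x with probability 1/2 each.
def expected_other_box(x):
    return 0.5 * (0.5 * x) + 0.5 * (2 * x)

print(expected_other_box(100))  # 125.0 -- i.e. 1.25x, suggesting "always switch"
```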
The paper goes in another direction, discussing how prior knowledge of the expected values allows a more meaningful analysis (among other things). However, what if there's no prior knowledge and we have the original problem as stated? Can anyone give me some intuition to make sense of this?
EDIT: Someone found this question, which asks a slightly different formulation of an identical problem. The accepted answer there just points to a paper, which I'm having difficulty understanding. It explains away the paradox by noting that the expectation is based on an infinite sum, and that the value depends on the order in which the sum is evaluated. I'm not familiar with how the order of summation can change a sum's value, and I also don't see how a different way of evaluating the expectation explains the strange result outlined above. My math understanding comes mainly from reading textbooks as a hobby, and I haven't yet worked up to fully understanding academic math papers, so a simpler explanation would be helpful.
You asked for "intuition to make sense of this", and that's what I hope to provide. I claim that there is no paradox and that, despite the symmetry of the situation, it IS better to switch.
Suppose you walk into a not-for-profit casino with no house advantage and place a double-or-nothing bet of \$x. There is a 0.5 probability that you will win and receive \$2x (your stake plus \$x winnings), and a 0.5 probability that you will lose and walk away empty-handed. The expected winnings are 0.5*\$2x + 0.5*\$0 = \$x. Thus the game is perfectly 'fair' and there is no statistical indication of whether or not you should play. It's just a matter of how lucky you feel...
However, what if the casino offered to give you back half your stake if you lose? Then you'd be crazy not to play. You'd have a 0.5 probability of winning \$2x and a 0.5 probability of losing and getting back only \$x/2. The expected winnings are now 0.5*\$2x + 0.5*\$x/2 = \$1.25x. It's just like the problem you describe.
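If it helps, the two bets are easy to simulate. This sketch is my own illustration (the stake x = 100 and the trial count are arbitrary choices); the double-or-nothing bet averages out to the stake, while the double-or-halve bet averages about 1.25x:

```python
import random

random.seed(0)
x = 100.0      # the fixed stake
n = 100_000    # number of simulated bets

# Double or nothing: win 2x or walk away with nothing, each with probability 0.5.
nothing = [2 * x if random.random() < 0.5 else 0.0 for _ in range(n)]

# Double or halve: win 2x or get back x/2, each with probability 0.5.
halve = [2 * x if random.random() < 0.5 else x / 2 for _ in range(n)]

print(sum(nothing) / n)  # close to 100: the fair bet, expected value x
print(sum(halve) / n)    # close to 125: the favourable bet, expected value 1.25x
```

By the law of large numbers the sample averages settle near the expected values, so the simulation only confirms the arithmetic above; the point is that the second bet is visibly better for the same fixed stake.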
Hopefully you can now see intuitively that a double-or-halve bet is better than a double-or-nothing bet: since double-or-nothing is a fair bet, double-or-halve is clearly stacked in your favour. Hence you should take it if offered, and that's why it is always good to swap the boxes in the question as originally posed.
This explanation, of course, is only valid for a constant value of x, so it cannot be applied to the original formulation of the 'Two Envelopes' question, a discussion of which is beyond the scope of my evening. I added this disclaimer because I'm wary of being accused of providing an irrelevant 'lay perspective'.