## Mathematical Conundrum or Not? Number Six

• 1.5k
Not sure I can find another conundrum that will generate as much discussion as our last one; however, it is time to move on.

So this one is called the Two Envelopes Paradox and this is how it goes:

You are playing a game for money. There are two envelopes on a table. You know that one contains $X and the other $2X, [but you do not know which envelope is which or what the number X is]. Initially you are allowed to pick one of the envelopes, to open it, and see that it contains $Y. You then have a choice: walk away with the $Y or return the envelope to the table and walk away with whatever is in the other envelope. What should you do?

• 7.6k
By £2X do you mean twice what’s in the other envelope?
• 1.5k

Yes.
• 7.6k
OK, so the initial answer is that it doesn’t matter as there’s a 50% chance of having picked the more valuable envelope and switching doesn’t change the odds.

The paradox supposedly arises when you consider that switching into the more valuable envelope doubles your winnings, whereas switching into the less valuable envelope halves your winnings, so there’s more to gain than there is to lose, and as each is equally likely, switching is the better choice.
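Putting numbers on that reasoning (a quick Python sketch; the function name is my own, just for illustration):

```python
# The "paradoxical" reasoning: the opened envelope holds Y, and the other
# is assumed to hold 2Y or Y/2 with equal probability.
def expected_after_switch(y):
    return 0.5 * (2 * y) + 0.5 * (y / 2)

# For any Y this comes out to 1.25 * Y, which is why switching "looks" better.
print(expected_after_switch(10))  # 12.5
```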
• 7.6k
As a bet it seems like a 2:1 payout? You have £10. You bet £5 on heads. If you win you get £15 back. And you're betting on a coin toss. Is that right?
• 1.5k

I am not sure that is quite right, as you have added information that was not in the OP. In your example you have a starting amount that is known, but in our case you don't know your starting amount.

You could have X or 2X. If you have X and you switch then you get 2X but lose X so you gain X; so you get a +1 X. However, if you have 2X and switch then you gain X and lose 2X; so you get a -1 X.
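Averaged over the two equally likely starting positions, the expected change from switching is $\frac{1}{2}(+X) + \frac{1}{2}(-X) = 0$; in these terms switching neither gains nor loses.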
• 7.6k
I am not sure that is quite right, as you have added information that was not in the OP. In your example you have a starting amount that is known, but in our case you don't know your starting amount.

You bet $\frac{x}{2}$ on heads. If you win you get $\frac{3x}{2}$ back. The odds of winning are $\frac{1}{2}$.
• 7.6k
You could have X or 2X. If you have X and you switch then you get 2X but lose X so you gain X; so you get a +1 X. However, if you have 2X and switch then you gain X and lose 2X; so you get a -1 X.

The amount you have is $x$. The other envelope contains either $2x$ or $\frac{x}{2}$. If it's $2x$ then you gain $x$ by switching. If it's $\frac{x}{2}$ then you lose $\frac{x}{2}$ by switching.
• 1.5k
You just said the exact same thing I did, as the other envelope can only be half your amount if you start with 2X, which still results in a -1 X.
• 1.5k
Your expected gain and loss is tied to the uncertainty of what you start with.
• 7.6k
You just said the same exact thing I did

It's not. Mine is closer to how one would understand it were we to look in the envelope before deciding.

If we open the envelope and see £10 then we know that the other envelope contains either £5 or £20. By switching there's a 50% chance of losing £5 and a 50% chance of gaining £10. Switching seems like the better option.

Am I right with this reasoning? If so, why would not looking first make a difference? The odds are 50% that the other envelope contains twice as much whether we look or don't.
• 1.5k
It's not. Mine is closer to how one would understand it were we to look in the envelope before deciding.

It is the same thing, the math comes out exactly the same.
• 7.6k
It is the same thing, the math comes out exactly the same.

It's not the same thing.

I'm saying that if I have £10 then I either lose £5 by switching or gain £10.

You're saying that if I have £10 and the other envelope contains £5 then I lose £5 by switching and that if I have £5 and the other envelope contains £10 then I gain £5 by switching.

Notice that in my example it's better to switch whereas in your example it isn't.
• 7.6k
So there are two different ways to describe the situation, each leading to a different conclusion:

1. I have $y$ and the other envelope contains either $2y$ or $\frac{y}{2}$.

2. Either I have $x$ and the other envelope contains $2x$ or I have $2x$ and the other envelope contains $x$.
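Working out the expectation of switching under each description (taking the 50/50 split at face value in both):

1. $\frac{1}{2} \cdot 2y + \frac{1}{2} \cdot \frac{y}{2} = \frac{5y}{4} > y$, so switching looks favourable.

2. $\frac{1}{2} \cdot 2x + \frac{1}{2} \cdot x = \frac{3x}{2}$, the same whether you switch or stay, so switching is neutral.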
• 1.5k
I'm saying that if I have £10 then I either lose £5 by switching or gain £10.

There are two possibilities in this case either X = 5 or X = 10

Case one X = 5

If you have 2X and switch then you are left with 5 bucks. You had 10 and now you have 5. You got a -X.

Case two X = 10

If you have X and switch then you are left with 20 bucks. You had 10 and now you have 20. You got a +X.

Exactly what I said before.
• 7.6k

I have £10. If X is 5 then I lose £X by switching. If X is 10 then I gain £X by switching.

So it's either -£X or +£X. This is symmetrical.

My reasoning is:

I have £10. If X is 5 then I lose £5 by switching. If X is 10 then I gain £10 by switching.

So it's either -£5 or +£10. This is not symmetrical.
• 1.5k
My reasoning is:

I have £10. If X is 5 then I lose £5 by switching. If X is 10 then I gain £10 by switching.

So it's either -5 or +10. This is not the same.

Look here:

Case one X = 5

If you have 2X and switch then you are left with 5 bucks. You had 10 and now you have 5. You got a -X.

Case two X = 10

If you have X and switch then you are left with 20 bucks. You had 10 and now you have 20. You got a +X.

Exactly what I said before.

Case one X = 5. You lose 5 bucks.

You had 10 and now you have 5. Means you lost 5 and in that case X = 5. Therefore you got a -X.

Case two X = 10. You gain 10 bucks.

You had 10 and now you have 20. Means you gained 10 and in that case X=10. Therefore you got a +X

The same outcome as you.

You have to understand that X is a variable.
• 1.5k
The fact that X is a variable is what makes this different: even if you know the amount in one envelope, you still don't know whether that amount is X or 2X. So I guess the question is: does the new information of an amount shift the odds at all, given that you still don't know if it is X or 2X?
• 7.6k
Case one X = 5. You lose 5 bucks.

You had 10 and now you have 5. Means you lost 5 and in that case X = 5. Therefore you got a -X.

Case two X = 10. You gain 10 bucks.

You had 10 and now you have 20. Means you gained 10 and in that case X = 10. Therefore you got a +X.

The same outcome as you.

Yes, and the X that you lose is £5 and the X that you gain is £10. Therefore, there's more to gain than there is to lose.
• 1.5k
Only if you have X.
• 7.6k
Only if you have X.

Obviously you'll only win if you have X. My point is that by switching there's a 50% chance of gaining an extra £10 and a 50% chance of losing £5. Those odds favour a switch.
• 7.6k
Here's a program to show what I mean:

```php
<?php

$switch = $no_switch = 0;

for ($i = 1; $i <= 1000000; ++$i) {
    // Our envelope contains £10
    $chosen = 10;

    // The other envelope contains either £5 or £20
    $other = random_int(0, 1) ? 5 : 20;

    // If we switch
    $switch += $other;

    // If we don't switch
    $no_switch += $chosen;
}

echo 'Switch: £' . number_format($switch) . PHP_EOL;
echo 'No Switch: £' . number_format($no_switch);
```


http://sandbox.onlinephpfunctions.com/code/b1d89ae10c4c5a8c3988bcf1112bb0b4f4ed8254

The result after 1,000,000 games:

Switch: £12,500,795
No Switch: £10,000,000
• 7.3k

Two envelopes are e.g. 5 (X) and 10 (2X) - Pick 10(2X) and switch get X - But if you've already picked 2X you cannot get 2(2X). There is no 4X (20). That's no longer a possibility. Once you pick once, you eliminate either a double or a halving. Your array presumes three possibilities 5, 10, and 20. That contradicts the description in the OP. Right?
• 1.5k

Ya, I am moving that way, I am working on it now.
• 7.6k
Two envelopes are e.g. 5 (X) and 10 (2X) - Pick 10(2X) and switch get X - But if you've already picked 2X you cannot get 2(2X). There is no 4X (20). That's no longer a possibility. Once you pick once, you eliminate either a double or a halving. Your array presumes three possibilities 5, 10, and 20. That contradicts the description in the OP. Right?

If your envelope contains £10 then the other envelope contains either £5 or £20 (picked at random for each game).
• 7.3k

If the envelope contains 10 then the original pair must have been either A(5 and 10) or B(10 and 20) according to the OP description. If it was A(5 and 10) then switching gives you 5. There is never any possibility of 20. If it is B(10 and 20) there is never any possibility of 5. The only combinations given that you have a 10 are A or B. Therefore picking 10 is incompatible with both a possibility of 5 and of 20.

Therefore your program, which presumes it is, is incompatible with the OP.
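Under that reading the pair is fixed before the pick. A sketch of it (in Python rather than PHP, and mine rather than from the thread):

```python
import random

# The pair is fixed up front: one envelope holds x, the other 2x.
# We draw one at random, then either keep it or switch to the other.
def simulate(x, trials=100_000):
    stay = switch = 0
    for _ in range(trials):
        envelopes = [x, 2 * x]
        pick = random.randrange(2)
        stay += envelopes[pick]
        switch += envelopes[1 - pick]
    return stay / trials, switch / trials

# Both averages land near 1.5 * x; under this model switching gains nothing.
```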
• 1.5k

I don't think that is quite the right way to look at this.

Let's try mapping this out and go back to our cases, which I hope we are on the same page about now.

Case one X = 5

If you have 2X and switch then you are left with 5 bucks. You had 10 and now you have 5. You got a -X.

Case two X = 10

If you have X and switch then you are left with 20 bucks. You had 10 and now you have 20. You got a +X.

Let's call case one L and case two K, where L is the event you start with 2X and K is the event you start with X. These are two different events.

You open it up and see you have 10 bucks:

In the event of L if you switch then you gain -X, which in this case is a loss of 5 bucks. If you don't switch then you gain 0.

In the event of K if you switch then you gain +X, which is gain of 10. If you don't switch then you gain 0.

These are two different events.

Now let's do it in the 1/2 terms.

In the event of L, if you switch you get -1/2X, which would be a loss of 5 bucks. If you don't switch then you gain 0.

In the event of K if you switch you get 2X, which would be a gain of 10 bucks. If you don't switch then you gain 0.
• 7.6k

There are two equally likely scenarios (say the experimenter tosses a coin to determine which scenario to set up):

1. One envelope contains £10 and the other envelope contains £5.
2. One envelope contains £10 and the other envelope contains £20.

I open my envelope and find £10.

There's a 50% chance that the other envelope contains £5, and so a 50% chance that if I switch then I lose £5 (I did have £10, now I have £5).

There's a 50% chance that the other envelope contains £20, and so a 50% chance that if I switch then I gain £10 (I did have £10, now I have £20).

So switching gives me a 50% chance of gaining £10 and a 50% chance of losing £5. There's more to gain by switching than there is to lose.
• 7.6k
In the event of L if you switch then you gain -X, which in this case is a loss of 5 bucks. If you don't switch then you gain 0.

In the event of K if you switch then you gain +X, which is gain of 10. If you don't switch then you gain 0.

So by switching you either lose £5 or you gain £10 (each equally likely). There's more to gain by switching than there is to lose. Therefore it's better to switch.
• 7.3k

Not sure if you saw my edit:

The only combinations given that you have a 10 are A or B. Therefore picking 10 is incompatible with both a possibility of 5 and of 20. Therefore your program, which presumes it is, is incompatible with the OP.

So switching gives me a 50% chance of gaining £10 and a 50% chance of losing £5. There's more to gain by switching than there is to lose.

No, you never get to that position. Switching gives you a 100% chance of gaining £10 iff you are in scenario two and a 100% chance of losing £5 iff you are in scenario one. You can never be in both. That's impossible by the time you get to seeing the £10. The fact that you don't know which scenario you are in needs to be separated out from the actual possibilities available to you at that time, which are dependent on the reality of the scenario that applies at that time.
• 7.6k
Therefore picking 10 is incompatible with both a possibility of 5 and of 20.

I don't know what you mean by this.

No, you never get to that position. Switching gives you a 100% chance of gaining £10 iff you are in scenario two and a 100% chance of losing £5 iff you are in scenario one. You can never be in both. That's impossible by the time you get to seeing the £10. The fact that you don't know which scenario you are in needs to be separated out from the actual possibilities available to you at that time, which are dependent on the reality of the scenario that applies at that time.

I don't accept this interpretation of probability. Say you toss a coin and if it's heads you put a blue ball in a box and if it's tails you put a red ball in a box. You give me the box and ask me to guess if it's blue or red (and I know the rules).

According to your reasoning, all I can say is that if it was heads then there's a 100% chance of a blue ball and if it was tails then there's a 100% chance of a red ball.

Whereas I'd say that there's a 50% chance of a blue ball and a 50% chance of a red ball.

But we can amend the OP to account for your kind of interpretation. You say that you are going to toss a coin and if it's heads then you will put £5 in an envelope and if it's tails then you will put £20 in an envelope. You give me the option of buying this envelope for £10 before you toss the coin. What should I do? I would buy the envelope, because there's a 50% chance of me gaining £10 and a 50% chance of me losing £5, so there's more to gain than lose.
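That amended game is easy to simulate; a sketch in Python (illustrative, with the setup exactly as described above):

```python
import random

# Amended game: pay £10 for an envelope; a coin toss decides whether it
# holds £5 (heads) or £20 (tails).
def average_profit(trials=100_000):
    total = 0
    for _ in range(trials):
        contents = 5 if random.random() < 0.5 else 20
        total += contents - 10
    return total / trials

# The average profit per game comes out near +2.5, so buying is favourable.
```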