• Jeremiah
    1.5k
    Repeating over and over that this is "not a statistics problem" is not a valid response to anything I am doing. That is nothing but blind categorical dismissal.

    I will not consider "this is not a statistics problem" as valid or thoughtful criticism and I will continue to evaluate this by the methods I consider most appropriate.
  • JeffJo
    130
    That's fine with me. In that case, one must be open to embracing both horns of the dilemma, and realize that there being an unconditional expectation 1.25v for switching, whatever value v one might find in the first envelope, isn't logically inconsistent with ...
    Pierre-Normand
    But it isn't logically consistent. With anything. That's what I keep trying to say over and over.

    1.25v is based on the demonstrably-false assumption that Pr(X=v/2)=Pr(X=v) regardless of what v is. It's like saying that the hypotenuse of every right triangle is 5 because, if the legs were 3 and 4, the hypotenuse would be 5.

    • Exp(other) = (v/2)*Pr(picked higher) + (2v)*Pr(picked lower) is a mathematically incorrect formula, because it uses the probabilities of the wrong events.
    • Exp(other) = (v/2)*Pr(V=v|picked higher) + (2v)*Pr(V=v|picked lower) is the mathematically correct formula, because it uses the probabilities of the correct events.

    The only thing we need to understand in order to "embrace the dilemma" is why this is so. It isn't simply that one is conditional and one is unconditional; it is that the events used in the first do not represent a value.
  • Srap Tasmaner
    4.6k
    We're given a choice between envelopes valued unequally at a and b. We won't know which one we picked. The expected value of switching is

    (1/2)(a−b) + (1/2)(b−a) = 0

    Since a and b are both greater than 0, a/b and b/a are both defined. We can, if we like, also say

    a = (a/b)b

    or

    b = (b/a)a

    for whatever reason.

    For instance, if a/b = 1/2, then
    a = b/2 and b = 2a.
    Then

    (1/2)(b/2 − b) + (1/2)(b − b/2) = 0

    or

    (1/2)(a − 2a) + (1/2)(2a − a) = 0

    Isn't all of this true whichever of a and b is larger, and whatever their ratio?
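The zero-expectation claim above can be checked with a quick Monte Carlo sketch (mine, not from the thread): for any fixed pair of amounts (a, b), blindly picking an envelope and then switching has expected gain (1/2)(a−b) + (1/2)(b−a) = 0.

```python
import random

def mean_switch_gain(a, b, trials=100_000, seed=0):
    """Average gain from always switching, for a fixed pair (a, b)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        mine, other = rng.sample([a, b], 2)  # random blind pick of one envelope
        total += other - mine                # gain from switching to the other
    return total / trials

print(mean_switch_gain(5, 10))    # hovers near 0
print(mean_switch_gain(3, 100))   # still near 0, whatever the ratio
```

The result stays near zero whichever of a and b is larger, which is exactly the point of the post.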
  • Michael
    14k
    Before looking, the expected value of switching is (1/2)(a−b) + (1/2)(b−a) = 0.

    After seeing v, if the other envelope is equally likely to hold v/2 or 2v, its expected value is (1/2)(v/2) + (1/2)(2v) = 1.25v.

    The issue is that we’re approaching the problem in two different ways. See here. Both statements are true. Our concern isn’t with defending one or the other statement but in showing that one or the other is appropriate given the context. Which is the rational position for a single play of the game after opening the envelope?
  • Srap Tasmaner
    4.6k

    If you know only one value, you don't have enough information to prefer either sticking or switching. Flip a coin. That's what you did the first time you chose an envelope, and that's what you're doing now.
  • Michael
    14k
    So we’re assuming that the other envelope is equally likely to contain either £20 or £5, and that’s a reason to switch. We either lose £5 or gain £10. That, to me, is a reasonable gamble.
  • JeffJo
    130
    We're given a choice between envelopes valued unequally at a and b. We won't know which one we picked. The expected value of switching is

    (1/2)(a−b)+(1/2)(b−a)=0

    ...

    Isn't...this true whichever of a and b is larger
    Srap Tasmaner

    Certainly. It is a complicated way of saying that, before you choose, the expected values of the two envelopes are the same. There is even a simpler way to get there: the total amount of money is (a+b), so the expected value of either envelope is (a+b)/2.

    But this applies only if we don't look inside one.

    If we do look, and see v, then we are considering two possible pairs of envelopes, not the one pair you described. The pair is either (a,v) or (v,b), where a<v<b. Now there are two random choices - one for which pair was picked, and one for which envelope in the pair was picked.

    Let Q be the probability that the pair was (a,v), so the probability that it was (v,b) is (1-Q). Before we look:
    1. There is a Q/2 probability that our envelope contains a.
    2. There is a Q/2 + (1-Q)/2 = 1/2 probability that our envelope contains v. But this breaks down into
      1. A Q/2 probability that we have the high envelope, and it is v.
      2. A (1-Q)/2 probability that we have the low envelope, and it is v.
    3. There is a (1-Q)/2 probability that our envelope contains value b.

    If our envelope has a, we are in case 1. If our envelope has b, we are in case 3. But if it has v, we know that we are in case 2, BUT IT COULD BE case 2.1 or case 2.2. This still works out as a 50:50 chance that we picked high or low before we look; that is, cases 1 and 2.2 add up to 1/2, as do cases 2.1 and 3.

    But what if we look?
    • If we see a, there is a 100% chance we picked low.
    • If we see b, there is a 100% chance we picked high.
    • If we see v, the chance is Q that we picked high, and (1-Q) that we picked low. We get this by dividing the probabilities in 2.1 and 2.2, by the one in 2.

    The expectation for the other envelope, given that ours contains v, is a*Q + b*(1-Q). If a=v/2 and b=2v, this reduces to 2v - 3vQ/2. And finally, it is only if Q=1/2 that it reduces further to 5v/4.
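The Q-dependent expectation above can be sketched with a short simulation (my code; the prior Q is an assumed parameter, not something the player knows): with probability Q the pair is (v/2, v), with probability 1−Q it is (v, 2v), and we estimate the other envelope's value conditional on seeing v.

```python
import random

def other_given_v(v, Q, trials=200_000, seed=1):
    """Estimate E[other envelope | ours shows v] under prior Q."""
    rng = random.Random(seed)
    total, count = 0.0, 0
    for _ in range(trials):
        pair = (v / 2, v) if rng.random() < Q else (v, 2 * v)
        mine, other = rng.sample(pair, 2)
        if mine == v:          # condition on the observed value
            total += other
            count += 1
    return total / count

v = 10.0
print(other_given_v(v, 0.5))   # near 12.5, i.e. 2v - 3vQ/2 at Q = 1/2 (the 5v/4 case)
print(other_given_v(v, 0.8))   # near 8.0, i.e. 2v - 3vQ/2 at Q = 0.8 -- not 1.25v
```

Only Q = 1/2 reproduces the 5v/4 figure; any other prior gives a different conditional expectation, matching the post's conclusion.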
  • Pierre-Normand
    2.2k
    But it isn't logically consistent. With anything. That's what I keep trying to say over and over.

    1.25v is based on the demonstrably-false assumption that Pr(X=v/2)=Pr(X=v) regardless of what v is. It's like saying that the hypotenuse of every right triangle is 5 because, if the legs were 3 and 4, the hypotenuse would be 5.
    JeffJo

    What you are saying is correct in any case (most cases?) where the prior probability distribution of the envelope values isn't unbounded and uniform. In the case where it is, then there is no inconsistency between the expected value of the unseen envelope being 1.25v conditionally on the value of the seen envelope being v, and this being the case regardless of which one of the two envelopes has been seen.

    Exp(other) = (v/2)*Pr(picked higher) + (2v)*Pr(picked lower) is a mathematically incorrect formula, because it uses the probabilities of the wrong events.

    Actually, the formula is correct in the special case where the prior distribution is uniform and unbounded, since, in that special case, the conditional probabilities Pr(picked higher|V=v) and Pr(picked lower|V=v) remain exactly 1/2 whatever v might be. In the more general case, the formula should rather be written:

    Exp(other) = (v/2)*Pr(picked higher|V=v) + (2v)*Pr(picked lower|V=v)

    Exp(other) = (v/2)*Pr(V=v|picked higher) + (2v)*Pr(V=v|picked lower) is the mathematically correct formula, because it uses the probabilities of the correct events.

    Are you sure you didn't rather mean to write "Exp(other) = (v/2)*Pr(picked higher|V=v) + (2v)*Pr(picked lower|V=v)"?
  • Pierre-Normand
    2.2k
    Isn't all of this true whichever of a and b is larger, and whatever their ratio?
    Srap Tasmaner

    Yes, this is true of the unconditional expected values of sticking or switching. But those are not the same as the conditional values of sticking or switching (conditional, that is, on the specific value of one of the two envelopes). In the case where the prior distribution isn't uniform and unbounded, then, the unconditional values of sticking and switching are equal to each other, and they are finite. In the case where the prior distribution is uniform and unbounded, the unconditional values still are equal to each other since they are both infinite. And also, the conditional values of switching or sticking both are equal to 1.25v, conditional on v being the value of any one of the two envelopes. (The trick, of course, is to see why this doesn't lead to a contradiction in the case where the prior distribution is uniform and unbounded. It's the same reason why the guests of the Hilbert Rational Hotel are rationally justified to blindly move to new rooms, and, if they have already moved, are rationally justified to blindly move back to their previous rooms).
  • Pierre-Normand
    2.2k
    So we’re assuming that the other envelope is equally likely to contain either £20 or £5, and that’s a reason to switch. We either lose £5 or gain £10. That, to me, is a reasonable gamble.
    Michael

    It's not necessarily equally likely. It may be equally likely, conditionally on £10 being the value of the first envelope, but it may also be more likely that the other envelope contains £20, or more likely that it contains £5. In the first two cases, you may fare better if you switch. In the third case, you fare worse. (It ought to be twice as likely that the other envelope contains £5 rather than £20 for the switching strategy to break even). On (weighted) average, over all the possible values that you can be dealt initially, you don't fare any better by switching. Only in the case where the prior distribution of possible envelope values is uniform and unbounded do you have a positive expected gain from switching conditionally on any value v being seen in your envelope.
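The break-even claim in the parenthesis checks out with a line of arithmetic (my check; p stands for the chance that the other envelope holds £20):

```python
def switch_expectation(p):
    """Expected value of the other envelope, seen from £10:
    £20 with probability p, £5 with probability 1 - p."""
    return 20 * p + 5 * (1 - p)

print(switch_expectation(1 / 3))   # break even: equals 10, so P(5) = 2 * P(20)
print(switch_expectation(1 / 2))   # 12.5 -- the naive 1.25v figure
```

Setting 20p + 5(1−p) = 10 gives p = 1/3, i.e. £5 must be exactly twice as likely as £20 for switching to break even.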
  • Jeremiah
    1.5k
    You never need to know the distribution or even guess at it.

    We have x for some unknown population, where x is some positive real number on a countable domain.

    Then you have two more variables, the envelopes A and B, to which the contents of x bear a relationship.

    Which can be displayed with this makeshift table.
         A    B
     1   x    2x
     2   2x   x
    

    1 and 2 represent which case you are in.

    It is important to understand the significance of this relationship,
    as
    Pr(A = x) = Pr(B = 2x)
    and
    Pr(A = 2x) = Pr(B = x)
    which means
    Pr(A) = Pr(B)

    Knowledge of the distribution is not needed, nor is a prior, as the value in envelope A has the same exact chance of occurring as the value in envelope B.

    Consider an example where we know the contents of both envelopes:

    A = 10 and B = 20

    You get A and you see the 10. What you didn't know, however, is that there was only a 2% chance of getting this 10 bucks, which means, due to the relationship, there was only a 2% chance that B ended up with the 20 it did. This equality will be true in every single case for every possible distribution.

    Now you might be tempted to think: well, what about the chance of 5 bucks? I have ten; the other one may have 5. It won't matter; the equality would hold: it would just be a 2% chance for 10 and likewise a 2% chance for 5. Whatever the case may be, and whatever you think might be in the other envelope, that equality will always be true. That is one thing you know and can count on.

    This means Pr(A) = Pr(B), and since probability must sum to one and you only have two choices, it is a 50/50 split. Some people have been ignoring this relationship and treating the envelopes as merely two different amounts, but that is not congruent with the OP and it is an error when considering the probability.

    Whatever the chance of the value you see when opening your envelope, the contents of the other envelope have exactly the same chance.
  • Jeremiah
    1.5k
    This is not a paradox, and in fact the solution is straightforward and simple. This is an exercise in observational bias: separating the objective and the subjective without all those mitigating safeguards we are so used to.

    People come up with an idea of how they think it works, then they model their belief and when their model matches their beliefs they decide that confirms their beliefs, it turns into a type of confirmation bias. Obviously if you model your subjective beliefs then your model will confirm your subjective beliefs, that is no paradox. You have to separate the objective process from the subjective process.

    The secret is to realize that the 1.25v expectation is a trap.
  • Michael
    14k
    I know it's not necessarily equally likely. But we're assuming it, hence why Srap said we should just flip a coin.

    If there's no reason to believe that we're more likely to lose than win on switching, i.e. if there's no reason to prefer sticking, and if we can afford to lose, then switching is a good gamble for a single game, even if not a winning strategy over many games. I either lose £5 or I gain £10. That's a bet worth making for me, and so if I were to play this game and find £10 in my envelope then I would switch.
  • Pierre-Normand
    2.2k
    If there's no reason to believe that we're more likely to lose than win on switching, i.e. if there's no reason to prefer sticking, and if we can afford to lose, then switching is a good gamble for a single game, even if not a winning strategy over many games. I either lose £5 or I gain £10. That's a bet worth making for me, and so if I were to play this game and find £10 in my envelope then I would switch.
    Michael

    I would say that, if it's not a winning strategy over many games of the same kind, then it's not a rational strategy over just one game of that kind, unless you aren't averse to playing money games with negative or null expected value. Playing the game only once merely increases the variance; it changes nothing about the expected value. (Although, as andrewk earlier noted, what choice you ought to make also depends on what your personal utility valuations are; here I am simply assuming that the player's only goal is to act such as to maximize expected value).

    What would make switching worth it is if the chance of your losing £5 were less than twice as large as your chance of winning £10. But you don't know that to be the case either. If you naively rely on the principle of indifference, this will lead you to make a mistake in every case where you are playing this game, are dealt an envelope with value v, and, conditional on the amount dealt to you being v, the chance of your losing £5 is more than twice as large as your chance of winning £10. In the other cases, your choice to switch yields a null or positive expected value. The only thing that you know for sure is that, over the long run, such mistakes would exactly cancel out your gains. So the expected gain from switching, when you have no clue at all where the value v that you are holding is located within the probability distribution of the possible envelope values, is exactly zero. It is not 1.25v.
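The long-run cancellation can be illustrated under an assumed bounded prior (the finite set of lower values below is my example, not the thread's): the lower amount x is drawn from a finite set, the pair is (x, 2x), and the player always switches.

```python
import random

def always_switch_gain(trials=300_000, seed=2):
    """Average gain of the always-switch strategy under a finite prior."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        x = rng.choice([5, 10, 20, 40])       # assumed finite prior on the lower value
        mine, other = rng.sample([x, 2 * x], 2)
        total += other - mine                 # gain from switching
    return total / trials

print(always_switch_gain())   # hovers near 0: no unconditional gain from switching
```

Conditional on some observed values the switcher wins (e.g. seeing £5 guarantees the other holds £10) and on others loses (seeing £80 guarantees £40), and over the whole prior these exactly cancel.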

    Lastly, if you have no clue at all what the distribution is, and you expect any possible distribution to be equally likely with no constraint at all on the maximum or minimum amounts possibly (and plausibly) being contained in the envelopes, then, yes, the expected value of switching, conditionally on v being the value of your envelope, is 1.25v. But that can only happen in a world where the game master has an infinite fund of money.
  • Michael
    14k
    But you're also not saying that sticking is a winning strategy. If sticking isn't preferred then I am going to switch, because I am willing to risk losing £5 for the chance to win £10. I have more to gain than I have to lose. That neither strategy gains over the other after repeated games doesn't change this.

    I am simply assuming that the player's only goal is to act such as to maximize expected value
    Pierre-Normand

    Right, and if I stick then I'm guaranteed to walk away with £10, whereas if I switch then I either walk away with £5 or £20. £20 is greater than £10, and a greater difference than that between £5 and £10, and so unless I have a reason to believe that £20 is less likely than £5, why wouldn't I switch? Because if I play the game repeatedly (with different values in my chosen envelope) then I don't gain over the person who never switches? That seems like a non sequitur.
  • Michael
    14k
    So what I'm saying is that if a) I have no reason to believe that losing is more likely (specifically >= 2/3 chance), b) I gain twice as much if I win than I lose if I lose, and c) I can afford to lose, then I have a good reason to switch when playing a single game.

    I don't see how the expected value of either always switching or always sticking over repeated games has any bearing on this.
  • Pierre-Normand
    2.2k
    But you're also not saying that sticking is a winning strategy. If sticking isn't preferred then I am going to switch, because I am willing to risk losing £5 for the chance to win £10. I have more to gain than I have to lose. That neither strategy gains over the other after repeated games doesn't change this.
    Michael

    That only makes sense if you favor taking a chance at gaining a larger amount A over the smaller amount B that you can possibly lose, irrespective of their relative probabilities and, hence, irrespective of the expected value of the outcome.

    Suppose I offer you to play a game with two dice. You throw them once and sum them up. If you roll any value from 2 to 11, you must give me £5. If you roll 12 then I must give you £10. Let us assume that we are only going to play this game once. Would you also say, in this case, that you are willing to risk losing £5 for the chance to win £10?
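The expected value of this dice game works out as follows (my arithmetic): a sum of 12 comes up with probability 1/36 and wins £10; any other sum loses £5.

```python
from fractions import Fraction

# Exact expected value of one play of the dice game.
p_win = Fraction(1, 36)                 # only double sixes win
ev = p_win * 10 - (1 - p_win) * 5       # win £10, else lose £5
print(ev, float(ev))                    # -55/12, about -4.58 per play
```

So the gamble loses roughly £4.58 per play on average, which is why knowing the odds matters here.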
  • Michael
    14k
    Suppose I offer you to play a game with two dice. You throw them once and sum them up. If you roll any value from 2 to 11, you must give me £5. If you roll 12 then I must give you £10. Let us assume that we are only going to play this game once. Would you also say, in this case, that you are willing to risk losing £5 for the chance to win £10?
    Pierre-Normand

    No, because I know the probabilities aren't in my favour. If I know that they're not in my favour then I won't play. If I know that they're in my favour then I will play. If I don't know the odds then I will play.
  • Pierre-Normand
    2.2k
    No, because I know the probabilities aren't in my favour. If I know that they're not in my favour then I won't play. If I know that they're in my favour then I will play. If I don't know the odds then I will play.
    Michael

    In the two envelope case, you don't know the odds of winning. But you do know (or ought to be able to deduce) that the odds are neither in your favor nor in your disfavor. The expected gains from switching and from sticking are both zero. That is the case, anyway, on the assumption that the game master doesn't have access to infinite funds.
  • Michael
    14k
    Suppose I offer you to play a game with two dice. You throw them once and sum them up. If you roll any value from 2 to 11, you must give me £5. If you roll 12 then I must give you £10. Let us assume that we are only going to play this game once. Would you also say, in this case, that you are willing to risk losing £5 for the chance to win £10?
    Pierre-Normand

    So to make this a better analogy, let's say that some third party asks us both to play the game. He will roll two dice, and if I win then you give me £10 and if I lose then I give you £5. He doesn't tell us what result counts as a win for me and what counts as a win for you. It could be that 2-11 is a win for you, or it could be that 2-11 is a win for me, or it could be that 2-6 is a win for me.

    I would be willing to play, as I have more to gain than I have to lose. You, presumably, wouldn't be willing to play, as you have more to lose than you have to gain.
  • Michael
    14k
    In the two envelope case, you don't know the odds of winning. But you do know (or ought to be able to deduce) that the odds are neither in your favor nor in your disfavor
    Pierre-Normand

    How is this any different to saying that I'm equally likely to win as lose?
  • Pierre-Normand
    2.2k
    How is this any different to saying that I'm equally likely to win as lose?
    Michael

    It is obviously different, since on the assumption that you are equally likely to win as lose it follows that the expected gain from switching is 0.25*v, whereas saying that the odds neither are nor aren't in your favor is equivalent to saying that the average expected value of switching is v.

    (I'll come back to this conversation tomorrow)
  • Michael
    14k
    It is obviously different since on the assumption that you are equally likely to win as lose it follows that the expected gain from switching is 0.25*v whereas saying that the odds neither are nor aren't in your favor is equivalent to saying that your average expected value from switching is v.
    Pierre-Normand

    I don't disagree that the average expected value (from repeated games) is v. But that's not what I'm talking about. What I'm talking about is what I said here:

    ... if a) I have no reason to believe that losing is more likely (specifically >= 2/3 chance), b) I gain twice as much if I win than I lose if I lose, and c) I can afford to lose, then I have a good reason to switch when playing a single game.

    I don't see how the expected value of either always switching or always sticking over repeated games has any bearing on this.
  • Michael
    14k
    Imagine you're given £100 and are offered the choice to pay £100 to play a game with a 5/6 chance of winning (say a dice roll). If you win then you win £1,000,000 and if you lose then you lose all the money you've won up to that point.

    The average return for repeated games is 0, as you're almost certain to lose at some point. But playing it just once? That's worth it.

    This is why I think talking about average returns over repeated games is a red herring.
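The contrast between the one-shot expectation and the long-run outcome can be sketched like this (my code; the stake-and-restake structure is assumed from the post): one play has a hugely positive expectation, yet the chance of surviving n re-staked plays shrinks geometrically.

```python
from fractions import Fraction

# One play: 5/6 chance to win £1,000,000 for a £100 stake.
single_play_ev = Fraction(5, 6) * 1_000_000 - 100
print(float(single_play_ev))               # about 833,233 for a single play

# Repeated play, risking everything each time: survival probability (5/6)^n.
for n in (1, 10, 50):
    print(n, float(Fraction(5, 6) ** n))
```

By 50 plays the survival probability is about one in ten thousand, which is the sense in which the repeated-game return collapses even though each individual play looks irresistible.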
  • Jeremiah
    1.5k
    You are only willing to make that trade because it is only 10 bucks. That is not much to lose in the first place. It is a subjective criterion and it does not say anything about the actual probability.

    What would you do if you opened envelope A and saw 1000 bucks?
  • Jeremiah
    1.5k
    There is going to be a subjective risk factor of how much a person will be willing to gamble with. That will vary from person to person and doesn't say anything about the objective gain/loss.
  • Jeremiah
    1.5k
    It's not necessarily equally likely
    Pierre-Normand
    I know it's not necessarily equally likely.
    Michael

    Actually it is. Whatever is in the other envelope had the same chance to occur as what is in your envelope. If you have 10 and that had a 75% chance to occur, then any possible value in the other envelope must also have a 75% chance to occur. If the other envelope has 5 bucks then it has a 75% chance to occur. If it is 20 then it has a 75% chance to occur. The envelopes will always share this relationship, always. So whatever you consider as the possible outcome, all considerations are necessarily equally likely.
  • Michael
    14k
    You are only willing to make that trade because it is only 10 bucks. That is not much to lose in the first place. It is a subjective criterion and it does not say anything about the actual probability.

    What would you do if you opened envelope A and saw 1000 bucks?
    Jeremiah

    I already said that you have to be willing to lose. If I knew for a fact that I had a 10% chance of losing £1,000,000 and a 90% chance of winning an extra £1,000,000 then I wouldn't bet, despite the greater probability and greater expected value, because I need that first million far more than the second million.
  • Jeremiah
    1.5k
    My point is that your personal criteria are something that we cannot objectively measure.
  • Michael
    14k
    My point is that your personal criteria are something that we cannot objectively measure
    Jeremiah

    That there's more to gain than there is to lose is what can be objectively measured. Either the envelopes contain £5 and £10 and I lose £5 by switching or the envelopes contain £10 and £20 and I gain £10 by switching.

    That I'm willing to bet that it's the latter is, of course, a subjective matter. I have no reason to believe that the former is more likely, and as a risk of £5 is acceptable to me, switching is a no-brainer.

Welcome to The Philosophy Forum!
