The New York Times has an interesting series of tests and explanations that show why and how the human brain makes errors in estimating probability—and consequently, why we get suckered even if we think we’re overall pretty smart. To start things off, play the Times’ online version of the “Let’s Make a Deal” game, where you pick one of three doors, then you can read up on how it works. The game brought in a bunch of reader responses (and arguments), so the author, John Tierney, offered a few more thought experiments you can try if you need something to keep you distracted from your job. In today’s column, Tierney talks about why so many people naturally make errors with probability and gets a plain-English explanation from a couple of marketing and psychology experts.

I’m proud to say I instinctively reasoned through the Monty Hall game correctly, which is a surprise, since I have a natural ability to screw up pretty much any probability question thrown my way.

Part I: “The Monty Hall Problem” [New York Times]

“Interactive Monty Hall Game” [New York Times]

Part II: “Monty Hall’s Other Problems” [New York Times]

Part III: “The Psychology of Getting Suckered” [New York Times]

(Image: New York Times)

the game doesn’t make any sense in switching. the odds don’t change just because they open a door. now if there were more than three doors, i might agree, but your only other choice is one door. how is that a choice? if i stay with the same door, my odds are still 2 out of 3. i understand the reasoning behind what they are saying, i just think it doesn’t apply when there are only 3 doors; once there are only 2 doors, you don’t really have a choice.

as a math major, i call bull shit on this. there is no right or wrong on this game.

This is a tricky problem. Switching is ALWAYS right. If you don’t switch, you only win when you were right, 1/3. If you DO switch, you lose whenever you were initially WRONG. Subtle, but true.

Snoop -

It is because you only have a 33% chance of picking the right door the first time, and a 66% chance of picking the wrong door. The reason this works is because the host knows a “wrong” door to open, so it becomes statistically smart to change your door. You only had a 1 in 3 chance of picking the “right” door the first time, so by switching (Only when the guy opening the door knows which one has the goat) you get on the better side of that equation. Note that this is NOT the case if the host doesn’t know where the prize is, and just opens one randomly.
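This explanation is easy to check empirically, in the spirit of the Times’ interactive game. A minimal Python sketch (the trial count is an arbitrary choice, not anything from the column):

```python
import random

def play_monty(switch, trials=100_000):
    """Simulate the Monty Hall game: the host, who knows where the car is,
    always opens a goat door that the player did not pick."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        # Host opens a door that is neither the player's pick nor the car.
        opened = random.choice([d for d in range(3) if d not in (pick, car)])
        if switch:
            # Switch to the single remaining unopened door.
            pick = next(d for d in range(3) if d not in (pick, opened))
        wins += (pick == car)
    return wins / trials

print(f"stay:   {play_monty(switch=False):.3f}")   # close to 0.333
print(f"switch: {play_monty(switch=True):.3f}")    # close to 0.667
```

Run it a few times: staying hovers around 1/3 and switching around 2/3, exactly the split the comment describes.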

@snoop-blog:

Yup, no right or wrong, just what gives you the highest statistical chance of getting the prize. It is still a gamble, but a smarter gamble is one with the odds in your favor, no?

This game was actually in the movie 21. Here is some more info: [en.wikipedia.org]

I can’t figure out why he’s right, but he certainly seems to be from my trial. I switched 15 times and got the car 87% of the time. I stayed 11 times and got the car 36% of the time. I realize I didn’t have enough trials, but it was very obvious from the beginning. Mathematically, I can’t figure it out, even after reading the article twice.

PLEASE read the article before you “call bull shit”. PLEASE think before you write your one-line, no-capitalization recap of why a long-proven mathematical and statistical concept is false. This is an interesting, if not instinctively obvious, theorem that has stood the test of time.

When you first selected a door, you had a 1/3 chance of being right and a 2/3 chance of being wrong. After the other door is opened to reveal the “goat”, you know that your door still has a 1/3 chance of being right; but now instead of there being 2 other doors (which together held the other 66.7% chance), there is only 1 other door – which by itself has a 66.7% chance of being the right door.

The math not only works, but if you had READ the article, OR tried the example, you would see that it is statistically and mathematically true.

I hate people who dismiss things like this without giving it some thought – use your brain for a minute.

/rant

@BloggyMcBlogBlog: ha you beat me to it

@TylerE: Don’t you mean you lose when you were initially right?

@MrFalcon: So if you had a 33% chance of picking the right door the first time, what is your chance of switching and winning? It’s obviously not 50/50 per this article, so what is it?

@mdkiff: the one line was an add-on to the first comment, dick. and i refuse to capitalize on blogs. this isn’t a gd term paper, nor a book. attacking commenters on here is a troll move. you notice how @MrFalcon: managed to show a little class, something you’ll never have.

Wow, I never thought I’d see the Monty Hall problem on Consumerist.

@snoop-blog: As a math major myself… you’re wrong.

@snoop-blog: Too much wake n bake maybe?

@snoop-blog: Did you fail math class?

@qwickone:

Stated above by mdkiff, your chance at picking the “right” door by switching is 66%. As the article states, the trick in all this is that people exclude the option that is shown to not be right, and then they don’t think it matters anymore. Again, so long as the person opening a door to show the “goat” knows that it doesn’t include the “prize”, you have a better chance at winning by switching. If you stay with your first pick, you only have a 33% chance at winning. Can you still win by staying? Yes, but not as often as by switching.

Zonk

The Monty Hall game is flawed in one important way; it assumes that you want a car over the goat. For me, I am a big goat fan.

The best way to think about this problem is not with doors but with ‘groups of doors’. Say the door you pick is Group A, and the other 2 doors are Group B. Now if I ask you which group probably has the car, you would automatically think Group B, since there are more doors there and thus a higher probability. When I reveal some of the doors in Group B, it doesn’t change the fact that Group B is still better statistically, no matter what I reveal about the items within that group.

Reader Jim’s response for the case with 10 doors (say you pick door 1), Monty reveals 7 goats, and you have the option of switching to 5 or 8:

If my math is correct, that means Jim’s initial door has a -10% chance in order for total to add to 100%.

@snoop-blog:

“there is no right or wrong on this game.”

Ding! Ding! Ding! First poster out of the gate and we already have a sucker!

@qwickone: It’s 33.33333% vs 66.66666%. If you switch, you are actually picking the other 2 doors without fear of losing if one of those 2 happens to be wrong.

even if your chances of success are 66%, there’s still a chance of being wrong, was all i was saying. i understand the stats.

@snoop-blog: @MrFalcon:

“Yup, no right or wrong, just what gives you the highest statistical chance of getting the prize.”

Ooh, maybe I’m the sucker. I take back my snarkiness, snoop-blog, on account of a linguistic technicality. I’ll give you the benefit of the doubt.

Please accept my humble apology. You are a scholar and a gentleman.

@snoop-blog: Your first comment:

If I recall Monty Hall, though, he would offer and even encourage the switch, no matter what prize was where. He wasn’t steering you to the car or the goat, as much as making you worry as to whether you’d get car or goat.

With 3 doors, your odds are ALWAYS 50% after one door is revealed.

Initially, you have 3 doors, you need to pick one. The chance that the door you picked has a car is 1/3 (33%)

Now, you have been shown that one of the doors is NOT a car. So, you have your original choice or you can switch.

At this point, having one door revealed, there are only two doors left: the one you picked, and the other unknown door. The car is behind one of those two, so your original “1 in 3” has been reduced to “1 in 2”. (You’re not going to pick the door that was revealed – it is taken out of the equation.)

If you stick with your original choice now, you have a 50-50 chance.

If you pick the other door, you still have a 50-50 chance.

The car is behind one of the two doors; it doesn’t matter what you picked before. You are given a chance to stick with your original choice or pick the other one. In essence, you are given a chance to “choose again” in a 50-50 environment.

if you go with the 66%, you may be unlucky and still lose. so is this more about luck? in a completely random game, absolutely.

“the game doesn’t make any sense in switching. the odds don’t change just because they open a door.”

I used to agree with the doubters. But the problem I had was thinking that the selection was random. It is not random.

First, when they open that first door, they never open your door. Thus, the choice to open that first door was not random.

Second, when they open that first door, they never open a door with the prize behind it. Thus, once again, it is not random.

Once you realize that, it makes sense. And even if it does not make sense, as qwickone points out, you can verify it empirically.

@Dooley: exactly the way i understood it, but the article with the game disagrees with your 50-50.

my only point is the whole thing is luck. on a game like this you’re always better to stick with your gut (put yourself in a real game like this). if you chose right the first time, but changed your answer, then found out your first choice was right, you’d be kicking yourself saying “why didn’t i just stick with my gut”. i doubt you would be like “oh well, at least i picked the best door statistically.”

@Imafish: yeah it wasn’t until after my first post that i realized the game was not random.

Have you ever heard that the grass is always greener on the other side? It’s not! As dumb as it may sound, those old proverbs make sense most of the time…

What if I wanted a goat?

“i doubt you would be like ‘oh well, at least i picked the best door statistically.’”

But if you did pick the “best door statistically”, you’d be less likely to kick yourself in the first place. So I’d go with that.

In fact, I can’t think of a reason to go against statistics in a game of chance other than a desire to lose.

@snoop-blog:

“my only point is the whole thing is luck. on a game like this your always better to stick with your gut (put yourself in a real game like this). if you chose right the first time, but changed your answer, then found out your first choice was right, you’d be kicking yourself saying “why didn’t i just stick with my gut”. i doubt you would be like “oh well, at least i picked the best door statistically.””

Hi, I work for the Bellagio. We’d really LOVE to have you come visit our casino. A lot. Really, mortgage your house and come here. Please.

Oh my, this new image feature sure is nifty.

I’m not exactly going into a maths-heavy field, so I took fairly elementary math courses. My professor turned it into a “how to think like a sensible person” course and spent the final month, no exaggeration, talking about probability.

She printed off erroneous Ask Marilyn columns from Parade magazine and spent entire lectures correcting them, made us act out “Let’s Make a Deal” in class, and gave us hours of homework that forced us to apply the probability questions to our everyday lives. I hated her as a person, but that professor did us a big favor. She was right, probability makes us look really stupid.

@snoop-blog: Unfortunately when considering probabilities intuition can lead you astray. It is this kind of counter intuitive nature of probability that con artists and hucksters take advantage of.

Here is another example, similar to the Monty Hall problem, that people tend to get incorrect. I have 3 cards: one is red on both sides, one is black on both sides, and one is red on one side and black on the other. I take a card at random and place it on the table; it shows red.

What is the probability the other side is red?

I tell you: well, it can be the red/red card or the red/black card, so it’s 50/50, a fair bet. I bet red, you bet black.

In reality the odds are in my favor. In fact, the probability that the other side is red is 2/3. It could be the red/red card, it could be the red/red card flipped, or the red/black card. The flipped red/red card is a distinct possibility, but most people don’t intuitively catch that.
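This card game can be simulated the same way as the doors; a quick sketch (the card labels and trial count are my own choices):

```python
import random

# The three cards from the example above, as (side, side) pairs.
CARDS = [("red", "red"), ("black", "black"), ("red", "black")]

def prob_other_side_red(trials=200_000):
    """Estimate P(hidden side is red | visible side is red)."""
    shown_red = hidden_red = 0
    for _ in range(trials):
        up, down = random.choice(CARDS)
        if random.random() < 0.5:
            up, down = down, up        # lay the card down in a random orientation
        if up == "red":                # condition on seeing red face-up
            shown_red += 1
            hidden_red += (down == "red")
    return hidden_red / shown_red

print(f"{prob_other_side_red():.3f}")  # close to 0.667, not 0.5
```

The 2/3 falls out because a red face-up is twice as likely to have come from the red/red card as from the red/black card.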

@Imafish: unless you consider yourself unlucky

@KernelM: math major =/= genius. i’ll be the first to admit i didn’t get every math problem correct, nor did i graduate top of my class. all i was saying is that i understand math naturally, unlike english, and to me it was pure luck no matter how you slice it. if it wasn’t luck, then you’d be able to be correct 100% of the time.

@snoop-blog: Crazy mathematicians might ;-)

@LS1Andrew: Is that remaining 0.00001% chance reserved for the goat terrorist group that will bust in, kill everyone and burn the studio down to release the goats into the forest?

i understand if you played over and over again what would happen, but if you only have one shot at it, it’s pure luck.

@Dooley: This is where the person who opens the door having knowledge of where the prize is comes into play. If one door were randomly removed from the choosing, and nobody knew whether it had the car or the goat behind it, then yeah, your gut call would be just as good to stay as to switch; you wouldn’t have any statistical advantage either way. However, because someone with knowledge of the results is showing and then eliminating a “wrong” choice, that is what keeps it a scenario where you’re choosing 1 out of 3, and not a new scenario where you’re choosing 1 out of 2. Knowledge, as is so often the case, is the differentiating factor here.

The explanation is pretty simple. You have a 2/3 probability of picking a door with a goat behind it when you start the game.

Therefore, you should assume that you will have picked a loser door each time, because you have 2/3 probability of picking a loser door and only 1/3 probability of picking a winner.

The host will always reveal a door with a goat behind it.

If you picked a loser door and the host reveals the other loser door, then it follows you should switch to the remaining door because it must be the winning one.

This only works if you picked one of the losing doors. However, because you have a 2/3 chance of picking a losing door, you will, on average, win 2/3 of the time with this technique.

@snoop-blog: See my above post ;)

@Dooley: Note that Monty can only open either of two of the three doors: he will never open the one you reserved.

So if you pick a goat to start with, Monty will always reveal the other goat. By switching, you’ll win every time you picked a goat to start, i.e. 2 out of 3 times.

a rather common Bayes’ Law example

@Dooley: Actually, that other door does matter.

Let me give you an analogous situation, and see if you still think that.

Instead of playing Lets Make a Deal, I take out a deck of cards, a standard deck of 52, and I ask you to pick out the Ace of Spades from the deck. You choose a card.

Now that you’ve chosen a card, I show you 50 of the 52 cards, all not the Ace of Spades. And now I leave you with a proposition: stay with your card, or switch.

The probability on that decision isn’t anywhere near 50/50. In fact, it’s a near certainty (51/52) that the card I’m allowing you to switch to is the Ace of Spades. This is because the choice isn’t really “this card or that card”, but how confident you were in your initial guess. You chose that initial card out of 52, and your final choice is whether you think you chose the right card initially. In Monty Hall, your probability of being right is 1/3. In this game, it’s 1/52.
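The 52-card version checks out too. Since the host strips away every losing option except one, switching wins exactly when the initial pick was wrong, which makes the simulation almost trivial (the option counts below are just examples):

```python
import random

def switch_win_rate(n, trials=100_000):
    """With n options, a knowing host reveals all losers except one unchosen
    option, and the player always switches. Switching wins if and only if
    the first pick was wrong."""
    wins = 0
    for _ in range(trials):
        prize = random.randrange(n)
        pick = random.randrange(n)
        wins += (pick != prize)    # wrong first pick => the switch lands on the prize
    return wins / trials

for n in (3, 52):
    print(n, round(switch_win_rate(n), 3))  # near 2/3 and near 51/52
```

With 3 doors you win about 2/3 of the time by switching; with 52 cards, about 51/52.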

If you alter the game slightly, so that the person revealing the doors doesn’t know where the prize is, or if you start with 3 doors, show them a loser, and then let them pick, you’ll end up back at 50/50.
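That last claim is worth verifying, since it trips people up: if the host opens an unchosen door at random and just happens to show a goat, the switching advantage disappears. A sketch of that variant (the trial count is arbitrary):

```python
import random

def ignorant_host(trials=300_000):
    """The host opens a random unpicked door. Keep only the rounds where
    a goat happens to be revealed, matching the variant described above."""
    valid = stay_wins = switch_wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        opened = random.choice([d for d in range(3) if d != pick])
        if opened == car:
            continue               # car revealed by accident: round thrown out
        valid += 1
        stay_wins += (pick == car)
        other = next(d for d in range(3) if d not in (pick, opened))
        switch_wins += (other == car)
    return stay_wins / valid, switch_wins / valid

stay, switch = ignorant_host()
print(f"stay {stay:.3f}, switch {switch:.3f}")  # both near 0.5
```

Conditioning on the lucky goat reveal is what washes out the advantage; it is the host’s knowledge, not the open door itself, that makes switching worth 2/3 in the real game.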

But the choice in the Lets Make a Deal game isn’t Door 1, which you chose earlier, or Door 2, but is instead Door 1 or Not Door 1. Staying with your door is saying you think you made the right decision on your initial guess, which had a 33% probability that you were right.

It sounds counter-intuitive, I know, but run a simulation of it, and you’ll see it work out. You win by switching 2/3rds of the time, and win by staying 1/3rd of the time.

“i understand if you played over and over again what would happen, but if you only have one shot at it, it’s pure luck.”

Nope, each and every time you have a better chance to win by switching than by staying put. That does not mean you will win every time you switch, only that your chances double when you switch.

Of course the problem most of us have with this is that Monty never had 2 bad doors….maybe one bad door or two okay ones.

@larry_y: Pfffsssh. Bayes’ Law is soooo lame. I like totally can’t believe you brought that up.

a better chance does not equal win.

@JustAGuy2: well you could imagine the only game i play is craps. i’m actually really good at it.

@Mollyg: I’m with you.

Ah okay. So it seems like the problem reduces to a 50/50 chance, which is what it would be if you just entered the game after a door opens. But there’s still information in the system from your previous choice which affects the outcome, so you’re still betting on the 2/3 game.

Monty Hall, you are clever. Me, I am not clever.

@snoop-blog: This is a common psychological element that comes into play in these kinds of scenarios. People tend to act risky when one outcome has a positive benefit and the other outcome isn’t too negative. However, when people are presented with a very negative outcome and a neutral alternative outcome, they will tend to be risk averse and choose the higher-probability strategy.

For example instead of a car and a goat. I present you with a goat, vs being tortured horribly for the rest of your life. In this scenario, you would choose the higher probability option, since the negative outcome is very undesirable.

@snoop-blog: Craps is in fact a near-even game with regard to the odds. If you play correctly, your odds are about 49% vs 51% in favor of the house.

Blackjack is the only other game where it is possible to gain an advantage over the house by counting cards. Of course casinos don’t like to lose money and kick people like this out for ‘trespassing’.
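The rough “49% vs 51%” figure for craps can be worked out exactly for the pass-line bet; a sketch with exact fractions (the rules encoded here are the standard pass-line rules, not anything from this thread):

```python
from fractions import Fraction

# Ways to roll each total with two fair dice.
WAYS = {2: 1, 3: 2, 4: 3, 5: 4, 6: 5, 7: 6, 8: 5, 9: 4, 10: 3, 11: 2, 12: 1}

def p(total):
    return Fraction(WAYS[total], 36)

# Come-out roll: 7 or 11 wins immediately; 2, 3, 12 lose immediately.
win = p(7) + p(11)
# Any other total sets the "point", which must repeat before a 7 is rolled.
for point in (4, 5, 6, 8, 9, 10):
    win += p(point) * p(point) / (p(point) + p(7))

print(win, float(win))  # 244/495, about 0.493
```

So the player wins a pass-line bet 244/495 of the time, roughly the 49%-vs-51% split mentioned above.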

“a better chance does not equal win.”

Exactly. What it does mean is that you have a better chance of actually winning. To put it another way, if you go with your gut, you lose more often. That’s why this posting is entitled “Why You Fall For Dumb Things.” Going with your gut versus going with proven statistics in a game of chance is necessarily a “dumb thing.”

@satoru: But what if you’re into goats *and* torture? Is there a way to win both?

@Imafish: This is incorrect. In each individual instance of the game you have a 66% chance of winning if you switch. If you play more than once, the previous instances do not affect the subsequent probabilities.

This is different from, say, blackjack, where prior events have an impact on future probability.

@MrFalcon: You’ll note that in my example I did not specifically state WHICH outcome was negative. Merely that you would be more conservative in your choice because one outcome was negative :P

Thus if you are into goats and/or S&M you can take the probabilities any way you want :)

@satoru: my only point is that 33% vs 66% doesn’t mean the 33% is wrong. if you were on a game show faced with this game, how could you be sure it isn’t the first door, seeing how we don’t have a history to base any kind of odds on.

let me word this better: if you were betting on roulette, and you had no knowledge of what the previous spins were, it puts you at a disadvantage compared to the other tables that told you what the last 10 spins were.

if you witness 6-7 people before you change doors, and win, do you still go with your 66%, or think the odds are considerably in favor of the 33% now? does that make any sense?

@satoru: thank you!!!!!!!!!!!!!!

@Applekid: Yeah, I just didn’t want to be the one to say it.

As a side note, I have always wondered whether “Deal or No Deal” asks contestants if they have heard about the Monty Hall Problem. Would these people be disqualified from being contestants? Obviously they don’t want you to win money, just put on a good show for ratings. Thus it would theoretically be in their interest to remove rational individuals from the selection pool.

Obviously if you had say $1 million and $1 left in the game, you don’t want your contestant to go “OH I’m totally switching because I have a 29/30 chance of winning”.

Incidentally the game show continually perpetuates the incorrect probabilities on the show by stating in bold that “There is a 20% chance Contestant A has a $1 million case”. I think this is irresponsible and just further confuses people on how probability really works.

The trick to make it easier is not to work out the win states, but to work out the lose states. For some reason that makes people understand the odds better (this applies to other things as well, like craps, blackjack, etc.). Switching loses 1/3 of the time. Since there is no “tie” state, you can assume you will win the rest of the time. It seems better to switch in that light, even if you can’t work out the win states.

@snoop-blog: The roulette example is in fact what is referred to as the “Gambler’s Fallacy”. Every event in roulette is an individual probability, so having prior spin knowledge in no way influences future outcomes. Though the table loudly displays the previous spins, they are in fact totally irrelevant information. It is similar to how people think when I am flipping a coin and come up with a run of HHHHHHHHHH (10 heads). With that logic, you would think that tails is more likely on the next throw. In reality the previous 10 heads, while a very unlikely and peculiar event for sure, do not affect the next throw. Thus I can say that there is a 50% chance of getting heads on the next throw and 50% of getting tails.
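The coin-flip point is also easy to demonstrate: scan a long sequence of fair flips and look at what follows each run of heads (the run length and sequence length below are arbitrary choices):

```python
import random

def heads_after_run(k=5, n=1_000_000):
    """After every run of k consecutive heads in a long flip sequence,
    how often is the very next flip also heads?"""
    flips = [random.random() < 0.5 for _ in range(n)]
    streak = seen = heads = 0
    for i in range(n - 1):
        streak = streak + 1 if flips[i] else 0
        if streak >= k:
            seen += 1
            heads += flips[i + 1]
    return heads / seen

print(f"{heads_after_run():.3f}")  # near 0.5: the streak doesn't matter
```

No matter how long the streak you condition on, the next flip stays at 50/50, which is the whole content of the fallacy.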

As I mentioned before, the only casino game where previous rounds affect future probability is blackjack. There might be a few others, but I can’t think of any at the moment.

The Monty Hall problem surfaced on The Straight Dope years ago, and they basically hashed out all of the same arguments I see here. At the time, Marilyn Vos Savant(?) who is(was?) one of the smartest people in the world, answered this question saying you should switch, and people flipped out over it, saying she was wrong.

As I remember reading in the book of collected Straight Dope columns, there were about 2-3 columns worth that were dedicated to discussing this problem.

If I were to play that game, I would choose 2 doors, then say “open doors #1 and #3. You will now open both of the other doors, and if there is a car behind either of them I win.”

Monty Hall would say “No, you have to choose a single door” so I’d pick door #2. Then he’d open one of the other 2 doors, whichever doesn’t have a car. Then I would say “ha ha, I have tricked you Monty Hall, for I had no intention to choose door #2. I chose doors #1 and #3. You have opened one of them, knowing that there was no car. Now I shall tell you to ‘switch the door I picked’ but really I will have you open the second of the doors I chose. You see Mr. Hall, I am the one with the power, and if we’re going to play this game, we will play it by my rules, and I shall choose 2 doors for you to open.”

“This is incorrect. Each individual instance of the game you have a 66% chance of winning if you switch.”

I agree completely with that, and that’s exactly what I wrote: “each and every time you have a better chance to win by switching than to staying put”.

What that means is that each time you have a 66% chance to win if you switch, and that chance occurs every time. So the increased probability of winning by switching occurs whether you play one time or ten times. The amount of times you play simply does not matter.

“my only point is that 33% vs 66% doesn’t mean the 33% is wrong.”

You’re right, the 33% is not wrong, it is just less likely to be right.

Did Monty actually make you keep the donkey?

I would so own a donkey.

@satoru: yeah, the roulette was a bad example, but what about what i wrote right underneath that? are you saying seeing 7 people in a row switch doors and win is not going to sway your choice? in this game, eventually the 33% choice will have a better chance (obviously at least 1/3 of the time) of winning.

@mduser: Indeed, when the problem was first proposed there was a significant amount of discussion over it. At the time the problem itself wasn’t clearly defined, and there was some ambiguity as to when Monty would offer the switch and such, so people’s logic and conclusions were all over the map.

And despite being ‘solved’, whenever it appears it does generate a lot of talk and discussion. I think the extremely counter intuitive solution to the problem causes the confusion. But don’t feel bad! Many very very very smart people, who were math professors and such, got the answer wrong too.

@satoru: Maybe I’m wrong about how that show works, but isn’t it essentially random which cases get opened? I mean, isn’t it a straight 1 in 20 or whatever chance that you’ve picked the million dollar case?

Whether you take the cash or not is based on (I thought) a pretty straight mathematical calculation of what your expected average win is (perhaps modified a bit for TV excitement), so the decision isn’t so much “mathematically what should I do” as “how much risk can I tolerate.”

Or is there something I’m missing about the game?

@Michael Belisle: That is by far the best, most concise explanation I have ever heard for why you should always switch doors. Thank you Belisle – excellent work. You should teach for a living (if you don’t already).

now i’m not talking about the best odds, but trying to predict multiple outcomes. obviously if you were playing for money, winning 66% of the time is better than damn good, but could you possibly be right more than 66% of the time by occasionally staying with the first door?

@snoop-blog: “but could you possibly be right more than 66% of the time by occasionally staying with the first door?”

Possibly yes, but probably no.

@Michael Belisle: so was your apology just snark or was it sincere? i’ve never been talked to so nicely on here, so forgive me for my confusion.

@satoru:

Actually, Deal or No Deal does not exhibit this property, because the person choosing cases does not know the contents of the cases.

The Let’s Make a Deal problem requires the person who’s opening doors to know where the prize really is. Monty knows where that prize is, and he’s not going to show it to you. So, if it gets down to where you have the door you chose and the door that Monty left behind, it is probable that the leftover door is the winning door.

Deal or No Deal has the contestant opening cases, and because he doesn’t know where the money is, he may open the big prize in an early stage of the game. Since the contestant doesn’t know where the million dollars is, if it really does come down to two cases left, it really is 50/50 on which case contains a million.

@snoop-blog: when monty opens an incorrect door, switching your decision becomes an inversion of states. there is one right door and one wrong door, so if you have a right door, switching will yield you a wrong door and vice versa.

so simplifying the switch operation to an inversion, consider your initial odds of being wrong and of being right, 2/3 and 1/3, respectively. now switch states.

@snoop-blog: Seeing 7 people win by switching doors would not sway me. I am of course assuming the game is fair and that they aren’t switching things behind the doors I cannot see. If the game is crooked then no matter what I am in a losing position. In this type of scenario it is more important to keep calm and simply play the probability. The recent movie “21” is a bit flamboyant, but the base premise is that you stay calm, play the odds and you will win in the long run. If you are counting cards in black jack, you might not win a particular hand, but over many hands you are going to come out ahead.

Obviously in these kinds of games, there is a huge psychological element to them. In fact casinos and such take advantage of this to suck more money out of you!

One thing is that in a casino, reward is loudly proclaimed and advertised, whereas negative outcomes are silent. Thus you only hear and see the positive outcomes, and incorrectly assume that lots of people are winning. Another thing: the slot machines near the main walkways are actually set to give out slightly more money than machines that are not. Thus as you walk through a casino, you see people winning money. This positively reinforces the notion that since lots of people are winning, I should win too! Then you sit down at a slot and lose all your cash :P

so if staying with the first door occasionally could increase your odds beyond 66% (and let’s say it’s important to win more than 66% of the time), when would you stay? after how many people switched and were correct? 1 out of ____? that’s where i’m going with this.

@satoru:

Actually, Deal or No Deal (let’s call it Howie’s Dilemma) is a different case. The reason is that instead of Monty selecting the door, you are the one selecting the cases, and thus the probabilities are different.

Example, let’s play Deal or No Deal like Let’s Make a Deal. In that case, you select a door to remain closed, and then select another door to open.

That second door you select to reveal has a 1/3 chance of containing the car. In the case where Monty selects the door, it has a 0% chance of revealing the car.

So, if you’ve eliminated all cases except for $1 and $1 million, you do have a 50/50 chance of each value being in your case.
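This distinction can be tested too. Below, a contestant keeps one of 26 cases (the count on the US show; the trial count is arbitrary) and opens the rest at random; we look only at the games where the top prize survives to the final two cases:

```python
import random

def endgame(cases=26, trials=200_000):
    """Contestant opens unchosen cases at random. Condition on the top prize
    surviving to the last two cases, then ask how often it's in the held case."""
    reached = held_has_prize = 0
    for _ in range(trials):
        prize = random.randrange(cases)
        held = random.randrange(cases)
        others = [c for c in range(cases) if c != held]
        random.shuffle(others)         # random order of case openings
        if prize in others[:-1]:
            continue                   # top prize opened early: no 50/50 endgame
        reached += 1
        held_has_prize += (prize == held)
    return reached / trials, held_has_prize / reached

reach, p_held = endgame()
print(f"reach endgame {reach:.3f}, prize in held case {p_held:.3f}")
```

The endgame itself is rare (about 2 in 26 games), but within it the held case has the prize about half the time. Unlike with Monty, there is no knowing host filtering the reveals, so no 29/30 advantage survives.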

WTF…NOW I get it without having to repeat it 30 times and looking at columns of wins and losses.

I’d never seen the show, and the asshole (he must have been, because he didn’t get straight to the key factor in this) insisting I was wrong some years back (when I was indeed wrong) never mentioned the “ALWAYS OPENS A DOOR TO REVEAL A GOAT” part.

That makes it pretty obvious.

@thefncrow: I was thinking in a more hypothetical situation in the final end game. Let us assume I have chosen case 1, and magically burned through all the other 28 cases. Now on the board I have $1 or $1 million left as possible outcomes. While yes I chose the intervening cases, this is irrelevant. I am left now with a ‘goat’ and the ‘car’. The reality is that the probability I picked the $1 million case initially is still only 1/30. Therefore it is better for me to switch cases for the 29/30 odds.

I actually don’t know if this is how the game works in principle. Do they give you a switch option at the end? I’ve never seen it get to that point with any of the few shows I have watched.

@snoop-blog: ???

out of people who switched, 2/3 of them should be right. this says nothing of whether or not future people will switch

@snoop-blog: If you only have one shot at it, then your best chance is to take the action that has the higher probability of success. That is the whole point of probability.

Of course, you will still succeed or fail on luck.

@tizzed: I really don’t think there is a difference in the scenarios. Only in that obviously I might accidentally choose the $1 million case to be eliminated, at which point my odds become 0%.

But let’s assume you, in a fairly unlikely event, make it to the end with $1 and $1 million; we are still in the same scenario as the Monty Hall problem. Just as revealing the goats does not change the initial probability in the Monty Hall problem, choosing specific doors yourself doesn’t change the initial probability either. I will note that this applies in this VERY SPECIFIC END GAME SCENARIO ONLY.

look, think of it this way. instead of trying to pick the car, try to pick the goats. you’re twice as likely to have picked a goat during your initial pick.

that’s where the 2/3 probability is. the host always reveals a goat. he won’t reveal the car. think about it. that means both goats are “revealed”. switching gets you the car.

truthfully, i didn’t really “get it” either. what a stupid stats problem. don’t gamble. go pay a bill. seriously, log into your online banking and pay a bill.

I played 20 times, 10 of each. I won 50% of the time I switched, but only 30% of the time I stayed.

(And I wanted to try the new image commenting thingee.)

When Casinos put in the boards showing the previous results, they saw their roulette income rise dramatically, because people saw a string of black and thought either “bet on black, it’s hot” or “bet on red, it’s overdue”. While the odds of 20 Blacks in a row are 1 in 1,048,576, if you see 19 Blacks in a row, a 20th Black is still 50/50. (This doesn’t count 0 and 00, for easier math; remember that 1 out of 19 times the house wins all red/black bets.)

All Casinos know that past results have no effect on future outcomes. In the Monty Hall problem, just because losing by switching is 1 in 3, it never becomes ‘due’. If you played the game 99 times, you would expect 66 wins and 33 losses, but even if you had won 98 times by switching, you should still switch the 99th time; you’ll win 66% of the time.
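That 66% claim is easy to sanity-check. Here’s a quick Monte Carlo sketch in Python (my own illustration, not from the Times piece); the door indices and trial count are arbitrary:

```python
import random

def play(switch, trials=100_000):
    """Simulate the Monty Hall game; the host always opens a goat door."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        # Host opens a door that is neither the player's pick nor the car
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            # Move to the one remaining unopened door
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += pick == car
    return wins / trials

print(f"switch: {play(True):.3f}   stay: {play(False):.3f}")
```

Run it and the switch rate hovers near 0.667 while staying hovers near 0.333, exactly the 66%/33% split described above.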

@satoru: The Monty Hall Problem does not extend to Deal or No Deal because the cases are opened at random and the host does not know what’s in them. Deal or No Deal is much more of an “Expected Value” game. Expected Value = total amount of dollars in all of the cases divided by the number of cases left. If the deal they offer you is higher than the expected value, you should take the deal. In the earlier rounds, the deal the banker offers is usually only 50-80% of the expected value. This discourages people from taking the deal and extends the game to make it more exciting. The deal approaches the expected value as the game progresses (an exception is when a big amount gets knocked out, and the banker usually offers way less than the expected value).
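The expected-value rule this comment describes fits in a few lines of Python; the board amounts and the $250,000 offer below are hypothetical, just to show the comparison:

```python
def expected_value(amounts):
    """Mean of the dollar amounts still in play."""
    return sum(amounts) / len(amounts)

# Hypothetical board: five amounts left, banker offers $250,000
remaining = [1, 100, 10_000, 500_000, 1_000_000]
offer = 250_000

ev = expected_value(remaining)
print(f"expected value = ${ev:,.2f}")          # $302,020.20
print("take the deal" if offer > ev else "no deal")
```

Here the offer is below the expected value, so by the rule above you’d keep playing; an early-round banker offering 50-80% of EV is engineering exactly this outcome.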

@monkeyboy13: Interesting, I did not know that the casinos saw a revenue increase when they put those roulette boards up. I suppose this is logical, since everything in a casino is purposefully placed to maximize the amount of money they suck out of people. Playing into players’ instinctive urge to follow the Gambler’s Fallacy is a good way to increase revenues.

@snoop-blog: Actually knowledge of previous spins in roulette does not help at all since each spin is statistically independent ([en.wikipedia.org]). In fact, psychologically speaking, knowledge of previous spins would probably be a disadvantage since it may bias you to (incorrectly) favor some bet when there is no mathematically justifiable reason to do so (basically what this article is trying to say).

@jeff303: if you read all of my comments (which there are a lot) then you will see i admitted the roulette was a bad example.

Just watch the movie “21”, Spacey explains it.

I already knew about it from High School Physics anyway, so it’s fun to see everyone cry about how it doesn’t work when they don’t have a clue!

@snoop-blog: Yes, there is still a possibility that if you stay, you will win. Which means, out of 100 games, it is possible to win 100% of them if you happen to stay at all the right times.

But, mathematically, there’s no way to know which of the times you should stay and which times you should switch. You would need some extra information about the situation that isn’t given. In this problem, everything is assumed to be random, so there is no extra information. On a game show, you could try to glean some psychological information like you’re proposing, but you’d be better off if you just assume they randomized things as well, because it would be to their benefit to randomize it once they’ve fixed the rules, even if the same outcome arises ten times in a row.

Therefore, all you have to go on is the chance the prize is behind your first choice or behind the other door. That’s a simple 33% and 66%, respectively, so your only choice should ever be to switch, because there’s a higher chance of winning that way.

Um. First, I played the game about six times. I switched for half and stayed for half, and I got a goat every single time.

Then I read the explanation. Then I played seven more times, switched every time, and got a car every time.

Not a huge number of trials, but is their randomization a little off?

@chrisjames:

said: But, mathematically, there’s no way to know which of the times you should stay and which times you should switch. You would need some extra information about the situation that isn’t given. In this problem, everything is assumed to be random, so there is no extra information. On a game show, you could try to glean some psychological information like you’re proposing, but you’d be better off if you just assume they randomized things as well, because it would be to their benefit to randomize it once they’ve fixed the rules, even if the same outcome arises ten times in a row.

you win. you just proved my point. there’s no way of telling when to stay and when not to. while it’s not probable for your stay to be correct, say, 100 times in a row, it is still possible. so really, it’s a crapshoot.

@tweemo: It’s possible that it’s rigged (like this game: [www.guessyournumber.com] ) but why would they do that?

You should always switch; No matter what. You win 66% of the time that way. It’s not worth it to try to be right the other 1/3 of the time. Case closed.

@snoop-blog:

so was your apology just snark or was that sincere?

A little of both. I didn’t fully understand your post when I said “A winner is you!”, so shame on me. But I’m still pretty sure it’s wrong.

All in all, it’s zero sum here.

@tweemo:

Not a huge number of trials, but is their randomization a little off?

I had the same problem. I gave up after thirty tries or so, where my odds were around 85% when I switched and 15% when I didn’t.

Somebody’ll have to click a few thousand times to verify. Any volunteers?

“attacking commenters on here is a troll move. you notice how @MrFalcon: managed to show a little class, something you’ll never have.”

Would it be correct to call that an ironic statement, or just hypocritical?

Oh jesus christ, apparently the preview checkbox doesn’t work for me. Sorry about the comment spam.

This is one of those things that will get people to argue forever. Unlike most things with that property, here, one side is absolutely, provably, wrong.

Regardless of your 1/3 chance of picking the correct door at the beginning, by showing you one door, you know it has the booby prize (goat or canned squid).

But now you are being asked, “Do you want to keep your original choice, or go with the other door?”

That is the same as showing two doors, knowing that one door has the prize, and saying, “Which of these two doors do you want?”

You can keep your original door, or pick the other door. Two choices.

Both, at THIS point in the game, have a 50/50 chance of containing the prize.

Hey, I’m switching teams here.

I understand it now, and I must agree… You should switch. You have a 2/3 chance of picking the wrong door. Sure, now that there are two doors, there is a 50/50 chance that the car is behind either of them, but your original choice is most likely wrong, and all he did was eliminate the other “wrong” choice, so by switching, you pick the “right” choice — That is, 2 out of 3 times.

I think I just blew a gasket.

@snoop-blog:

Man, I think that you should give up the ghost on this one and say you were wrong… You’re digging yourself a bigger and bigger hole by continuing to argue that it’s somehow NOT wiser to switch. It IS wiser to switch. That’s the point that’s being made, regardless of what you try to say about it still being “chance.” Yes, it’s chance, it always was, but if you had the choice of picking 1 winner out of 100 or 1 winner out of 3, which would you pick? Or would you say that it doesn’t matter because they’re both chance? OBVIOUSLY staying and switching are both chance occurrences, BUT switching yields statistically higher results, independent of past or future runs.

Give it up man. Eat your crow now while it’s still bite-size.

@snoop-blog: As a guy looking to hire a math major, you should give me your name so I know not to hire you.

@hypnotik_jello: Certainly failed manners class.

@snoop-blog: Yup. It’s a crapshoot, alright. But it’s a crapshoot with much better odds if you switch doors when offered the chance. Given that you’re a math major, I’m not quite sure I understand why you’re arguing against the probabilities unless it’s just to play devil’s advocate…?

@BloggyMcBlogBlog: It still does apply to Deal or No Deal, but not exactly in the same way. Because you don’t know what value is in each case, you don’t have a 100% chance of not knocking out the grand prize. But this doesn’t matter because the grand prize isn’t the only prize of value. If DoND had only one $1 million case and the rest were all for a penny, you’d be right. However, even if you knock out the $1 million case, you still have the $750,000, $500,000, etc.

Now, with that out of the way, when you play the game, the probability of you knocking out a specific value on the board is always 1/N, where “N” is the number of values on the board. So, with just your case and one other unopened case, the probability of the unopened case being the highest value left is 50% (1/2). However, the probability of your case being the highest value left is still the same as when you initially selected it: 1/26. Unless it sucks, you should take the deal.

This may sound stupid, but I’d rather have the goat than the car. Goats get better mileage than my car or my tractor.

@satoru:

Okay, let’s break this down.

What are the chances that you will get down to 2 cases: 1$ and 1M$? This happens in two scenarios:

1) You select the 1M$ case, and then select the other cases, leaving the 1$ case unopened. The chances of that happening are 1/30 * 28/29 * 27/28 * 26/27….* 1/2

That’s 1/30 (chance of picking the 1M$ case) times 28/29 (choosing a case other than 1$…so on and so on)

Note the numerators and denominators cancel each other out, and you’re left with 1/(30*29).

Scenario 2 : You choose the 1$ case, and avoid the 1M$ case.

The chances of this are 1/30*28/29*27/28…*1/2 = 1/(30*29)

Exactly the same, so given that there are 2 cases left, and 1 M$ in one case and 1$ in another case, there is a 50/50 chance of it being in your case.

In other words, you are unlikely to pick the 1M$ case, but once it is chosen, there is no chance (i.e. nil) that you will reveal it upon opening cases. Conversely, it is likely that you will not choose the 1M$ case, but then unlikely that you will avoid revealing it when opening up the other cases. These likely/unlikely scenarios have the exact same chance of occurring.

This is how it differs from Monty’s dilemma: there is a chance you will never get to the endgame choice, because the big prize may be revealed along the way, whereas in Monty’s dilemma it never is.
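Maged’s telescoping argument can also be checked by brute force. This Python sketch (my own, using a hypothetical 30-case board where case 0 holds the 1M$) opens cases blindly and measures how often the player’s own case holds the million, conditional on it surviving to the final two:

```python
import random

def dond_endgame(trials=300_000):
    """P(your case holds the 1M$) given the 1M$ survives to the final two.
    30 cases; case 0 is the 1M$ case; all openings are blind/random."""
    reached = held = 0
    for _ in range(trials):
        pick = random.randrange(30)
        others = [c for c in range(30) if c != pick]
        random.shuffle(others)
        opened = others[:-1]          # 28 cases opened at random
        if 0 not in opened:           # the 1M$ was never revealed
            reached += 1
            held += pick == 0
    return held / reached

print(dond_endgame())  # hovers around 0.5, the 50/50 Maged derives
```

Running the same simulation with a host who knowingly avoids case 0 would push the answer back to 1/30 vs. 29/30, which is exactly the difference between Deal or No Deal and Monty Hall.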

I’ve never understood why people have such a hard time understanding this puzzle.

Look – instead of three doors, imagine it’s a deck of cards. The ace of spades is the only winner. You pick one at random. Before you look at it, I reveal all but one of the remaining cards to be losers. Do you really think your odds are 50/50? They’re not.

Look at it this way. When you pick a card, it is 1/52 that you got the ace. It is 51/52 that the winner is in the other pile. No matter what I do to that second pile – writing on them with sharpies, tearing them in half, or turning most of them over to show their faces – the odds remain 51/52 that the winner is in that pile.

So now you have two piles – your pile, with a 1/52 winning chance, and the other pile with a 51/52 chance of having the winner and only one single card still unrevealed. The choice is obvious.

When you pick a door, you have a 1 in 3 chance of it being the car. Nothing that happens afterward can change that fact. So no matter what happens afterward, you still have a 1/3 chance of having picked the winner. Since there is only one other door at the moment of choice, it must have a 2/3 chance of winning.

In the traditional Monty Hall puzzle – 3 doors, 1 car, he always reveals 1 loser after you pick – switching is ALWAYS the better strategy.
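The deck-of-cards intuition generalizes neatly: if the host reveals every loser but one, switching wins exactly when your first pick was wrong. A short Python sketch (my own, not from the comment) of that shortcut:

```python
import random

def switch_win_rate(n_doors, trials=50_000):
    """Host opens every losing door except one; player always switches.
    Switching then wins exactly when the first pick was wrong."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(n_doors)
        pick = random.randrange(n_doors)
        wins += pick != car   # switching lands on the car iff the pick was wrong
    return wins / trials

print(switch_win_rate(3))    # near 2/3
print(switch_win_rate(52))   # near 51/52
```

With 52 "doors" the switcher wins about 98% of the time, which is why the card-deck version makes the 50/50 intuition so obviously wrong.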


@tweemo: “Not a huge number of trials, but is their randomization a little off?”

No, the randomization is fine, but your sample size is off. 13 trials is insufficient. Run it for a few hundred times with each strategy and you’ll get results more in line with the expected returns.
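To see why 13 trials isn’t enough, here’s a rough Python sketch (mine; the batch sizes are arbitrary) showing how the spread of measured switch-win rates shrinks as the trial count grows:

```python
import random
import statistics

def run(trials):
    # Switching wins whenever the initial pick was a goat: probability 2/3
    return sum(random.random() < 2/3 for _ in range(trials)) / trials

# Standard deviation of the measured win rate across 500 repeated batches
for n in (13, 100, 10_000):
    rates = [run(n) for _ in range(500)]
    print(n, round(statistics.stdev(rates), 3))
```

At 13 trials the measured rate routinely swings by 10-15 points either way (easily producing tweemo’s goat streak), while at 10,000 trials it pins down to within a fraction of a percent of 2/3.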

I made the mistake of assuming that it was random. In most computer-controlled games, you’d end up with a 50/50 chance on the last one.

So really this is just a simulation of a rigged contest. Boring.

@Maged: Your numbers are off… At the end of the game, there can’t be a 50% chance the other case holds the million but only a 1-in-26 (~4%) chance that yours does. You haven’t accounted for the other 46%.

Deal or No Deal doesn’t follow the Monty Hall paradigm at all because the person opening cases doesn’t have the knowledge required to pick the “goats.” After every choice the contestant makes, all the probabilities in the system reset.

If Howie somehow knew where the million was, and had to choose cases that DIDN’T hold the million dollars, it would be just like the Let’s Make a Deal problem. It’s not, though.

At the beginning of the game, you have a 1-in-26 chance of holding the million dollars. As cases are eliminated (as long as the million isn’t revealed), your odds increase until, at the end, if there are 2 cases left and $1m still in play, you have a 50% chance.

@satoru: You’re right, theoretically. A possibility remains that there’s an underlying reason why you flipped 10 heads in a row that makes the outcome actually not what it would be theoretically, like having a weighted trick coin or something.

Basically all this thing is saying is, “Do you want to pick two of the doors or just one?” Because if you switch you’ve essentially picked two of the doors. If you stay, you’ve picked one. For those that can’t understand this, I feel sorry for you.