Deal or No Deal: A Statistical Deal

Deal or No Deal is clearly America’s newest game show fetish. Hell, if it can bring Howie Mandel back from the throes of irrelevance, then it’s obviously captured the fascination of the public. Regardless, I have to add my name to the list of people who are hooked on this new TV chew toy, but I’m convinced that my love for the show is reflective of some kind of masochistic tendencies that I must be suppressing.

Let’s face it: it’s pure agony watching someone turn down hundreds of thousands of dollars in hopes that they’ll make more, only to watch them get burned in the end and settle for far less than the amount that they already turned down. Despite that, playing the risk/reward game with such large sums of money is enough to catch anybody’s attention, and I think people in general are attracted to games of this nature (Vegas, anyone?).

Last night’s show was extremely interesting because one of the contestants appeared to make the biggest bonehead move imaginable given her situation. Here was her scenario:

She had 5 suitcases left that contained the following amounts: $100, $400, $1,000, $50,000, and $300,000, and the banker offered her $80,000 to quit. Deal or no deal, miss thang?

Unbelievably (to me, at least), this dumb woman, at the behest of her equally dumb husband, uttered the words the audience was just dying to hear: no deal. In a beautifully ironic fashion, the very next suitcase she opened contained $300,000, thus reinforcing the idiocy of her severely misguided decision. And of course, the audience offered up the oh-so-sympathetic and stereotypical “Ohhhhhhhhhhhhhhhhhhhhhhh.”

Don’t get me started on the studio audience…They remind me of a bunch of yahoos standing around a keg chanting “Drink! Drink! Drink!” while some socially inept reject tries to pour as much beer down his gullet as possible in order to attain five minutes of superstar status.

A closer look at the game within the game

While watching last night’s show, I had this intrinsic feeling that the woman had just made a hideous decision, but at the time, I hadn’t come up with the mathematical reasoning to back my feelings. It was one of those gut instinct things that you just kind of know…ya know? This morning, however, I figured I may as well look into the math a little more, just to see if my opinion was correct.

Oh, and I sincerely hope that one of you makes it on the show someday so that you can use my “Deal or No Deal Strategy Guide” to your advantage. Anyway, on with the numbers…

If you consider only the suitcases in the scenario outlined above, then there is an 80% chance you’ll walk away with at least $30,000 less than the offer that’s currently on the table. Then again, you always have the banker in there as the x-factor, so that argument represents a worst case scenario. For me, this isn’t quite exacting enough, so now we need to look for a more defined strategy.

Enter probability theory, and more specifically, the value that you can expect to earn based on the number of remaining suitcases and their associated dollar amounts. Not surprisingly, this is astutely named – wait for it – the expected value.

At the beginning of Deal or No Deal, the contestant is presented with 26 suitcases containing the standard amounts ($0.01 up through $1,000,000), and the expected value can be calculated from the following equation:

expected value = (sum of the remaining case amounts) ÷ (number of remaining cases)

If no cases have been opened, this value computes to approximately $131,477.54.
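For the curious, that starting figure is easy to verify. Here’s a minimal Python sketch (the case list is the standard US board; the function name is my own):

```python
# The 26 amounts on the standard US Deal or No Deal board.
CASES = [0.01, 1, 5, 10, 25, 50, 75, 100, 200, 300, 400, 500, 750,
         1000, 5000, 10000, 25000, 50000, 75000, 100000, 200000,
         300000, 400000, 500000, 750000, 1000000]

def expected_value(cases):
    """Expected value = arithmetic mean of the unopened case amounts."""
    return sum(cases) / len(cases)

print(round(expected_value(CASES), 2))  # -> 131477.54
```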

You mathematicians out there will already have noticed that the expected value for Deal or No Deal is simply the arithmetic mean – that is, the average dollar amount remaining in the cases. Risk aside, accepting a “deal” for less than the mean should generally be regarded as a gutless, weak decision, and the contestant should be ridiculed accordingly. However, late in the game, if a savvy contestant were to wrangle an amount out of the banker that is greater than the mean, then he or she ought to be carted off stage like Mike Ditka after the ’85 Super Bowl.

Okay, so maybe that’s a bit of an exaggeration, but I invite you to look at things in a purely mathematical light here. If you don’t consider “luck” to be of any help to you (and you shouldn’t – although I see you over there with that scratch-off lottery ticket!), then when you begin the game, your goal ought to be to “beat the mean.” Obviously, the mean changes as suitcases are removed, but regardless of the mean at any given time, your goal should remain the same: beat the mean.

Let’s say that you got unlucky and blew off the 13 most valuable cases on your first 13 suitcase removals. It should be abundantly clear at this point that you’re not going to walk out with a wad of cash, but you should still expect about $185.85, which is the mean of the remaining cases. If you made it to this particular point in the game and the banker were to offer you $200, then in my less-than-humble opinion, you’d be an idiot not to take it. If you prefer facts to my freewheeling opinions, then try this on: it would be a statistical mistake not to accept this offer.

So, with this in mind, let’s revisit our woman from the scenario above. Remember her? She’s got 5 cases left that contain the following amounts: $100, $400, $1000, $50,000, and $300,000. The banker has offered her a cool $80,000 to get the hell off stage and leave in such a way as to epitomize the phrase “ignorance is bliss.”

A quick calculation reveals that the mean of the remaining amounts is $70,300. The banker’s offer is $80,000, which represents roughly a 13.8% increase over the mean. This isn’t Wall Street, but SELL SELL SELL!

Look a little closer at the reality of the situation. 80% of the remaining briefcases have at least $30,000 less than the banker’s proposed amount in them. The only guaranteed way she can do better than the proposed amount is to actually be holding the $300,000 case, because if she were to remove more cases and reveal amounts less than $300,000, the banker’s offer would likely go up to compensate for the increasing mean. Keep in mind, however, that there’s only a 20% probability of this happening! Oh, also keep in mind that the $80,000 offer is guaranteed. That’s cash money, and all bets are off! 100% chance of success…going once…going twice…Ah, forget it.
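If you want to check those figures yourself, here’s a quick Python sketch of the scenario (variable names are mine; the amounts and the offer come straight from the show):

```python
# The five remaining amounts and the banker's offer from the scenario above.
remaining = [100, 400, 1000, 50000, 300000]
offer = 80000

mean = sum(remaining) / len(remaining)         # expected value of playing on
premium = offer / mean - 1                     # how far the offer beats the mean
worse = sum(amt < offer for amt in remaining)  # cases that pay less than the offer

print(f"mean = ${mean:,.0f}")                  # mean = $70,300
print(f"offer premium = {premium:.1%}")        # offer premium = 13.8%
print(f"{worse} of {len(remaining)} cases pay less than the offer")  # 4 of 5
```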

After a closer look at the numbers behind the game, it’s clear that the contestant in last night’s game made a decision that was not supported by statistical analysis. Instead of the best case scenario, which she was betting on, she actually got the worst case scenario because the next case she opened contained the $300,000. Ouch.

I haven’t mentioned it prior to now, but after that little slap in the face, the banker came back with a gaudy offer of $21,000, which was 63.1% greater than the new mean of $12,875. Looking back at this, I almost wonder if the banker had simply resorted to toying with this poor woman based on her previous decisions. Either way, the lady quickly wised up and accepted the overly generous $21,000, which represented a $110,477.54 loss against the starting mean of the game.

Statistically speaking, fewer than 25% of the contestants on the show can expect to perform worse than this lady, but then again, I guess now she doesn’t look so bad. (This comes via standard deviation analysis, which I didn’t want to bore you with.)

I know it’s a game show, and I know hundreds of thousands of dollars is a lot of money, and I know how thrilling it is to go for the gusto. While statistical analysis does not really provide for “gut feeling” or “made for TV drama,” it can serve as a guide for telling you how well you can reasonably expect to do given a set of variables.

If you’ve ever been to a casino, then you know it’s never a good idea to bet solely on the best case scenario like the lady in our example. And as far as Deal or No Deal is concerned, I highly recommend that you never bet on the best “suitcase” scenario!


Comments

Jesse March 30, 2006

Excellent post. I’ve been wondering about the math going on behind the show, but I haven’t bothered to do the figuring like you have. Very interesting


Jesse March 30, 2006

Just played the game online. I settled for 52,000 and would have gotten 300,000 had I gone through. So hard to know


Chris P. March 30, 2006

Hey, I just played it twice myself. The first time, I beat the mean but settled for $178 (I had an uncanny knack for picking all the money cases).

The second time, I refused to budge on anything less than the mean, and the banker never offered more than the mean. I eventually got down to two cases: $400K and $200K, and the dealer offered me $213,000. Given the odds, I absolutely risked the $13,000 in hopes of winning $200,000 more. Unfortunately, I was holding the $200,000 case, so that’s what I walked away with. Statistically, though, I derived maximum value from the game, so I was happy :)


Jesse March 30, 2006

I just played again and accepted 259,000.

I had the 750,000 suitcase.

Despite knowing all the statistics behind it, you still have to decide when to take the sure deal over the gamble on the upside. I can’t even imagine doing it in real life


Chris P. March 30, 2006

How many cases were left when you settled for $259,000? It looks to me like $259,000, although a lot of money, was less than the mean at that point in the game!


Jesse March 30, 2006

There was one with 200, one with 200,000, and one with 750,000. Looking back I should have said no, because the mean was about 316,733, but I didn’t really give it much thought.

And in the end, though these statistics are nice, you can’t exactly stop after every pick and pull out a calculator to figure out what your next move should be


Chris P. March 30, 2006

True, I began using approximations…You had roughly $950K and three cases.

$950K divided by 3 is a little greater than $315K, so based on that analysis, $259,000 is a gross underestimation of the mean (20%).

On the other hand, you made $259K! Score!


Jesse March 30, 2006

Yeah, I think you have to consider not only the relation to the current mean, but also the relation to the starting mean. You figured that the starting mean is $131,477.54. That means the 259,000 was almost double what I started with, without doing any work. And consider that there was a 2-in-3 chance I would end up with something less, and a 1-in-3 chance I’d go home relatively empty-handed.

btw, your website isn’t remembering my personal info when I check the box…


Chris P. March 30, 2006

Yeah, it’s still buggy. I’ll just add it to the ole’ Ta-Da list…

Item #2438.b.12.



Jeremy March 30, 2006

I made 302k.


If only it was real money.


Dennis March 30, 2006

Cool overview – I hit you with a Digg on that one!


Pete March 30, 2006

Just curious, as I’m not a math whiz here, but in the example you gave, using the mean is a way to account for all of the remaining (5) cases. Shouldn’t there be another variable in the equation that calculates the odds of the next pick being above or below the mean? With 5 left, there was only a 1-in-5 chance that the person would pick $300,000 (which she did). Usually at that point in the show there’s only one suitcase to pick before another offer, so aren’t the odds in her favor at this point that she’ll pick a suitcase below the $80,000 offer and therefore increase the next offer from the banker?

Either way, good article and something I’ve been curious about myself.


Chris P. March 30, 2006

Pete, it’s funny you should mention that, because I had actually written an aside in anticipation of somebody mentioning exactly what you said. I actually edited the paragraph containing the aside so that it mentions the fact that the $80,000 offer from the banker is guaranteed, whereas playing the game further requires more risk (and possibly getting less than the mean, or in this case, $59,000 less than the previous offer).

Yes, it’s true that the woman had an 80% chance of eliminating a case of lesser value, but she was effectively extending her risk in the face of a guaranteed reward that was 13.8% greater than the mean at the time. Statistically, the proper choice would have been to hedge the risk and go with the $80,000, but then again, I guess that’s kind of a matter of opinion! :)


Pete March 30, 2006

Excellent point. Maybe this explains why I never win at Vegas. :)


Ryan March 31, 2006

I played and won 101,000. If “winning hypothetically” wasn’t enough, this guy I know from college DID win $99,000 on the show! Is it too late to send in that audition tape? Ugh!


Renee April 1, 2006

I tried to understand the show some last night, but it did boggle my mind a little. Glad you explained the odds a little so I could really appreciate what the contestants gained/lost.


Chris P. April 1, 2006

Last night’s show was just about as good an example of a “statistical showdown” as DOND could possibly present.

The contestant managed to get down to three remaining cases: $50,000, $100,000, and $750,000. This put the mean value at $300,000, and the banker, after a series of woefully inadequate offers (offers that were at least 40% less than the mean – boo!), finally came through with an offer of $252,000.

Now, if you were to adhere strictly to the rule that says “anything less than the mean should be rejected,” then you can’t accept the $252,000 offer. I’ll be the first to admit, however, that this is reality; these are real people; and $252,000 is a decent chunk of money.

With three cases remaining, there was a 66% chance that the contestant’s case contained at least $152,000 less than the current offer. However, there was also a 66% chance that the mean would improve if he were to select another case! If nothing else, the paradox here is enough to drive you nuts.

Although the offer of $252,000 was 16% less than the mean at the time, the contestant said “deal,” thus ending the game. Perhaps the thought of risking at least $152,000 for a *potential* reward was more than the contestant could bear, or at least $252,000 was enough to send him home happy (understandably so!).

Regardless, the good folks at DOND always play out the game to see what would have happened, so we got to see what the contestant’s fate would have been had he continued. Here’s how it went down:

The next case he selected contained $50,000, which represented his best case scenario (although the $100,000 case wouldn’t have been that much worse – think of it as an insurance policy). The resulting offer from the banker was $498,000, which eclipsed the mean by a whopping 17%.

At this point in the game, I would have accepted the offer in spite of the fact that the $750,000 carrot was still dangling out there on a stick. Why? Because I would have accomplished the goal of the game, which is simply to beat the mean. It is important to remember that the game is about the mean and not about the numbers on the board!

This was TV, though, so at this point, Howie had the contestant open up his case to see what was inside. Amazingly, the contestant’s case held $750,000!

I’m sure he probably felt positively sick upon seeing this, but he should keep in mind that $252,000 puts him in the top 20% or so of people who play the game (assuming enough games were played to make up a good statistical sample). He still beat the starting mean of the game, which was roughly $131,000. He should feel extremely good about that.

On a final interesting note, I’d like to add that when the contestant had 3 cases remaining, he was basically in a position of risking at least $152,000. Based on the banker’s next pseudo-offer, he stood to gain $198,000. In my opinion, risking that much money is too big of a gamble, even if the odds of improving are 2 in 3. Like I said earlier, this is the real world, and $252K goes much further than $100K or less! Plus, the guy had four young kids…what would you do?


Tim Barrett April 4, 2006

Nice blog. I found it on today.

I love the math angle of DOND. I’m hooked, even though Howie has a soul patch (which is just plain wrong). I played the online version and settled for $305,475 with 3 cases left (200, 750, & 1,000,000). I was holding the $200 case. Nice deal. :-)


Chris P. April 4, 2006

Bad deal! You had an amazing insurance policy with those huge numbers out there, and you were worth over $666,000 based on the mean! I think you sold yourself short. Then again, you’re $300,000 e-dollars richer, so what do I know?


J.Lessard April 11, 2006

I’m a high school AP Computer Science student who is absolutely hooked on this game. My teacher (also the AP Stats teacher) is requiring us to make an advanced program using Java. I’ve decided to make a version of Deal or No Deal. Problem is, I cannot come up with a model or equation to estimate the banker’s offer. This has been plaguing me for the better part of a semester.

Any suggestions or thoughts you’ve had? Any help would be greatly appreciated


Chris P. April 11, 2006

What a badass problem! I would do the assignment just for fun, but then again, I have issues.

Anyway, here’s what I’d do (this ought to suffice for a HS project):

At each point in the game, you need to compute the mean (which is based on the value of the remaining cases). In order to calculate a pseudo-offer from the banker, I would suggest establishing an inverse relationship between the number of cases remaining and the current mean. For instance, if 18 cases remain, take the mean and divide by 18. In my opinion, this offer would be grievously low, so in order to compensate, I would actually divide by the number of remaining cases over 1.75. So instead of dividing by 18, you would now divide by 10.29.

If possible, I would actually try to migrate this divisor as the game progresses, perhaps subtracting a tenth every third suitcase (1.75, 1.65, 1.55, etc.). That way, when there are only 5 or fewer cases remaining, the offer will be closer to the actual mean (which resembles the game).

The bottom line here is that the banker is using the mean to determine his every move, and that’s precisely what your algorithm should do, too.
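To make that concrete, here’s a rough sketch of the suggestion in Python (J.Lessard’s assignment is Java, but the logic translates directly; the 1.75 divisor and the mean cap are guesses from this thread, not the show’s real formula):

```python
def banker_offer(remaining, divisor=1.75):
    """Offer = mean / (number of remaining cases / divisor), per the
    suggestion above. Elsewhere in the game loop the divisor would be
    nudged down by 0.1 every third opened case (1.75, 1.65, 1.55, ...).
    """
    n = len(remaining)
    mean = sum(remaining) / n
    return mean / max(n / divisor, 1.0)  # cap so the offer never exceeds the mean

# The example from the comment: 18 cases remaining, so divide the mean
# by 18 / 1.75 ~ 10.29. With a hypothetical board whose mean is $100:
print(round(banker_offer([100.0] * 18), 2))  # -> 9.72
```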


J.Lessard April 11, 2006

Hey, thanks a lot. I checked it out against the online game and it was close enough. It doesn’t really need to be perfect, but I’d like it to be. I’m going to keep looking for that single, perfect equation.

In the meantime, that should work out. I’ll set up a link for the applet once it is finished.


Chris P. April 11, 2006

Excellent…Can’t wait to see it!


digdug April 13, 2006

I was doing some mathematical testing on the online game, running many scenarios. They basically work straight off the average, but take a % off all the higher values (and that % shrinks as the game goes on)…but that varies from the show a bit.


Anonymous April 20, 2006

If you truly want to do the stats on this, I have one word for you: “Bayes.” Now that will start a riot!


Rick April 21, 2006

Great post! I am a Six Sigma guru at work and deal with statistical analysis every day… The first time I saw the show, the contestant did something similar (throwing away thousands), and I haven’t watched since.

One thing to note though is that at the very beginning they have to pull 7(?) suitcases, so your expected value should start from there.

Also, unlike in Vegas, the contestants have $0 on the line… which does ugly things to the brain “ah! i have nothing to lose!”




Jesse April 25, 2006

Rick made an excellent, excellent point: contestants don’t actually have any money on the line. If they go home with zero dollars, they aren’t any worse off than when they started, other than kicking themselves for being flaming idiots. When there’s a bag with 3/4 of a million, a bag with $1K, and a bag with $0.01, it’s easy to gamble and turn down the $300K offer, because they aren’t actually costing themselves money that they would have had, if that makes sense…it screws with your mind when you don’t actually have the money


brian May 2, 2006

You are missing a very valuable point in your example. If the contestant goes on, they have an 80% chance of picking any case besides the $300,000 case. This would in effect raise the next offer greatly. Not that I’m saying they made the correct decision, but I think you missed that point.


Jesse May 4, 2006

Last night a girl had 100, 500, and I think 25K left. They offered her around 9K and she accepted. She made the right choice, but she did horribly in the grand scheme of things


Chris P. May 4, 2006

I think the show’s producers would like you to believe that people are always playing for tons of cash, but the normal statistical distribution indicates otherwise.

Independent of her decision-making, that lady’s final result is somewhere between the 20th and 40th percentiles, so it’s really not as “horrible” as it looks.

She just happened to be a statistic on the lower end of things.

You can’t fight the statistics – all you can do is hope to beat whatever mean you end up being presented with late in the game. If that’s $500K, great! If it’s $180, go get you a nice dinner and some wine to forget about your less-than-stellar appearance on DOND.


Tom Flowers May 10, 2006

Brilliant analysis! I hope you and your readers continue to make observations about the show. I think Howie does a great job as the host; but what’s the probability that he will actually shake someone’s hand? Purportedly, Howie suffers from Chirophobia or Hand Phobia; or some other social/anxiety disorder . . . Howie: Shake or No Shake?


Alan Hollis May 13, 2006

I watch the English version of the game over here. I’m surprised no one has mentioned anything about the “switch” option often presented at the end of the game by the banker. Do you get that in the American version?


Chris P. May 13, 2006

If we do, I’m not familiar with it.


joe TBD May 14, 2006

First off, I will point out something that should be obvious to anyone that has seen the show and played the online version: THEY ARE NOT THE SAME! The online game uses a different equation for the amount offered.

It should be obvious, because the show usually offers about $10K for the first offer, while the online version tends to offer about $25K for the first round.

Second, I am only familiar with the U.S. (NBC) version of the game.

[now that we have that out of the way].

The poster, DigDug, is correct. They are offering the contestant a % of the average. In the beginning, the % is low (about 10%). At the end (with 2 numbers remaining), the % is 100%.

You can confirm this rather easily. Look at the first offer, and then look at the offer with just 2 cases remaining. It’s rather simple.

For example: if the average of the remaining amounts is $100,000, then the offer will be about $11,000 (after the very first round).

But at the end of the game, if the average is also $100,000, then the offer would be $100,000 (100%).

See the difference? Your offer jumped from $11,000 to $100,000, despite the fact that the average of the remaining cases was exactly the same.

In short, the producers of the show want the contestant to keep playing, and they tempt them with better and better offers. I mean, really, who is going to settle for 10% of the average?

Somebody worked this out (sorry, I don’t have a link), and they figured it was something like 11% for the first-round offer (I’m talking about the TV show, remember). The next offers are something like 15%, 23%, … all the way up to 100% for the final round.

The % does NOT increase steadily. It increases only slightly in the beginning, but more drastically later. The second round is only about 16%.

You should not settle for any of the early offers, if you can help it.

There is one round (I don’t remember which exactly, sorry) where the offer goes from 39% of the average to 60% or 65% of the average. It is the biggest jump in the game, and as soon as you see that jump, it would be a good idea to seriously consider making a deal.

The % of the offer will continue to increase, all the way up to 100%. But by now, you are in serious danger of knocking out your highest cases (since you only have 5 cases left).

You’ve got a 40% chance of eliminating one of your 2 highest amounts at this point. That is not a chance I would want to take.

And thus, for me, that is when I would deal (if I made it that far).
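joe’s percentage-of-the-mean pattern is easy to prototype. Here’s a hedged Python sketch; the schedule below is a guess anchored to the figures in his comment (11%, roughly 16%, a big jump from 39% to the low 60s, then 100%), since the show’s real numbers aren’t public:

```python
# Guessed per-round percentages of the current mean, per the comment above.
SCHEDULE = [0.11, 0.16, 0.23, 0.32, 0.39, 0.62, 0.75, 0.90, 1.00]

def tv_offer(remaining, round_number):
    """Banker's offer for a given round (round_number starts at 1)."""
    mean = sum(remaining) / len(remaining)
    pct = SCHEDULE[min(round_number, len(SCHEDULE)) - 1]
    return pct * mean

# First-round offer on a board whose mean is $100,000 -> about $11,000,
# matching joe's example; by the final round the offer equals the mean.
print(round(tv_offer([100000.0], 1)))     # -> 11000
print(round(tv_offer([50.0, 150.0], 9)))  # -> 100
```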


Gneo May 31, 2006

Thanks for making this!
This has helped me; I’m supposed to be making a Deal or No Deal game using HTML and JavaScript for my computer programming class.
Thanks again!


John June 2, 2006

There are two important factors in playing: the mean of the dollars left, but also the median value (the middle value when the amounts are arranged in order). You start with a $131K average, but the median is between $750 and $1,000.

The problem with just looking at the average is you are tempted to play the odds. If I were to offer you $200K or a 1-in-4 chance at a million, you would be crazy to take the chance unless $200K is not big money to you. You lose 3 out of 4 times going for the big money.

My strategy would be to look at an average case and a worst case and see what the dollars look like. In the case above, the worst case says you pick the million, essentially losing the whole $200K offer. The average case says you pick a low number and then have a 1/3 chance of winning a million; call it a $300K offer. So you are risking the sure $200K to win an extra $100K.

In the dumb-lady example you started with, the worst case is she picks the $300K and gets an offer around $12K (risking $80K − $12K = $68K). The average case says she goes down to 4 cases with maybe a $90K offer. I think she risked about $68K to win about $10K.


Chuck Reynolds June 11, 2006

The first time I played that Flash game I won $392,500 – with a screenie to prove that I should have been on the show.

Nice article, but I’d also like to figure out how they’re figuring out their offers because it doesn’t seem to follow an exact equation – at least not one I can see.


Marissa June 21, 2006

I just played once, and I won $500,000. It came down to $200,000 and $500,000; the bank offer was $243,000. I figured, what the hell, I’ll take the chance of losing the $43,000, and I ended up winning $500,000. This is a game of luck and gambling.


pinetar July 5, 2006

>> Obviously, the mean changes as suitcases are removed, but regardless of the mean at any given time, your goal should remain the same: beat the mean.

You have a point, but you’re missing something important: opportunity cost. Playing DoND is a once-in-a-lifetime event. If you could play the game over and over again as many times as you wanted, then it would be wise to accept any offer greater than the mean. But you can only play once.

In light of this, some people might choose to accept the first six figure offer (as Howie says, “a life changing amount”). Others might have an opposite view and risk it all to win $1 million. The math certainly is relevant but you cannot discount the human factor. And that is what makes the game so interesting to watch.


Ganesh October 9, 2006

There are a bunch of guys from a Dutch university (Erasmus University; BTW, Endemol, the company which produces the show, is a Dutch company) who wrote a paper on DOND. The paper is on the SSRN website. You guys are really close. Here is the real deal:

Bank offer = fn(mean of the remaining amts, round # (1-9), luck factor of the contestant at that round)

Luck factor = mean of the remaining amts / mean at the start of the game

which basically says that if the luck factor is 1, then you are neither unlucky nor lucky at any given round.

They basically award more than the average after 5 or so rounds if the guy is unlucky, but award less than the average if the luck factor is greater than 0.5.

Hope this helps.


a. askar October 10, 2006

Deal or No Deal can be a no-brainer once you reveal some important cases. Only if you have the power over greed to accept the deal will you come out OK. Follow your hunch, not what the audience screams out; what the heck do they have to lose? Know when to hold ’em, when to fold ’em, and when to run with a decent offer from the banker.


jacob O October 12, 2006

This was so cool: I won the million!


Jesfer October 16, 2006

Yehey! My system is finally done! Thanks! ^_^


michael dempsey October 16, 2006

Your analysis is right on.
I think there is a screening process for the game where mathematicians are blocked out. The profile they want is low IQ and high energy. I suppose it makes for good TV, although I, like yourself, can’t take five minutes of the show.


Hayley October 17, 2006

Let’s play! Game on!


Hayley October 17, 2006

Game ON!!
I can’t wait for my first game. I’ve been playing Deal or No Deal on the Big Brother game, and that is shocking!!


Jim Nayzium October 17, 2006

OK…I have a small six-dollar bet on the line for the answer to this question…please limit your logic in the figuring of this answer to the question asked, not a bunch of hypotheticals…

When you pick the box, you are 25 to 1 to win the million.

Once they have shown you the 24 boxes proving you have the million dollars in one of the last two boxes…

are your odds of winning the million different now from the original 25 to 1…

I have many interesting thoughts on the matter…but will await responses…

The key question is this: you have to win the million…throw out any discussion of strategy or offers…I only want to know about the odds of winning the million dollars…

At the start it is clearly 25 to 1 . . . . but with two left, is it now 50 50 ???

please elaborate your thoughts….


G Mac October 20, 2006


Assuming you don’t know the outcome ahead of time (e.g., you don’t have a time machine and couldn’t possibly know that it would get down to two cases…), the probability of you having the $1 million increases with each non-million-dollar case that opens. It has to change, because you are not placing the cases you picked back into the pool you are picking from.
Obviously, the probability of getting the $1 million goes to zero if you open any case other than yours and it has the $1 million in it.

However, if you hitched a ride with H.G. Wells to the future, you need to change your perspective on the probability.
If you already know, before picking any case, that it will get down to 2 cases without the $1 million showing up, then right from the beginning the probability of you winning the million dollars is 0.5 (or 1/2, or 50-50, or 50%…whichever way you want to state it), because, right from the beginning, your case is one of the only two that could contain the $1 million.

But to clearly answer your last question: with two cases left, your odds are 1/2 that you hold the $1 million case.
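G Mac’s answer checks out in simulation, too. Here’s a small Monte Carlo sketch in Python (the setup is mine: the player keeps case 0, the other cases are opened blindly, and we condition on the $1M surviving to the final two):

```python
import random

def simulate(trials=200_000, n=26, seed=42):
    """P(player holds the $1M | game reaches two cases with the $1M alive)."""
    rng = random.Random(seed)
    survived = held = 0
    for _ in range(trials):
        million = rng.randrange(n)  # which case holds the $1M
        # Player keeps case 0; since cases are opened blindly, the one
        # case left unopened at the end is uniform over the other 25.
        last = rng.randrange(1, n)
        if million in (0, last):    # the $1M was never revealed en route
            survived += 1
            held += (million == 0)
    return held / survived

# Unconditionally your case has the $1M with probability 1/26, but among
# games that survive to the final two, it comes out to 1/2, as G Mac says.
print(round(simulate(), 2))  # prints a value near 0.5
```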


blake October 21, 2006

OK, rip on me if I’m wrong (and I didn’t read every single post), but:

Whether or not the offer is above or below the mean isn’t really what determines whether or not taking the offer is a good or bad decision. For example:

You are down to two cases. One is $.01 and the other is $1,000,000. The mean, unless I’m wrong, is $500,000. If the offer is $400,000, do you take it? How about $300,000 or $200,000? I’d be bummed that he didn’t offer me the mean, but if he offers me $300,000, I’d be inclined to take it. Reason being: that is a 100% chance that if I say deal, I take home $300,000. But if I say no deal, then I have a 50/50 chance of getting a penny. A penny won’t help me out much. And maybe people think I’m stupid and I don’t get good job offers anymore. But I could say, “It was $200,000 below the mean. I had to say no!”

The point is, I don’t think you really ever play for the $1,000,000 on this show unless it’s down to two cases and they are $1,000,000 and $750,000. Then you can say, well…the offer may be $875,000, but what the heck – no deal. Because if you are wrong, you are only losing about 15%.

What do you think?


Chris P. October 21, 2006


You’re describing a pretty extreme circumstance, and in that case, the banker’s offer would likely exceed $600,000. That said, I would take the offer, especially given that it’s so much higher than the mean.

Based on the show’s history, you can pretty much rest assured that the banker’s offer would exceed the mean in the scenario you described.


blake October 22, 2006

I played the game online last night. Maybe they use a different formula to come up with offers online vs. TV, but it was down to $500,000 and $1,000,000, and for whatever reason the bank's offer was only $532,000. Odd, I thought. In terms of life-changing money, the difference isn't enough not to say "no deal" and go 50/50 for the million. This I did, but my case held the $500,000. So I bet wrong.

But the point is, imVERYho, it really doesn't matter whether the offer is above or below the mean, or by how much. That's not what makes your decision "good" or "bad," and it's not what you should be analyzing each round. Figuring out the mean is only useful for predicting what the banker's next offer will likely be if you say no deal and your high-valued cases are pulled off the board. (You can predict the banker's offers pretty closely without formally figuring out the mean in your head.)

What you should be asking is, "What are the odds that my high cases will be eliminated this round?" In other words, whether the number of cases to be eliminated in the next round exceeds or equals the number of high-valued cases left on the board. (This is why Howie is always talking about having a backup.) This matters even more once the offers move into "life-changing" amounts, which most people aren't offered in the first rounds. (Imagine if, as you said, the mean before the game starts is $130,000, and before you even picked your case they offered everyone $110,000. This would be a pretty boring game, as I bet most people would just say deal. It wouldn't take long for the show to go bankrupt. What, they'd burn through 15 people a show at over $1.5 million per episode? Ouch!)

So examples:

It's down to 3 cases, and one case will be removed. The board shows $500,000, $10,000, and $5,000; the mean is about $172,000. I'm not going to play it like, "Well, if he gives me $180,000 I'll say deal, but at $115,000 I'll say no deal." If he says $115,000, it would be very hard for me not to say deal. I mean, that's a considerable amount given my economic position in life. It would be very difficult to pass up a guaranteed $115,000 when there's a 1-in-3 chance of losing the $500,000 and being left with next to nothing. The less risk-averse out there may feel differently.

If there are only 2 high amounts left on the board and 5 cases will be removed this round if you say no deal, it would take a lot for me not to say deal, because no matter the mean, if those 2 come up, you are going home not much better off than when you arrived. You see my point?

It's more about playing the odds than playing the mean.
And unless I'm presented with two closely valued cases at the end, I'm never going to play to the last case, given the bank's typical offers, as long as the amount is economically significant to me.

So for your example: at the beginning, 5 cases, only one high-valued amount. Her mean drops significantly if she has bad luck (that being the appropriate term when there's only a 1-in-5 chance that the high value gets removed, and it does!). But people will look at their odds and say, "I'll go for it!" Some win, some lose, but if your goal is giving yourself the best chance to go home with a significant amount of money, you take the deal.

In this game, I don't believe a rocket scientist has much better a shot than the average person. Luck determines whether you "beat the mean" of the initial $131,000. You can play online numerous times to the final round and never be offered over $130,000. So if you're a genius and your big numbers get eliminated early, you're screwed just like the rest of us would be. Luck sets the ceiling on what you can win; playing "dumb" reduces that amount. But dumb is sometimes only revealed when the case is opened. If she had played to the last case and it had held the $300,000, she could say it's better to be lucky than not dumb. (I guess that's where "dumb luck" comes from. ;)

So to recap.:

Continue play as long as your backups exceed the number to be eliminated.

And the mean seems only useful for: 1) predicting what the next offer will be if my highest-valued case is removed; 2) the banker, in coming up with that figure; and 3) the producers, in determining whether the show gets green-lit (on the assumption that the average payout per show will approach the mean given a high enough number of shows).

Does that make much sense? I hope it does, because if not, I have carpal tunnel for nothing.


Josh October 26, 2006

Brian had it right, but no one seemed to notice. Let’s take another look.

She had 5 suitcases left that contained the following amounts: $100, $400, $1000, $50,000, and $300,000; and the banker offered her $80,000 to quit.

You are comparing the mean value (what you can hope for in your case) with the offer. This would be fine if the offer were only extended once. However, this is not the case. Yes, she only has a 1/5 chance of having more than the offer in her case. But that doesn't matter yet, because she can't take the case anyway. What matters is the option at hand: should I continue? Basically, how will this choice impact my next options? Her 20% chance of holding something better is irrelevant. What matters is that she had an 80% chance of her offer going up next time. She happened to lose the $300k, but that was unlikely, wasn't it? It's more likely that she'd pick a case worth less than the offer, causing the next offer to be higher. The real consideration is whether the mean is likely to go up or down, as that's what determines the next offer. If you make it to the last round, compare the offer to the mean of what's remaining, but this isn't important until you reach that point. You have to make every decision as if it were a unique opportunity. If I told you that I'll give you $100, but that if you say no I'll make you another offer with an 80% chance of being higher, you should wait for the second offer. You're forgetting about all of the offers that will come before you actually have the option of walking away with your case. As a disclaimer, I'll point out that this applies if you're trying to statistically maximize your take-home. If you need $50k for an operation, you might not want to risk it. I'm just looking at the numbers.
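Josh's percentages for the scenario in the post are easy to verify; here is a short Python sketch (the "opening a below-mean amount raises the mean" step is plain arithmetic, not anything specific to the show):

```python
amounts = [100, 400, 1_000, 50_000, 300_000]

# Expected value of the player's case at this point in the game
mean = sum(amounts) / len(amounts)

# Opening any case worth less than the current mean raises the mean of
# what's left, and hence (roughly) the next offer; opening one worth
# more lowers it.
p_offer_rises = sum(a < mean for a in amounts) / len(amounts)

print(f"mean = ${mean:,.0f}")                        # $70,300
print(f"P(next offer higher) = {p_offer_rises:.0%}")  # 80%
```

The $80,000 offer she received was about 13.8% above this mean, which is the figure cited later in the thread.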


Blake October 27, 2006

You may want to look again, Josh. Brian's point was actually dealt with in my last post when I said, "…bad luck (that being the appropriate term when there's only a 1-in-5 chance that the high value gets removed, and it does!). But people will look at their odds and say, 'I'll go for it!'"


Josh October 27, 2006

I thought you were going to say what I thought, but then you went into things like "life-changing amounts" and other considerations. I just wanted to deal with the mathematics in detailed terms.


Chris P. October 27, 2006

If you can beat the mean at any point in the game, why wouldn’t you? First, you’ve completed the objective of the game (which is to beat the mean), and second, you walk away with guaranteed cash.

Statistically speaking, the banker rarely offers an amount greater than the mean, and when he does, you ought to take it. You’re not going to get many chances to actually beat the mean, so when that opportunity presents itself, I think you ought to take it.

In the example from the post, the banker’s offer exceeded the mean by 13.8%, and statistically, almost nobody beats the mean by that much in Deal or No Deal.

She should’ve accepted the offer and gone home.


Josh October 27, 2006

Since when was the goal of the game to beat the mean? The goal of the game is to maximize how much money you take home. You're looking at what the end of the game might be instead of considering the decision at hand. She had a 20% chance of having more than the offer in her case, so it would be statistically unwise to bank on that. However, she had an 80% chance of her next offer being higher. You have to make the decision you're faced with. SHE WASN'T CHOOSING BETWEEN THE OFFER AND HER CASE. SHE WAS CHOOSING BETWEEN THE OFFER AND ANOTHER ROUND. Going another round had an 80% chance of increasing her offer, at which point she could decide again whether to take the new, improved offer or settle.

If you had $1, $5, $10, and $1,000,000 left on the board, you shouldn't take an offer of $400,000, because you have a 75% chance of eliminating one of the small amounts and having a much higher offer next time. Worry about what's in your case when you have $1 and $1,000,000 left and get to choose between the offer and your case. Until you can take your case, it might as well be any other case on the stage. You have to focus on the actual outcome of your decision. When you choose BEFORE the last round, there are two possible outcomes from continuing: the offer going up (good) or the offer going down (bad). Figure out which is more likely and act accordingly. Most people don't even make it to the last round, so it never mattered for them what was in their case. What mattered was the offers they received and which one they took.


Chris P. October 27, 2006

My entire post centers around the fact that the goal of the game is to beat the mean. If you're in it to take home the biggest paycheck, then you are committing to relying on luck.

There’s just as good a chance that you’ll blow off the high dollar cases as the low ones, so there’s really no strategy involved there at all.

The only strategy, then, is your decision-making throughout the game.

The only criterion upon which you can reasonably base your decision (mathematically speaking and with emotions aside) is the mean. By nature, the mean shifts up and down as the game progresses, leaving you with a different snapshot of what you can reasonably expect to earn at each juncture along the way.

You cannot expect to win an amount equal to any particular case. I don't mean to be condescending, but that's a really juvenile way to approach the game, because in order to win the amount inside a case, you'd have to end up holding that exact case.

The only way this would ever happen is if you got lucky.

Oh, and in your comment, you argue that if you have 4 cases remaining, $1, $5, $10, and $1,000,000, you shouldn’t take the offer.

You, my friend, are nuts.

You would be an absolute fool to turn down an amount that is 60% greater than the mean.

Also, it should be noted here that the banker would never offer $400,000 in that scenario. Something on the order of $275,000 would be more appropriate.

Given that scenario, I would take the $275,000, too.

One thing you have to remember here is that $400,000 (or even $275,000, for that matter) is much, much greater than the starting mean of the game. To triple the initial mean is to accomplish something that would land you two standard deviations above and beyond most results from playing the game.

If you can guarantee yourself a place in that part of the distribution at any point in the game, you would be an absolute fool not to do so.


David October 27, 2006

I just learned of this game today. Everyone obviously seems to be looking for the optimal strategy. In order to compute it, however, one would need to know how the banker deviates from the mean in making an offer. If the offer is always less than 100% of the mean, then the optimal strategy is always (from an expected value point of view) to never accept any offer. It becomes much more interesting, however, if the banker offers amounts that sometimes are below the mean and sometimes above the mean. In that case, one would need to know the distribution of (offer – mean) to devise an optimal strategy.
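David's first point can be illustrated with a toy simulation. Assume, as he describes, a banker who always offers a fixed fraction of the current mean (90% is an arbitrary choice here, and the board values are the standard US ones); then never dealing beats dealing at any fixed round, on average:

```python
import random

# Standard US board values (assumed)
VALUES = [0.01, 1, 5, 10, 25, 50, 75, 100, 200, 300, 400, 500, 750,
          1_000, 5_000, 10_000, 25_000, 50_000, 75_000, 100_000,
          200_000, 300_000, 400_000, 500_000, 750_000, 1_000_000]

def average_winnings(deal_after, offer_ratio=0.9, trials=50_000):
    """Average take-home when the banker always offers
    offer_ratio * (mean of unopened values).  The player deals after
    `deal_after` cases have been opened; None means never deal and
    keep the chosen case."""
    total = 0.0
    for _ in range(trials):
        board = random.sample(VALUES, len(VALUES))   # shuffled copy
        case, others = board[0], board[1:]
        if deal_after is None:
            total += case                    # play to the very end
        else:
            unopened = others[deal_after:]   # what's left on stage
            mean = (sum(unopened) + case) / (len(unopened) + 1)
            total += offer_ratio * mean      # accept this round's offer
    return total / trials

# If offers are always below the mean, refusing every offer wins on average:
print(average_winnings(None) > average_winnings(10))   # True
```

This is the property David is implicitly using: opening cases at random doesn't change the expected value of what remains, so a sub-mean offer is always a losing trade in pure expectation (utility of wealth aside).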

By the way, the above only pertains to optimizing wealth and does not take into account utility of wealth (that is, an individual would more likely take a sure $100,000 than a 50% chance of winning $210,000 along with a 50% chance of winning $0, even though the latter yields an expected value of $105,000).


Josh October 27, 2006

We must be watching a different show. If you want to beat the mean, go ahead and take the first offer that’s above the mean. I, for one, want to make as much money as possible. To do that, you must make sound mathematical decisions at EVERY step.

Imagine, for a moment, that you have 6 cases left: $0.01, $1, $5, $10, $20, $1,000,000. Your mean is about $167k. By your reasoning, an offer of $180k should always be taken because it beats the current mean. But what about the next mean? Can you not see that there is an 83% chance that a small amount will be opened and the mean (and therefore the offer) will rise? I’m not saying to keep your fingers crossed that you have the million dollar case. That’s unlikely. What is likely, however, is that you’ll profit by playing another round. You can settle for anything above the current mean if you want, but if I had an 83% chance of getting a better offer next time, I think I’d be a fool not to go another round.


Chris P. October 27, 2006


With each of your comments, the situation becomes more and more extreme.

It’s hard to debate an issue when the variables keep changing.


Josh October 27, 2006

It’s hard to get you to understand, so I was trying to put it in obvious terms. It’s a game of variables. If your system works, it should always work. If it only works in moderate situations, you should state that. (It’d still be wrong, though.) A statistical system will always work, not just in situations where the numbers make your system look sensible. It’s more obvious for large ranges than small ones, but the principle still holds true. Your offers will go up if your mean goes up. If your mean is likely to go up next round, then your offer is likely to go up next round. Every choice (until the last round) is a choice between the offer and another round. What you think might be in your case doesn’t come into play until you get to choose whether or not you want your case.


Chris P. October 27, 2006

You know what, you’re right.

I must be an idiot.

Why the fuck did I put up with four years of the #3 Mechanical Engineering Institution in the country?

Why did I learn statistics and other meaningless crap when I could have just listened to your clairvoyant insights?

I don’t see why you’re wasting your time at my site if you’ve got your finger so firmly pressed upon the pulse of success and useful knowledge. You should be out getting rich and writing those “rich jerk” info product spoofs.

But you don’t need me to tell you that…Right?


Josh October 27, 2006

Wow… mechanical engineering, eh? I sure wouldn’t know anything about that. [/sarcasm] I’m sorry if I frustrated you. It’s just mathematics, and I understand that statistics can be awfully complicated. But seriously, if you have a great chance of getting a better offer next round, isn’t a system that tells you not to take advantage of that flawed? When it comes to statistics, you have to make decisions one at a time. If you miraculously flipped 30 heads in a row, you’re still only 50% likely to flip tails the next time. Each decision must be made independently, considering only the choices at hand and the possible results.

And before you flaunt your education, you probably ought to know where I went to school, what degree I received, and what GPA I earned.


Andrew October 11, 2011

Flip 31 is a 50% chance, but the odds of obtaining the first 30 flips all heads were 1 in a billion (literally).
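The arithmetic behind both claims, for anyone checking:

```python
# Probability of 30 heads in a row with a fair coin
p_streak = 0.5 ** 30
print(f"1 in {1 / p_streak:,.0f}")   # 1 in 1,073,741,824 (just over a billion)

# The 31st flip is independent of the streak: P(heads) is still 0.5.
```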


Andrew October 11, 2011

My point is that you are only looking at flip 31, whereas our esteemed blogger is looking at all the flips leading up to it.
In the situation you describe, at that point in the game, I would rather go with a 100% chance of $180k than a 17% chance of going home with less than $100. I think the overall algorithm should consider how many remaining cases are above the mean and how many are below. When you stack that many low cases, it makes the decision easy for me.


Josh October 27, 2006

I forgot to answer your question as to why I posted in the first place. I find the show interesting, but I disagree with your analysis. I thought I’d weigh in. I’m sure most of the readers are capable of reading both points of view and deciding.


Chris P. October 27, 2006

Well, Josh, I could do that…

But you see, I really don’t have to, because you’ve revealed quite a bit about yourself here in the comments.

You have continually spoken about “mathematics,” and yet you’ve offered no insights of any statistical relevance to combat the equations, methods, or reasoning that I supplied in the post.

Moreover, you have also displayed a “shoot from the hip” kind of attitude with your arguments, which are very much impulsive and not backed by any sound resources. I say this for a variety of reasons, but most notably because you seem to think that the banker routinely offers amounts greater than the mean.

The reality here is that a majority of players never even get an offer that is greater than the mean unless four or fewer cases remain. In fact, most of the offers leading up to the very end of the game are ludicrous and not even worthy of consideration.

And, since we’re debating things here, I’d like to point out another hole in your argument. In your most recent comment, you make the following claim (smugly, I might add):

If you miraculously flipped 30 heads in a row, you’re still only 50% likely to flip tails the next time. Each decision must be made independently, considering only the choices at hand and the possible results.

I’m really not sure what’s so telling or great about that statement, because I make it quite clear in the post that the mean changes at every step along the way. Your decision at each juncture is based on the new mean, so it looks to me like all you’ve done is restate my argument in more rudimentary terms (with a very generalized “odds” description instead of a statistical one).

So, with all that information in hand, here’s what I believe to be true about you:

  • In the best case scenario, you attended either a state university or second tier specialty school (and no, there’s no way it was on par with a UNC or a UVA)
  • You received a degree in a major that covers very general topics — something like Business Administration, Management, or Communications (although I hope not, because you do a poor job of constructing an argument)
  • You may have received a decent GPA, but if the above bullet points are true, this really isn’t that impressive given the difficulty of both your school and your major.
  • Finally, I believe that if your qualities were really that impressive, then you would have offered them up here in an attempt to provide at least some support for yourself. I suppose I shouldn’t expect that, though, because you haven’t supported any of your arguments thus far.


Josh October 28, 2006

To be quite honest, I thought the issue was obvious enough that it didn't require a dissertation with bibliography on statistical theory and the game show applications thereof. I am not going to discuss my education with you. I will say, however, that my wife laughed out loud when she read your guess at it, and not because you guessed well. Besides, my background doesn't make my argument any more or less valid. That, my dear man, would be committing a logical fallacy called "Appeal to Authority." Allow me to overcomplicate a very simple concept so you can feel better.

The game of Deal or No Deal is a series of choices. The first choice (besides a random choice of case) is whether to take an offered sum of money or to continue playing. If you continue, more cases will be opened and more such choices will be made. To appreciate the statistical implications of this game, both the choices and the possible outcomes must be understood.

Choices: Until the last round, the player must decide each round whether to accept the banker’s offer or to reveal more cases and receive another offer. On the last round, the player must decide to accept the banker’s offer or take home whatever sum is within his or her case.

Outcomes: Until the last round, a decision to reject the deal results in the opening of more cases and the offering of a new deal while a decision to accept ends the game with the awarding of the offered amount. On the last round, a decision to reject the deal results in the contestant being awarded the amount in his or her case, while a decision to accept ends the game with the awarding of the offered amount.

It is especially important to note that during opening play, “No Deal” results in a new offer. To get the best outcome, we must first define the optimal results; namely, maximizing the money the player is awarded. To reiterate, money is awarded through either accepting an offer or progressing to the last round and taking the chosen case. To maximize the award, the player must make the best statistical choice at each juncture.

When faced with any choice to continue opening cases or take the current offer, the obvious question the player must ask himself is, "Is continuing likely to increase my award?" Obviously, the amount in the chosen case will never change, but the offer will. Understanding the offer is necessary, so let's digress for a moment to discuss it. The offer is loosely tied to the mean, but the percentage of the mean that the offer represents generally increases throughout the game. We can predict, then, that an increase in the average of the values left in the game will result in an increase in the offer. This is paramount.

The decision to deal or not deal must be based on the outcomes. Until the last round, the outcome of accepting the amount in your case is not an option. The only possible outcomes are the immediate end of the game with the offer or the continuation of the game with the revealing of more cases. As previously stated, the player must ask which decision is “better.” Assuming that a strictly mathematical approach is used, it is easy to determine whether or not continuing is advisable. If your next offer is likely to be higher, then remaining for the next offer is the best choice. If your next offer is likely to be lower, then your current offer is the best you can expect.

As this rule seems to be in contention, let's examine it in more detail. Say, for discussion's sake, that you are faced with four amounts: $200, $750, $50,000, and $750,000. The mean of these is just over $200,000. We can predict, then, that our offer will lie somewhere near $200,000 (likely below). We can note that only one of the four remaining values is over this offer, allowing us to conclude that, at this point, there is only a 25% chance that the case we chose holds a value greater than the offer. This, however, does not mean that the current offer is the most we can hope to take home. That is obvious if we consider what would happen if the case holding $750 were opened: the average would increase to about $267k, and the next offer would increase accordingly. Determining the likelihood of this is a straightforward exercise. Three of our four values lie below the mean, so there is a 75% chance that our mean, and therefore our offer, will increase with another round.
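Josh's arithmetic here checks out; a few lines of Python reproduce the worked example (taking the top case to be $750,000):

```python
amounts = [200, 750, 50_000, 750_000]

mean = sum(amounts) / len(amounts)
print(f"current mean: ${mean:,.2f}")              # $200,237.50

# Mean of what's left if the $750 case is opened next
after_750 = [a for a in amounts if a != 750]
print(f"mean without $750: ${sum(after_750) / len(after_750):,.2f}")  # ~$266,733

# Chance that the next round's mean (and so the offer) is higher:
# opening any of the three below-mean cases pushes the mean up.
p_up = sum(a < mean for a in amounts) / len(amounts)
print(f"P(mean rises) = {p_up:.0%}")              # 75%
```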

Once we agree to continue, more cases will open, the mean will change accordingly, the offer will change accordingly, and the pattern continues. At each round (not including the final round) the most statistically sound procedure is to calculate the likelihood that the next offer will be higher. If a contestant were ready to settle, accepting perhaps that his case is unlikely to hold a high value, the contestant might be tempted to accept the immediate offer or an offer “close to the mean”. If it is probable that the next offer will be higher, it would be unreasonable to accept the current offer based on mathematical principles. In general terms, it does not make statistical sense to accept a lower offer if a higher offer is likely.

The final round is the only deviation from this. Continuing the above example, consider the final two amounts being $200 and $50,000. The mean (that is, the expected value) is $25,100. At this point in the game, the choice is finally between your offer and your case. Again, from a strictly mathematical point of view, any offer of $25,100 or more can be accepted, and any offer below $25,100 can be rejected.

The difference between this method and the method described in the original post is that this method takes into account the cycle of case revelations and offers, whereas the original recommendation simply weighs the current offer against the value that can be expected in the chosen case. There are more opportunities for profit than just cutting losses or taking your case. Accepting the optimal offer should be a part of any analysis.

—Well, that was lengthy—

I apologize to those of you who prefer less… academic descriptions. To summarize once again, it doesn't make sense to accept an offer if your next offer is likely to be higher. Even after you have given up on a jackpot in your own case, you can still determine the best time to accept an offer. If your next offer will likely be lower, cut your losses and take the deal. If your next offer will likely be higher, why would you accept the current one? Even common sense would dictate that a 75% chance of a better following offer is something you should take advantage of.


Andrew January 19, 2010

Dude, this explanation wasn’t academic at all – just longwinded. Your point might be better taken if you spent less effort trying to shove it down everyone’s throat.


Brian October 28, 2006

In most scenarios in this game, the average payout is significantly greater than the median. I believe my strategy would be to be mindful of the median and take the deal when the offer was a significant percentage of the average; I would accept 60-80% of the average. Of course, if I were lucky, I'd have raised the average by revealing low-dollar cases.
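Brian's mean-versus-median gap is dramatic on the opening board. Assuming the standard 26 US case values:

```python
import statistics

VALUES = [0.01, 1, 5, 10, 25, 50, 75, 100, 200, 300, 400, 500, 750,
          1_000, 5_000, 10_000, 25_000, 50_000, 75_000, 100_000,
          200_000, 300_000, 400_000, 500_000, 750_000, 1_000_000]

print(f"mean:   ${statistics.mean(VALUES):,.2f}")    # $131,477.54
print(f"median: ${statistics.median(VALUES):,.2f}")  # $875.00
```

Half the cases hold $875 or less, yet the handful of huge amounts drags the expected value up to about $131,000 (the "initial mean" cited throughout this thread).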


Jackio October 29, 2006

I WON THE MILLION!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!


David October 30, 2006

Josh, I think you misunderstood my statement. You are correct that to make the most money you should accept an offer above the mean. When I stated that one should never accept any offer, it was under the assumption (as someone else had stated; having never seen the show, I don't know if it's true) that the offer is always less than the mean.

If that is not the case, then things get really interesting. Suppose that (offer – mean) is symmetrically distributed (e.g., normal). Then you would expect as many offers above the mean as below, and deciding whether or not to take an offer gets complicated, as it becomes a function of both magnitude (above the mean) and the number of offers remaining.

It reminds me a little of the famous Secretary Problem.

So basically, to devise an optimal strategy, one would like to observe and, if possible, figure out the distribution of (offer – mean).


Josh October 30, 2006

Sorry for the confusion. I was not responding to you but to Chris. I’m not sure what the distribution of offers looks like, but I know that there is a general trend for the offer to mean ratio to increase during the game. It has been known to pass 1 (offer being greater than the mean), but it is usually much lower at the beginning.


jtown November 1, 2006

i want to know if anyone has actually sat down and tried to figure out the odds of which amount is in which case, like how many times the million has been in case number 5, for example. i would love to see those stats.


blake November 3, 2006

so my last post only displayed about 1/5 (or 20%, or 0.2, or 1:5) of what i actually typed. but when it posted and only the first paragraph showed up, i figured, whatever.

but today i read more posts…and i noticed how mean they were. i mean, really, can't we be more civil and treat people with respect, as ends, and not simply as means to display our superiority to the world? i mean to change the tenor of this debate, meaning, to get back to the true point of the discussion, i.e., how to improve your chances of going home with a mean wad of cash. you know what i mean? but i guess it's easy for the "average" joe to become fixated on the mean.

first, i agree with josh. beating the mean isn't the point of the game, in any individual round or for the game as a whole. i know on first glance it seems to make sense, but several scenarios have shown why this is not how you should play. this is the simplest: beating the mean is simply having an offer that is greater than the average, which could mean beating it by $1. under his rules, if the offer is $1 greater than the mean, you take the deal. it is not forward-looking and doesn't take into account what your odds will be in the next round. he buffers his theory by saying that in the final rounds the offer will always be above the mean, and that any example where the offer is below the mean is unlikely or extreme. but this means accepting every offer in the final rounds. what is the point of saying take the offer if it is above the mean and refuse if it is not, and, by the way, it will always be above the mean in the final rounds, thus you should always take it? if simply beating the mean is the point, and in the course of the game you passed up offers in the tens and hundreds of thousands that were unfortunately below the mean, but you accepted the final offer of $60 because only the $10 case and the $100 case remain, and $60 is above the mean, then you "won" the game. i'd say, "wow, really? i guess i really aren't so smart, cause i thought you lost."

again, if the banker's offer was 400,000 and it was down to the $1 and $1,000,000 cases, chris' rule says refuse. i know you say the offer will always be greater than the mean in that situation. that is very convenient for your rule, but it doesn't make for an interesting theory. utilizing your strategy in that situation: 499,999 offer, refuse… 500,001 offer, accept. your rule is absolute; mine is relative. mine says, for most people in this situation, whether the offer is 400,000 or 600,000, there isn't much difference in how you formulate your decision. don't be greedy, play it safe, take the deal, and be assured of going home with a lot of money. i know it's only equivalent to a week's work or less for an MIT graduate, but that is a lot for the median american family.

as for josh's theory, which simply says play on as long as the odds favor the next offer going up…well, that is nice, and may be more useful than chris' mean theory, but it doesn't hold either. here's why. say you have 1 high-value case and several very low-value cases. the odds are in your favor in every following round that the offer will go up, UNTIL the offer goes down, and at that point who cares? the offers will be tiny. 1 high, 6 low: odds in your favor. 1 high, 5 low: still in your favor. 1 in 4, 1 in 3, and 1 in 2. can you stomach playing 1 in 3 knowing that if your high number comes up, you go home with basically nothing? the odds are in your favor, and your rule says play on. but how far do you push your luck while the odds of the offer going up are in your favor? if those 3 cases are 1,000,000, 100, and 10, chris' rule says stop if the offer is over 335,000 but play on if it is 330,000. so that wouldn't help me either.

obviously both strategies are flawed. but man, all this "i went to MIT" blah blah blah. i hate to break it to you, but there is a bell curve at MIT too. ptolemy and marx proved that no matter how intelligent you are, no matter how complex your theories and models are, if your basic premise is wrong, well, what's the point? throwing all your statistical tools (i heard a lot of mention of them, but i didn't really see any actually used) at a faulty premise just tells me you should know better.

your use of the expected value formula looks impressive. most of us see that and say, wow, this guy is out of our league. but it just gives you the average, something we learned to do in grade school.

i can't wait to read your scathing and vicious attacks on my meager third-tier intelligence. i hope you back them up with logic, mathematical or otherwise.
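The "one high case, several low" pattern described above is easy to see numerically. Here is a short Python sketch (my own illustration, with hypothetical amounts, using the mean of the remaining cases as a rough stand-in for the banker's offer):

```python
import random
from statistics import mean

def means_by_round(board, rng):
    """Open random cases one at a time, tracking the mean of what's left."""
    board = list(board)
    trace = [mean(board)]
    while len(board) > 1:
        board.pop(rng.randrange(len(board)))
        trace.append(mean(board))
    return trace

# One high-value case among four tiny ones:
trace = means_by_round([1_000_000, 100, 10, 5, 1], random.Random(3))
print([round(m) for m in trace])
```

Every run shows the same shape: the mean climbs each round a low case is opened, then collapses the round the 1,000,000 comes out. That is exactly the "how far do you push it" dilemma.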


Chris P. November 3, 2006

Guys, I never said this was a hard and fast rule.

The best you can hope for in the game is to beat the mean. You can’t plan on only removing the low dollar value cases in the beginning — case removal is completely about luck.

The only element over which you have any control is offer acceptance, and I’m saying you should never accept an offer that is less than the mean.

Obviously, if you have a case scenario where you can risk $80,000 to make a $200,000 upswing, then you take that risk, even if the current offer is greater than the mean.

But, in most of the scenarios described here, people are risking $200,000 with the hope of making about $60,000 greater than the current offer.

Only when the risk is less than the reward do you decline an offer that is greater than the mean.

If the risk outweighs the reward and you decide not to accept an offer that is greater than the mean, then you’re playing with fire.

I hate luck, and Vegas has put it to me sans-vaseline on many occasions, so what I’ve suggested here is the way to play the game with the least amount of risk.


kevin November 6, 2006

I like your theory. I just wrote a program to play along at home and was looking for others' thoughts on the math. I will be testing my formulas for a while…

But unfortunately, this is not a mathematical issue for the players.

There hasn't been one player yet, I bet, who could calculate the average of round 2, 3, 4, 5, or probably 6. So if I were a player in the real game, I likely would be swayed by what I 'thought' my odds were. Now the 'game is on': it is the 'banker'/stats against lil ole me and my amateur 'card counting' skills.

Therefore, while the producers'/banker's goal is to 'win' by hedging against the mean with 'nice offers' for the poor suckers, the player's goal is to decide whether the offer is better than what they 'think' their odds are of increasing the offer. The reason the offer is on par with the average when you get down to the last rounds is that it makes the offer look 'obvious' to those folks at home who can do basic math :) Most people will never know that the offers were less than any mean/avg. Most people will never even think about the offer being less than the mean, because they believe it is all in the 'odds', which no one really understands. They will say, 'Of course the offer is small in the beginning. They want you to play on.' or 'The odds are still small at this point.'

At any rate, although the average at the start is ~$131K, that is not the player's starting point. The player's starting point is the first offer, and you will never get an offer even close to the mean until the 4th/5th round. At that point it could theoretically be too late for some poor players with bad luck.

just my POV.


barbara clutton November 6, 2006

would love to have a go


dw November 7, 2006

The end-game scenario presented can be approached in one of two ways. Clearly, with only one larger amount still in play, the $80,000 offer comfortably beats the mean. The other angle, however, is that the chance of picking something other than the $300,000 case was very much in her favor: 75% if her own unrevealed case did not contain it, or 100% if it did. Given that eliminating a lower-level amount would likely have brought the next offer to something approaching, but just under, the arithmetic mean, it boiled down to no worse than a 75% chance of winning around $140,000.

That said, I don’t know I’d be willing to place an $80,000 wager on even a 75% chance… :-)

Great game show for math junkies!
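The numbers in the scenario above are easy to check with a short script (my own sketch; it assumes the next offer lands near the new mean, which the real banker only approximates):

```python
from statistics import mean

board = [100, 400, 1_000, 50_000, 300_000]  # amounts still in play
offer = 80_000

current_mean = mean(board)   # 70300 -- the $80K offer beats it
print(current_mean)

# Mean of the board after each possible next pick:
for removed in board:
    rest = [c for c in board if c != removed]
    print(f"open {removed:>7}: new mean {mean(rest):>10,.2f}")
```

Note that opening a single low amount lifts the mean only into the high $80,000s; offers near $140,000 would take further lucky picks.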



Ryan November 11, 2006

I wrote a VB Express program to calculate the mean value; it only seems to match the banker's offers as the game progresses. Have a look, fellow programmers, and tell me what you think.


blake November 14, 2006

Tonight's game was another example of why the simplistic mean strategy promoted by chris p. and the odds strategy promoted by others just don't make sense. the guy had a great board: $75k, $100k, $400k, and $750k. but as he told Howie, he was greeeeedy. and greed took him down. now, as i remember, the offer was $261k (maybe $281k). either way, well short of the mean (or expected value) of $331k. so according to chris p. – NO DEAL! and the odds that the offer would have gone up were in the 75% range, based on typical offers as the game approaches the final 2 cases. so the odds players' advice? NO DEAL!

now, being the lowly community college graduate that i am, i looked at the board, looked at the not-tooo-unlikely worst case scenario, looked at the offer, and said: DEAL! (but hey, i'm dumb and couldn't get into a top 3 engineering school :( or a second tier business school, dang it!)

well, next pick: $750K. Awwwww! offer: in the $130K range. well below the mean again. and – NO DEAL!

next pick: $400K – Ooooh! Greed is good… Beat the mean! that's the point! offer: $87K. well, that's pretty close to the mean, but not quite. and what the heck, right? no deal…

his case had the $100k in it. wait, the mean of his last 2 cases was $87,500… he did it! he beat the mean! woo hoo! he beat the mean!!! $161k less than his highest offer, but he beat it!!! that's the point, right???


Chris P. November 14, 2006


It’s not a mindless strategy. I saw the guy last night, and considering the four cases he had remaining — $75K, $100K, $400K, and $750K — I certainly think he did the right thing by electing to pick at least one more case.

The offer of $261K, which was well below the mean of $331K, was unacceptable. Typically, by the time three or fewer cases remain (especially if they are high-dollar cases), the banker will crack and come through with an offer that is well above the mean.

There is a very no-nonsense way to look at the risk involved in this scenario, so let’s check that out. The following bullet points illustrate the possible scenarios from the worst case to the best case:

  1. He removes the $750K case and is left with a mean of $191K.
  2. He removes the $400K case and is left with a mean of $308K.
  3. He removes the $100K case and is left with a mean of $408K.
  4. He removes the $75K case and is left with a mean of $416K.
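The four bullet points above can be reproduced in a few lines of Python (a sketch that treats the new mean as the next offer, which real offers only track loosely):

```python
from statistics import mean

board = [75_000, 100_000, 400_000, 750_000]
offer = 261_000

# For each possible removal, compute the new mean and its gap vs. the offer.
for removed in sorted(board, reverse=True):
    rest = [c for c in board if c != removed]
    new_mean = mean(rest)
    print(f"remove {removed:>7}: new mean {new_mean:>10,.0f} "
          f"({new_mean - offer:+,.0f} vs. the offer)")
```

The worst case loses about $69K of offer value, while the other three cases gain $47K to $156K, which is where the risk/reward figures below come from.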

Now, in order to determine the next move the man should make, we have to evaluate each of the scenarios above based on its risk.

Based on the worst case scenario, the man is risking $70K in an attempt to make more money. There is a 25% chance he’ll lose $70K, a 25% chance he’ll gain $47K, a 25% chance he’ll gain $147K, and a 25% chance he’ll gain $155K.

My own personal philosophy is that, while playing the game, it is not a good idea to risk more money than you stand to gain at any given time.

In our scenario here, the man was right on the fence. 50% of the time, he was risking more than he stood to gain, and 50% of the time, he was guaranteed to more than double his investment (which is his risk). All told, there was a 75% chance he was going to make more money, but I would only consider his investment to be “solid” 50% of the time — still not bad odds for such huge sums of money!

At this point, though, it’s a personal decision — do you walk away from the game when there’s a 75% chance you’ll make more money? Do you accept $70K less than what you’re worth at the time?

Would Terrell Owens play for a team that wouldn’t offer to pay him what he’s worth? ;)

Seriously, though. In this case, you could argue that any choice is the correct one. Statistically, it’s reasonable to continue playing the game. Obviously, it’s also reasonable to walk with $261K, because that’s a hell of a lot of money.

If it had been me up there playing the game, I would have elected to choose one more case, just as the man did in last night’s show.

Unfortunately, he removed the $750K case and was then offered a measly $144K, which was $47K below the new mean of $191K. Again, I would have removed one more case and ended up just where the man did last night: walking home with $100K.

A lot of folks have a hard time believing that this scenario is a positive outcome, especially since the man could have gone home with $261K. The fact of the matter is that you cannot control luck in the game, and unfortunately, the process of removing dollar amounts is pure luck.

It’s not fair to look at these situations in retrospect, because there’s no way to predict these things during the decision-making process (and of course, hindsight is always 20/20).

The bottom line for last night’s game is that the man had a 75% chance to gain more money (and a 50% chance to double his “investment”), and he elected to do so.

This decision is both understandable and supported by the underlying statistics, so it's not reasonable to say that the man made a poor decision.

It’s like getting a bad beat in poker — the poor guy last night just took a bad beat in Deal or No Deal, and that’s the way it goes.


blake November 16, 2006


your last post is closer to the mark. but you're exactly right: it is a game of luck. as i said earlier, luck provides the ceiling for your potential maximum winnings. the real point, however, is how far you want to push that luck.

i haven't followed the game enough to be sure… and maybe you have all the games tivo'd and can go back and do the math, but i believe i have noticed a trend where the majority of those who go home with significant money tend to leave money on the table. meaning they often accept deals where the odds that the next offer will go up are still in their favor (and offers that are below the mean, no less). losers tend to push their luck and go home with a fraction of what they could have had if not for being quite so mean and greedy. (ok, i beat that to death.)

to keep beating the horse that is dead: in a 1-in-5 situation, with one high and 4 low-value cases, chances are that you won't pick the high case, but chances are that you also aren't holding the high case, meaning that the high case will probably be removed in subsequent rounds before you get to the 50/50. so how far do you feel comfortable taking your chances? it seems most feel comfortable until they get to that 1-in-4 range. josh feels comfortable playing as long as the odds are in his favor (which is every succeeding round until the high case is removed), until you ultimately make it to the 50/50, your worst possible odds in this scenario. again, things are not quite so serious if you have back-up high cases. back-ups, back-ups.

anyway, the banker must be reading your blog, and must be convinced that the contestants are as well, because the banker "forced" that guy into "No Deals" with "unacceptable" offers. not only was the $261K well below the mean, but the following offer of $144K was well below the mean again. he was forced to take it to the river.

this isn't even an extreme example: he had a good board and was guaranteed to go home with money. the point is that basing your decision on the mean, in order to remove luck from a game that is based on luck, simply isn't the best strategy.

compare your last post with your previous posts. it's nice to see you are coming around. your modified strategy, outlined in that last post, is closer to a usable strategy because it incorporates the odds. i'm not saying the mean is meaningless: it's helpful in predicting offers, but as we've seen, it is by no means determinant, at least not until the final round, where the offer is pretty reliably close to the mean. and the odds are helpful too, in telling you when not to push it too far.

i'll make 2 analogies. first, playing russian roulette: 1 bullet, 6 chambers. someone offers you 500,000 in the first round and will double your money each time there is just a click. ok… that's a little extreme.

let's use the football analogy. you like those. peyton manning is arguably the best quarterback in the NFL, yet he isn't making the most money. let's say the texans are willing to offer him 20% more, because that is what they believe he is worth; thus, the market declares that is what he is worth (endorsements etc. all being equal). should he take it? they have about the worst offensive line in football, so his career would probably be shorter. you have to take a comprehensive approach: risk also has a qualitative aspect. a lot of players accept less money than they are worth and end up having better careers (health/financial/personal) than those who don't.

for all those people who accepted amounts below the mean, or who stopped when the odds were in their favor to continue, i guess the downside of coming up unlucky outweighed the upside of the marginally increased offer. marginal, meaning they'd be risking the loss of a life-changing amount of money (major) in exchange for a chance at a "more" life-changing amount of money (not as major). sorry, but i think they are pretty smart.

there is a funny little saying that goes something like this: in theory, theory and practice are the same. in practice, they are not.


Mike Anderson November 17, 2006

Meal or No Meal – Spoof

A cheerful host and encouraging peanut gallery are there to keep the game lively with humor and the most melodious accents in the world.

Let’s play!


EssBee November 19, 2006

Only just found this thread, so I'll comment from a UK perspective, where the numbers in the boxes are different (and smaller) than in the US.

As far as I can see, what has not been mentioned here is that the "Banker" is a person put in place by the programme makers. I have not recorded the shows in detail to test my hypothesis, but I am pretty confident that the Banker, at least in the UK, does not follow a clearly defined and fair algorithm. One reason for this opinion is that I have seen shows where the offer has varied wildly from one round to the next, going from values near the mean to tiny values without good statistical cause. I watch the show with a running mean of the box values in my head, and early in the show the Banker always offers a lot less than the mean.

So, I think we need to forget the idea that it is a 'fair' show in the same sense that a roulette wheel should be fair. Nonetheless, it seems to me that as soon as the offer exceeds the expected value, it should be taken, not least because, in the UK version, most people could happily walk away from deals much below the game's opening expected value of around £15,000 (if memory serves). I think, then, that anyone being offered £20,000 should take it on statistical grounds, and anyone offered much less than £15,000 might as well play on, even if the odds are declining.

I think this shows that the absolute value of the boxes has an impact. Most people in the US would not want to mess up winning something close to the mean of $131,477.54, whereas in the UK you need to win closer to the upper end of the possible values for it to be a "life-changing" sum, in the words of the programme makers.


Shawn November 24, 2006

I think Josh and Chris together come very close to the ideal statistical strategy. Let me see if I can pull it together in a way that satisfies everyone.

First, let’s define the player’s goal. I’m going to assume that the player’s goal is to maximize his likely earnings. Not to maximize his *possible* earnings, mind you, but his *likely* earnings, using “likely” in the statistical sense. Assuming the utility of money is a linear function of its amount, and assuming the player wishes to maximize utility, that’s the most rational strategy possible.

At each decision point in the game, the player has to choose “deal” or “no deal”, based on the banker’s current offer and the money remaining on the board.

Chris says that if the offer is higher than the mean of the money on the board, the player should take the offer. Chris is right (assuming a rational banker), but his rationale for why this is right is wrong.

Josh is right that the player’s decision isn’t related so much to whether he’s ahead of the mean or not, but whether his likely income will be increased by continuing to play.

What the player really needs to consider, as Josh says, is whether the next offer will likely increase or decrease. But the question is more subtle than just going up or down. If there's a 60% chance of the offer going up by a small amount, and a 40% chance of the offer going down by a large amount, it's a bad bet even if the balance of odds favors going on.

What the player needs to consider, then, is the statistical expected value of the next offer. The exact offer is unknown, but it’s some perturbation from the next mean, and we can easily calculate the expected value of the next mean, by considering each possible choice, calculating that mean, and then calculating a weighted average of all of those future potential means.

If we have some knowledge of how the banker adjusts the offers from the means, that should be factored in as well. Assuming that we don’t, though, the player is best served by comparing the current offer to the expected value of the next mean. If the expectation is that the next mean will be higher than the current offer, the player should continue playing.

That formulation is, I believe, one that both satisfies Josh’s point that the crucial issue is the decision as to whether playing on increases the expected mean or not, and satisfies Chris’s desire for a purely rational, mathematical strategy.

Here’s the interesting bit:

This is conceptually different from Chris’s strategy, but numerically identical. Why? Because the current mean equals the next expected mean. It’s easy to show algebraically, or you can just try some numbers in a spreadsheet.
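Shawn's claim can be checked in a few lines of Python as well as in a spreadsheet (a sketch using one hypothetical mid-game board):

```python
from statistics import mean

def expected_next_mean(board):
    """Average, over every equally likely removal, of the resulting mean."""
    next_means = [mean(board[:i] + board[i + 1:]) for i in range(len(board))]
    return mean(next_means)

board = [100, 400, 1_000, 50_000, 300_000]
print(mean(board), expected_next_mean(board))  # both 70300
```

Algebraically: each amount is removed with probability 1/n, so the expected remaining total is (n-1)/n of the original sum, spread over n-1 cases, which is the original mean again.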

So, if the banker generally sticks close to the mean then the player’s best strategy is to take any offer that exceeds the current mean, because that offer also exceeds the expected value of the next mean (and therefore the next offer).

Given that the banker is known to diverge from the mean, the game strategy really boils down to one of outguessing the banker. He’s offering a premium on this deal… will he offer a significantly larger premium on the next, or is this as good as it’s going to get? Then there’s also the human element — the fact that utility is not a linear function of wealth, and it’s a function that varies from person to person even among the most purely rational of people.

Those two elements are what make the game interesting.


Steve November 25, 2006

Very interesting debate. Special thanks to Chris, Josh, Blake and Shawn for taking the time to distill these questions into concrete thoughts to satisfy the curiosity of a bored web surfer.

I would like to add an additional question that may help us to explain the banker’s divergence from the mean, especially in later rounds of the game.

I’ve yet to see any claim that the banker’s offers are made independently from the knowledge of what is in the “pocket” case. That is, can we assume that the bank is blind, or is it possible that the bank knows from the first moment what the player has selected as his or her case?

Assuming the banker has such knowledge, is there empirical evidence from our "trials" (i.e. broadcast programs) to suggest that the banker's variance from the mean (particularly those moments when his offer exceeds the mean) correlates with the value of the pocket case? Knowing that the pocket case cannot be eliminated adds an additional element to the game, and knowing its value gives the banker an additional edge.

For instance, in the case of three remaining cases: $1, $5, $1M, if the “pocket” case is worth $1M, does the banker statistically offer more than the mean to guide the player toward a deal? Likewise if $1M remains on the board, are offers statistically lower in an effort to encourage the player to continue and hopefully eliminate the most valuable case?

A more specific example, in the case of the original post ($100, $400, $1000, $50,000, and $300,000), the banker’s offer was $80K. Player said no deal, and went on to eliminate $300K in the next pick. If, however, her pocket case had contained $300K (thus making it “unplayable” until the final round), can we find evidence to suggest that her offer would have exceeded $80K?

Unfortunately I don’t have the means (Tivo) to examine this question myself, but I am nonetheless very curious whether a player can derive additional information about the board from the banker’s deviation from the mean.

Of course, even if he knew, that SOB could always bluff…


robert November 26, 2006

there is a game called MEAL OR NO MEAL. to play the game, log on to google and type "meal or no meal".

you can win up to a strawberry cake


Tom Bradea November 27, 2006



Karen November 28, 2006

On a slightly different note –

Everybody keeps assuming (on the show and everywhere else) that each time a new case is opened, the probabilities reset themselves. What I mean is that we start out with 26 cases, but let's say that we are down to 5, with only one large amount left; then Howie tells you that you have a 20% chance at that amount.

I am not so sure this is true, particularly considering the well-known Monty Hall Dilemma; this case seems very similar.

To illustrate what I mean, I need to simplify the situation a bit. Let's say we had one case with a million and the rest were lower sums. When we first pick a case, our probability of getting the million is 1/26. According to Monty Hall, this probability would remain throughout the game, even as we open up more cases. And let's say we opened 24 cases and they were all lower amounts (extremely unlikely, but just for the sake of argument). Then, would it be more likely that the million is in your case or in the other case? According to Monty Hall, the probability is far from 50-50. Your case maintains a 1/26 probability, while the other case has now assumed the probabilities of all the other cases that were opened, meaning the other case has a 25/26 probability of having the million in it. Meaning that if you had the choice to switch, you'd better switch!

Now, in the real game, of course, we have more than one large sum. So depending on what you consider to be large, the probabilities will adjust. For example, let's say that we define a large sum to be anything bigger than the mean of $131,000 or whatever. So we would consider a large sum to be $200,000 and up: 6 cases total. When we first start, we have a 6/26 probability of having one of the large sums in our case. It would make sense to me that this probability remains throughout the game, until we lose all or some of the large sums. And if we lost some of the large sums, would the other ones assume the probability of the sums that were lost? This is where I get lost in this argument.

Now that I have confused myself and everyone else, I would be happy to hear your thoughts. How is this game different, mathematically, from Monty Hall? Is this just a more complicated version of the same game, or is there some factor I am overlooking that changes the picture completely?


Shawn November 28, 2006


You're missing two things:

First, this is nothing like Monty Hall. What makes the Monty Hall problem interesting is that the player is given additional information by the host, because the door the host opens is not chosen randomly. The host knows where the goat and the car are, and so his choice of door gives the player an additional piece of knowledge.

Second, the explanation you've read as to why the Monty Hall player is better off switching is wrong. Well, perhaps not wrong, but oversimplified to the point of inaccuracy. Probabilities aren't things that can be picked up and moved from place to place; it just happens that, in the context of the Monty Hall problem, they behave as if they were.

Probabilities are mathematical functions that estimate outcomes of repeated trials with random variables, and there are mathematical rules that govern how they can be manipulated and composed. There are also methods of estimating revised probabilities based on additional information. In the case of Monty Hall, it works out that the revision after Monty opens the door looks like just “moving” the probability from the opened door to the unselected door. But the actual mathematics is more subtle and the approach doesn’t apply in all cases.

A decent explanation of how to apply Bayes’ Theorem correctly to the Monty Hall problem is at:

If you want a simpler explanation of Monty Hall that doesn’t require invoking the complexities of Bayes’ Theorem, try this one:

What is the probability that you chose the wrong door? 2/3. So you know from the outset that you probably chose badly, and the odds are that the car is behind one of the doors you didn't pick. But you don't know which one. If you could open both of them, you'd have a 2/3 chance of finding the car. If Monty opens one and you then switch to open the other, you have opened both of them and realized your 2/3 chance. If you keep your original choice, well, it always had a 1/3 chance of being correct, and that hasn't changed.

To formulate that a little more mathematically, assume that after you pick your door you mentally divide the doors into two sets. Set A contains the door you picked, set B contains the other two doors. It’s clear that there’s a 2/3 chance that the car is in set B — P(B) = 2/3. When Monty opens a door in set B, he doesn’t change the probability that the set of doors contains the car. It is still 2/3. What has changed is that you now only have one choice in that set.

The key point, though, is that Monty gives you information because of which door he picks. If he were picking randomly then he wouldn’t be giving you any information about the door he didn’t pick. All you would be able to conclude from his choice is that the car isn’t behind the door he chose, leaving you with two doors, each with probability 1/2.

To apply this to DOND, we’d have to alter the game and have Howie pick some cases based on his knowledge. Suppose that:

(1) The cases are shuffled randomly, so there’s no bias in which one you get.
(2) Howie knows only where the $1M case is.
(3) Howie has 24 cases opened, avoiding the $1M
(4) Howie then lets you choose whether to take a deal (roughly the mean of $1M and whatever other case is still in play) or open the remaining case.

In this case, the odds that you were given the $1M case are 1/26. The probability that the $1M case is on the stand is 25/26, and after Howie has helpfully eliminated all but one case from the set on the stand, that’s still true.

In that scenario, you should absolutely take the deal, because the odds are very high (96%) that by opening the remaining case you’re going to reduce your winnings.

Keep in mind, though, that this only applies if Howie is doing the choosing, based on his knowledge of where the $1M case is. If you’re just picking randomly, then there’s no reason to distinguish between the set of cases on the stand and the set containing your case.

To summarize, the Monty Hall “effect” only comes into play when decisions are made based on hidden knowledge. Those decisions based on knowledge give additional information to the player, beyond just what would be revealed by the same selections made at random.
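The contrast between an informed host and blind opening can be simulated directly. A Python sketch (my own construction: 26 cases, a single $1,000,000, played down to two cases):

```python
import random

M = 1_000_000

def trial(informed, rng):
    """Play to two cases; return True if the non-held case has the $1M.

    Blind (DOND-style) trials where the $1M gets opened return None,
    since the comparison only makes sense while the $1M is in play.
    """
    cases = [M] + [0] * 25
    rng.shuffle(cases)
    held, rest = cases[0], cases[1:]
    while len(rest) > 1:
        if informed:  # host knowingly avoids the $1M (Monty Hall setup)
            openable = [i for i, c in enumerate(rest) if c != M]
        else:         # contestant opens blindly (Deal or No Deal)
            openable = list(range(len(rest)))
        rest.pop(rng.choice(openable))
        if held != M and M not in rest:
            return None
    return rest[0] == M

rng = random.Random(0)

def rate(informed, n=20_000):
    results = [t for t in (trial(informed, rng) for _ in range(n))
               if t is not None]
    return sum(results) / len(results)

informed_rate = rate(True)   # ~25/26: the Monty Hall effect appears
random_rate = rate(False)    # ~1/2: no effect with blind opening
print(informed_rate, random_rate)
```

With an informed host, the surviving case holds the $1M about 96% of the time; with blind opening, conditioned on the $1M surviving, it is a coin flip between the two cases.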


Shawn November 28, 2006

One more comment about the scenario where Howie picks the cases. Assuming the banker is smart, he also knows that there's a 25/26 chance that the case on the stand contains the $1M. Therefore, it's safe to assume that he'll modify his offer accordingly. If the other case in play happens to be $0.01, then the expected value of the offer is $38,461.55.

Statistically, even in this scenario, if he were to offer you less than $38K, you should take the chance and open the remaining case. In practice, you have to decide whether you’d rather have a 4% chance of getting a million, or a 100% chance of getting the offered amount.


Karen November 28, 2006

Thanks Shawn, that makes a lot of sense.


blake December 1, 2006

latest example of how playing the mean/odds doesn't pay: the guy had $500k, $100 and $10, and the offer was $142k, below the mean of $166k, with the odds still favorable. the sister pleads deal; the brother, no deal… what did he do? no deal… walks with $10. i hope we all learned something.


Shawn December 1, 2006

Playing the mean does pay in the long run — meaning over the course of multiple games. Given that you only get to play once, though, it may not be worth the risk.

It’s also worth noting, though, that waiting for an offer that exceeds the mean is a very risky strategy, because the banker so rarely makes such offers. Usually, waiting for such an offer will mean playing to the end, or very nearly the end, of the game. The end of the game is where the huge payoffs are, but it’s also where the tiny payoffs are. If all of that money actually means something to you, you should deal earlier.


Chris P. December 1, 2006


Based on my risk/reward argument above, I would have taken the deal when $142K was offered.

The guy was essentially risking $142K in hopes of pulling his mean up to $250K, or to put it another way, he was risking $142K to earn $108K.

That late in the game — with that much on the line — it would be ridiculous not to take that deal.

You try to beat the mean unless you end up in a situation where you’re “upside-down,” risking more than you stand to gain.

Especially if it means you could walk away with nothing, just like the poor sap in your example.


Mikey Benny December 3, 2006

Here's my only problem with your analysis: the woman had a 4/5 chance of increasing her winnings. Her decision was not *that* bad; in fact, I probably would have gone one more time. You don't play for your briefcase; you try to maximize the bank's offer, and you completely ignore that. If she picks ANYTHING other than the $300,000 briefcase, the banker's offer goes up substantially.


MistaC December 4, 2006

I like the post for the most part. Expected value is one of the main issues in this game. One often overlooked aspect is utility. When risk comes into play, there are three types of people: risk-loving, risk-neutral, and risk-averse. For risk-neutral people, expected value is the only thing one needs to look at. Most people are risk-averse. Risk-averse people are not willing to make a fair bet: they would rather have a guaranteed amount than take a bet with the same expected value. Risk-loving people are willing to make an unfair bet (as in almost anyone who plays in casinos or lotteries). My point is this: when the banker's offers start to get high, even if they're less than the EV, it is often a fully rational decision for someone to take the deal. Let's say there are two cases left, one with a million and one with $1, with a banker offer of $350K. EV = $500,000.50. If the person's utility function is something like U(x) = sqrt(x), then U(350,000) = 591.61, while the utility from not dealing is [U(1) + U(1,000,000)]/2 = 500.5.

In this case, based on this person's risk aversion, s/he is better off dealing. Keep in mind that these utility numbers are not dollars; they're just general units for comparison. The reason I chose sqrt(x) as the function is that utility functions for risk-averse persons are concave, while those for risk-loving persons are convex, such as x^2. Risk-neutral utility functions are straight lines, which is why risk-neutral people only worry about expected value.
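That arithmetic is easy to work through in Python (sqrt is just one illustrative concave utility; the exact curve varies from person to person):

```python
import math

def utility(x):
    return math.sqrt(x)  # a concave (risk-averse) utility function

offer = 350_000
gamble = [1, 1_000_000]   # two cases left, equally likely

ev_gamble = sum(gamble) / 2                       # 500000.5
u_deal = utility(offer)                           # ~591.61
u_no_deal = sum(utility(c) for c in gamble) / 2   # 500.5

print(ev_gamble, u_deal, u_no_deal)
```

Even though the gamble's expected value exceeds the offer by about $150,000, this risk-averse player gets more utility from the sure $350K than from the 50/50 shot at the million.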


Matou December 9, 2006

I played right to the end. I had $1,000,000 and $75,000 left, and my offer was $392,000. I was a little disappointed that it doesn't let you choose between the remaining case and your own (it makes you open your own), but I had the million, so it didn't matter LOL


Matou December 9, 2006

oh, and I don't think it's safe to assume the banker uses the mean to calculate the offer, or that aiming for higher than the mean is the reason to take the deal.


cpmgrp December 12, 2006

Tonight’s (12/12) mailman was presented with a late-round costume change as the models reappeared in USPS uniforms. As they panned over the models’ faces, only one appeared to say anything while the audience was welcoming them. It appeared the woman holding case 21 was finishing the word “Hi.” Her mouth was wide open (to my amateur lip-reading, it sure looked like it). Turned out she was chosen next and had the only big number left. Could she have instead been saying “High?” Coincidence? I don’t think so. Was there a subtle attempt to warn him? I have SLO-MOed the bit and it holds up. Those who TiVo’d, check it out.


Grant J. December 26, 2006

Shawn, I don’t think your conclusion about the Monty Hall problem not applying is correct.

It’s true that the Monty Hall problem is based on a change in information, but whether that information was previously known has nothing to do with it. For instance, in the classic problem where Monty Hall picks the door to reveal nothing, it doesn’t matter whether or not Monty knew it. The only thing that matters is the fact that it has been revealed to have nothing. How we got to that state is of no consequence. The probability of that door having the prize still goes to zero and the other door’s chances still increase.

In DOND, the contestant is the one revealing the cases (i.e. doors), but it is still the Monty Hall problem. Information is still revealed from the cases. We still find out whether or not the case had a high value in it.

The second major aspect in the Monty Hall problem is that the initial decision is made in a scenario with certain probability of success, and that value does not change. This is still the case in DOND. In short, the case that he/she singles out at the beginning of the game is independent of the rest of the remaining cases.

What does this mean for us?
It means that when you pick a case at the beginning of the show, it has a very specific expected value, and that expected value *never* changes throughout the show. It is the sum of each case value multiplied by the probability of holding that case (and because every case has the same probability of being chosen, the expected value is simply the total value of the cases divided by the number of cases).

Expected Value = $3,418,416.01 / 26
EV = $131,477.53

That’s the expected value of the case you’re holding at the start of the game. And it’s still the expected value of the case you’re holding at the end of the game.
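Grant’s figure checks out against the standard 26-value U.S. board. A quick sketch in Python (note that the quotient actually rounds to $131,477.54; the comment’s $131,477.53 truncates the trailing 0.538…):

```python
# The 26 case values on the standard U.S. Deal or No Deal board.
BOARD = [0.01, 1, 5, 10, 25, 50, 75, 100, 200, 300, 400, 500, 750, 1000,
         5000, 10000, 25000, 50000, 75000, 100000, 200000, 300000,
         400000, 500000, 750000, 1000000]

total = sum(BOARD)
expected_value = total / len(BOARD)

print(f"Total on the board: ${total:,.2f}")           # $3,418,416.01
print(f"Expected value:     ${expected_value:,.2f}")  # $131,477.54
```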


Grant J. December 26, 2006

Forgot to hit the notify checkbox.
Let me know if anyone has a more complete explanation of the probabilities involved. I was watching it tonight and I’m still trying to figure out exactly how they determine the banker offer. :-P


Adam December 26, 2006

Didn’t read all these comments so somebody might have addressed this, but I disagree. Your statistical analysis is correct, but in gambles you have to consider each individual person’s utility function and whether they are risk averse, neutral, or seeking.


Robo_Mojo December 28, 2006

First of all, Josh, you make an assumption about the banker’s strategy

“If your mean is likely to go up next round, then your offer is likely to go up next round.”

I think what you (Josh) are missing is that Chris’s strategy does not make ANY assumptions (nor even speculates) about what the bank offer is “likely” to be. This is a major difference between Chris’s strategy and yours.

What Chris is saying is you should reject any offer below your mean no matter what. But this requires that you completely discard any expectation of what the banker’s offer will be in each round. This is a purely statistical approach that doesn’t assume anything about the banker’s strategy at all. But this doesn’t mean that it is an “optimal” strategy, which I will explain later.

David is more on the right track when he says

“In that case, one would need to know the distribution of (offer – mean) to devise an optimal strategy.”

If, on the other hand, the banker were completely predictable, you could calculate (predict) the mean offer in the next round. You could make accurate calculations and throw out “Beat the mean” and every other strategy altogether.

But since we don’t know the complete strategy that the banker uses (as someone mentioned earlier, it is somewhat based on “luck,” or maybe it even has some random elements; I don’t know, and I don’t think any of us do either, yet), we must take a different approach. He is NOT completely predictable, so we can’t make accurate calculations, and so we must apply some strategy.

“Beat the mean” is probably as good as any strategy if the banker’s strategy is completely unknown. We do know something (statistically from past observation) about the general strategy the banker uses (start at substantially smaller than the mean and work up to 100% and sometimes exceed it). “Beat the mean” is probably safe enough to use in DOND, if you’re a betting man. For what it’s worth, I would use “Beat the mean”, given what we know about the banker’s past games.


Read up on GAME THEORY to get a better understanding of what I mean by that. You cannot arrive at a single most optimal strategy by purely statistical means in DOND, as the best competing strategy will depend on who it is competing against: the banker, who has an entirely separate strategy (whether or not it is a known strategy).

For example, consider a banker that always starts at $50,000 and increases by $50,000 in each subsequent offer. His strategy then is completely independent of your mean, and you may as well wish to pass an offer even if it is greater than your mean, because for all you know the offer will continue to go up regardless of what case you open. If this were the banker’s strategy, you could devise a better strategy than “Beat the mean” with a greater average case.

If that were the banker’s strategy, then a better strategy for you than “Beat the mean” would likely be a “Pass offers #1-#8 and take offer #9” strategy. Would it not? There may even be one better than that.

Obviously that is just a fantasy, though. The banker’s strategy doesn’t actually appear to work that way, of course. I merely bring it up as a possible banker-strategy scenario where BTM is not optimal. And given that we don’t fully know the banker’s strategy, I can’t conclude that BTM is always the best strategy to use.

Next, Josh also says

“If you had $1, $5, $10, and $1,000,000 left on the board, you shouldn’t take an offer of $400,000, because you have a 75% chance of eliminating one of the small amounts and having a much higher offer next time.”

Here you are assuming something about the banker’s strategy (the part about “having a much higher offer next time”). Let’s consider that.

If you used the “Beat the mean” strategy, you would take the $400,000 offer because it is greater than the mean (about 60% greater, which is substantial) and that is all we consider.

Now take your scenario again, $1, $5, $10, and $1,000,000, except this time the offer is $900,000 (absurdly high, but then again I’m not assuming anything about the banker’s strategy). Here the order of the values is the same, and you still have your 3/4 chance of picking something smaller than the offer, which might (using your assumption) make the offer go up higher. Will you still take the chance? Or would you take the $900,000?

In your (Josh’s) strategy, what’s the difference whether the offer is $400,000, $900,000 or $1,000? They all fall in the same order of the possibilities on the board (between the 3rd and 4th values in numerical order) which is what seems to be important to you. This is all assuming you know something about the banker which may be untrue (that is, the offer goes up if you pick something smaller).

In your strategy, you are associating an arbitrary “positive” risk with 3/4 odds and an EQUAL “negative” risk with 1/4 odds and concluding a net positive result. However, the risk associated with opening each case varies. By risk I mean only the change in your next offer. The mean expected value of your case after opening more cases (the “mean mean” if you will, after opening the next cases) remains the same as your current mean.

Take this for example: You have the values $300, $500 and $1,000. Your mean is $600. Here are your outcomes:

1) Take away the $300 (leaving $500 and $1,000) and your mean then will be $750.
2) Take away the $500 instead (leaving you with $300 and $1,000) then your mean is $650.
3) Take away the $1,000 (leaving $300 and $500) and your mean then is $400.

Your “mean mean” then, or the mean of the outcomes is the mean of $750, $650 and $400, which is $600, same as our current mean of $600. See what I mean? (sorry, I couldn’t resist)
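Robo_Mojo’s “mean mean” example can be verified in a few lines of Python (a sketch using the same three values):

```python
def mean(values):
    return sum(values) / len(values)

board = [300, 500, 1000]
print(mean(board))  # 600.0 -- the current mean

# The mean of the board after each possible single removal.
next_means = [mean([v for v in board if v != removed]) for removed in board]
print(next_means)        # [750.0, 650.0, 400.0]

# The "mean mean": the average of the possible next-round means.
print(mean(next_means))  # 600.0 -- identical to the current mean
```

The same holds for any board: removing a uniformly random case leaves the expected value of the mean unchanged.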

So there is no statistical risk in changing our expected value, that risk doesn’t exist (it is zero). The risk is only in how the banker’s strategy interprets your board on the next round.

In our game, we can only guess what the risk is, since the banker’s strategy isn’t completely predictable. This is the whole point of the problem.

This post was rather long, but I hope that it makes some sense and sheds some insight. Too bad most of us would never actually make it onto the show, but at least we can screw around with the numbers anyway and have fun.

Besides, we don’t even know the exact strategy that the banker uses, and that affects how we choose our strategy. Why then would we use a strategy that is predictable all the time? Is it a fact that being predictable has no effect on your expected return? Or does it? That may be another problem, but I won’t go any further with it.


Jason December 28, 2006

package test;

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.Random;

public class DealOrNoDeal {

    // The 26 case values on the board.
    static List amounts = new ArrayList() {{
        add(new Double(.01));
        add(new Double(1));
        add(new Double(5));
        add(new Double(10));
        add(new Double(25));
        add(new Double(50));
        add(new Double(75));
        add(new Double(100));
        add(new Double(200));
        add(new Double(300));
        add(new Double(400));
        add(new Double(500));
        add(new Double(750));
        add(new Double(1000));
        add(new Double(5000));
        add(new Double(10000));
        add(new Double(25000));
        add(new Double(50000));
        add(new Double(75000));
        add(new Double(100000));
        add(new Double(200000));
        add(new Double(300000));
        add(new Double(400000));
        add(new Double(500000));
        add(new Double(750000));
        add(new Double(1000000));
    }};

    public static void main(String[] args) throws Exception {
        if (args == null || args.length == 0) {
            simulation();
        } else {
            interactive();
        }
        System.out.println("\nGAME OVER!");
    }

    // Opens cases at random until the board is empty, printing the mean each round.
    static void simulation() {
        while (!amounts.isEmpty()) {
            System.out.println("Expected value: $" + computeMean());
            double randomNumber = Math.random() * 1000000d;
            long randomSeed = Math.round(randomNumber);
            int index = new Random(randomSeed).nextInt(amounts.size());
            System.out.println("\nRemoving: $" + amounts.get(index));
            amounts.remove(index);
        }
    }

    // Lets the user choose which value to remove each round.
    static void interactive() throws Exception {
        BufferedReader br = new BufferedReader(new InputStreamReader(System.in));
        while (!amounts.isEmpty()) {
            System.out.println("Expected value: $" + computeMean());
            System.out.print("\nPlease enter value to remove: $");

            Double in = null;
            try {
                in = new Double(br.readLine());
            } catch (NumberFormatException nfe) {
                System.out.println("Input must be a number.");
                continue;
            }
            if (!amounts.contains(in)) {
                System.out.println("Input value not found.");
                continue;
            }
            amounts.remove(in);
        }
    }

    static double computeMean() {
        double mean = 0;
        for (Iterator it = amounts.iterator(); it.hasNext();) {
            mean += ((Double) * (1.0 / amounts.size());
        }
        return mean;
    }
}

Kevin January 7, 2007

Thank you Karen for bringing up the Monty Hall problem and Grant for clarifying. YES, the Monty Hall problem comes into play! Whether Monty knows where the prize is or not does not matter (unless he opens the door that has the prize, in which case it is a moot point). Let’s say the million is the only one that counts; when the contestant picks, they have a 1/26 chance of having the million. This does not change throughout the game! There is a 25/26 chance the million is up with the ladies. This does not change throughout the game either, unless the million is disclosed. If at the end of the game there were only $10 and the million left, it is still a 25/26 chance that the million is with the lady. If you were asked to switch, you should switch!


Josh January 8, 2007

The Monty Problem, as Shawn said, does not apply. I could explain why, but I might as well apply it over to DoND so I don’t have to go through it again. When you pick your first case, you have a 1/26 chance of choosing the million. Monty is contingent on the fact that you know that the big prize will still be available at the end of the game. Here, though, it’s not true. Since we assume cases are opened randomly, there is a 24/26 chance that the case with the million will be opened before the last round, and a 2/26 chance it will not be opened (because one case is remaining in the game and one case is with the contestant). Since we know our chance of having it is 1/26 and the chance of it surviving to the last round is 2/26, we can conclude that, if we have reached the last round and the million dollar case has not been opened, we have a 1/2 chance of having it. Why is this different than Monty? Well, the 24/26 chance that the big prize will be opened before the end doesn’t exist in Monty. In that one, we know that the host will open a case that does not contain the big prize, so there is a 0% chance of it being eliminated prematurely. That would leave us with a 26/26 chance that the case will be present at the end, so we only have a 1/26 chance of having it.

Let me try to put it more clearly. When you look at statistics that predict the chances of a series of actions, you multiply their probabilities. For example, if I asked you the chance of rolling a 3 with a die, your answer would be 1/6. If I asked the chance of rolling a 3, picking it up, and rolling another 3, the probability drops to 1/36 (1/6 times 1/6). Similarly, if I asked what the chance was of rolling a 3, picking it up, and rolling anything except a 3, the probability would be 5/36 (1/6 times 5/6). Now, let’s look at a simplified case of DoND that is close to Monty. Let’s say we only have 3 cases (I’ll call them boxes to avoid confusion). There are a couple different situations if you don’t assume anything about how the host chooses. Let’s just say that the boxes hold $1, $5, and $mil.

1: You pick $1 and host opens $1 (1/3 * 1/3 = 1/9)
2: You pick $1 and host opens $5 (1/3 * 1/3 = 1/9)
3: You pick $1 and host opens $mil (1/3 * 1/3 = 1/9)
4: You pick $5 and host opens $1 (1/3 * 1/3 = 1/9)
5: You pick $5 and host opens $5 (1/3 * 1/3 = 1/9)
6: You pick $5 and host opens $mil (1/3 * 1/3 = 1/9)
7: You pick $mil and host opens $1 (1/3 * 1/3 = 1/9)
8: You pick $mil and host opens $5 (1/3 * 1/3 = 1/9)
9: You pick $mil and host opens $mil (1/3 * 1/3 = 1/9)

As you can see, we have 9 cases, each with a 1/9 chance of occurring. This, of course, adds up to 9/9 to represent all of our options. Now, we make some reductions because we know that not all of these are possible. Since each value occurs only once, we know that the host can’t open a value you have already chosen. This changes things this way:

1: You pick $1 and host opens $1 (1/3 * 0 = 0)
2: You pick $1 and host opens $5 (1/3 * 1/2 = 1/6)
3: You pick $1 and host opens $mil (1/3 * 1/2 = 1/6)
4: You pick $5 and host opens $1 (1/3 * 1/2 = 1/6)
5: You pick $5 and host opens $5 (1/3 * 0 = 0)
6: You pick $5 and host opens $mil (1/3 * 1/2 = 1/6)
7: You pick $mil and host opens $1 (1/3 * 1/2 = 1/6)
8: You pick $mil and host opens $5 (1/3 * 1/2 = 1/6)
9: You pick $mil and host opens $mil (1/3 * 0 = 0)

Again, we have a total of 6/6, which represents all of our options. Wondering where the 1/2 came from? After you open one case, there are just two remaining to choose from. If we assume cases are chosen at random, then each remaining case has a 1/2 of being chosen. Now, what could happen from here? There is a 2/6 (1/3) chance that the $mil case is going to be opened before we ever get to the final round, shown here:

3: You pick $1 and host opens $mil (1/3 * 1/2 = 1/6)
6: You pick $5 and host opens $mil (1/3 * 1/2 = 1/6)

The other option is that you make it to the final round with $mil still floating around somewhere. These are shown here:

2: You pick $1 and host opens $5 (1/3 * 1/2 = 1/6)
4: You pick $5 and host opens $1 (1/3 * 1/2 = 1/6)
7: You pick $mil and host opens $1 (1/3 * 1/2 = 1/6)
8: You pick $mil and host opens $5 (1/3 * 1/2 = 1/6)

As you can see, there is a 2/6 (1/3) chance of you making it to the final round and having the $mil and a 2/6 (1/3) chance of you making it to the final round and not having the $mil. As 1/3 equals 1/3, your chances are no better regardless of whether you switch or not.

So why is this different than Monty’s scenario? In that one, they eliminate the possibility of the host opening the big prize. He specifically picks one that is NOT the $mil. Here’s how that would change us (again, he also can’t pick what you have already chosen):

1: You pick $1 and host opens $1 (1/3 * 0 = 0)
2: You pick $1 and host opens $5 (1/3 * 1/1 = 1/3)
3: You pick $1 and host opens $mil (1/3 * 0 = 0)
4: You pick $5 and host opens $1 (1/3 * 1/1 = 1/3)
5: You pick $5 and host opens $5 (1/3 * 0 = 0)
6: You pick $5 and host opens $mil (1/3 * 0 = 0)
7: You pick $mil and host opens $1 (1/3 * 1/2 = 1/6)
8: You pick $mil and host opens $5 (1/3 * 1/2 = 1/6)
9: You pick $mil and host opens $mil (1/3 * 0 = 0)

Here, there is a 6/6 chance that the $mil will make it to the end, because the Host won’t pick it. Now, here are the final options:

2: You pick $1 and host opens $5 (1/3 * 1/1 = 1/3)
4: You pick $5 and host opens $1 (1/3 * 1/1 = 1/3)
7: You pick $mil and host opens $1 (1/3 * 1/2 = 1/6)
8: You pick $mil and host opens $5 (1/3 * 1/2 = 1/6)

As you can see, there’s a 2/3 chance that you picked a low value and a 1/3 chance you picked the $mil. Since there is only one hidden value (the box left unopened), you should take that one. Because of the way boxes were eliminated, the one you DIDN’T originally choose has taken on a 2/3 chance of being the big money. As you can see, the big difference is whether or not the boxes are chosen to avoid the $mil. If it’s random, the probabilities update as you play. If they eliminate them in a predictable way that does not allow the early removal of the big value, then the Monty system works. As we all have seen, though, there are many times that the $mil is taken off the board before the end of the game.

In addition, let’s consider what it would mean strategically if Monty applied here. If you assumed that the $mil wouldn’t be revealed until the results of the final round (an assumption we know to be false), then the only reasonable strategy would be to “No Deal” every round and then deal on the last round. This is because you would know that keeping the $mil on the board would keep offers increasing, while you would also know you have only a 1/26 chance of holding the $mil. Luckily for everyone who gets to the last round without losing that $mil, this isn’t the case at all. In fact, as I just showed, there’s an equal chance of having it.
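Josh’s 50/50 conclusion can also be checked by brute force. This is a quick Monte Carlo sketch (not from the original post); the only assumptions are a 26-case board and uniformly random eliminations:

```python
import random

def trial(rng, n_cases=26):
    """One game: cases are opened uniformly at random until two remain.

    Returns True/False for whether the player holds the $1M, or None
    if the $1M was opened (revealed) before the final round."""
    million = rng.randrange(n_cases)  # index of the $1M case
    player = 0                        # the player's own case, fixed w.l.o.g.
    others = list(range(1, n_cases))
    rng.shuffle(others)               # random opening order
    if million in others[:-1]:        # the first n-2 of the others get opened
        return None                   # $1M revealed early; discard this game
    return million == player

rng = random.Random(42)
results = (trial(rng) for _ in range(200_000))
survived = [r for r in results if r is not None]

p = sum(survived) / len(survived)
print(f"$1M survived to the final two in {len(survived)} games")
print(f"P(player holds $1M | it survived) = {p:.3f}")  # ~0.5, not 1/26
```

Roughly 2/26 of the games reach the final round with the million alive, and in about half of those the player is holding it, matching Josh’s argument and contradicting the 25/26 switching claim.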


Jake January 8, 2007

Fantastic analysis. It would really benefit contestants to actually learn a little about statistics before appearing on the show. Sometimes there’s something to be said for going with something other than an automatic gut feeling. Anyway, I was originally here searching for the Deal or No Deal flash game, but I’ve found it elsewhere now anyway :). If you’d like some light relief from the stats, check out this flash version of the game:


Kevin January 8, 2007

Thanks Josh. You’re right. I didn’t consider the odds of the million being disclosed. If there are N cases and you did not pick the case with the million dollars, then the odds of ending up with two cases (your case and one with the ladies) with the million still in play is (N-1)!/N!, or 1/N. For 26 cases, the odds of having the million still in play when there are two cases left (if you did not pick the million) is 1/26, and of course the odds of you picking the million originally is also 1/26, so it would be equal odds that the million is with you or the lady. It also shows that the odds are high that the million will be disclosed.


Luke January 10, 2007

So my initial thought from watching this show was that it is based solely on the mean, or expected value, at any point in time (which is somewhat validated by the fact that the offers move toward the mean). However, after reading Josh’s analysis, it made me consider that maybe his theory was correct. Here’s why: I agree that if there are x number of cases remaining and I had to choose one case to be the one I keep, then the expected outcome is obviously the mean. As Josh argued, though, in the show you are picking a case to REMOVE from the possible outcomes, resulting in a new offer. Therefore, I could see that the algorithm to determine the optimal solution may be different.

However, after thinking about it more, I had to write this note to refute his theory. Basically, his theory is that as long as you have more suitcases that are lower than the offer, then it is in your interest to continue on. The problem with his analysis is that he only considers the possible reward and disregards the risk. To make this argument even easier, I will use 100, 200, 500, and 1000000 and make approximations. In this case, the mean is around 250K (small values can basically be ignored). Let’s assume that the offer is 250K. Based on Josh’s theory, we should continue on because we have 75% chance of picking a suitcase


Luke January 10, 2007

dammit, it cut off my response…(continued)

that has a lower value than the current offer. However, let’s examine the possible outcomes:
1) pick a small value suitcase – mean improves to 333K, an improvement of 83K
2) pick the 1M suitcase – mean drops to $267, a loss of 249K

What does this mean? It means that your expected improvement by continuing is 83K x 3 – 249K ≈ 0. Therefore, the expected outcome of picking another suitcase is actually 0! That means that at any point in time, the expected value is in fact the standard on which to measure your next move. I hope that makes sense. Not that it matters, but I did go to MIT (and I still could be wrong!).
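Luke’s back-of-the-envelope arithmetic holds exactly; here is a short Python sketch using his four values:

```python
board = [100, 200, 500, 1_000_000]
current_mean = sum(board) / len(board)  # 250,200.0

# Change in the mean after each possible pick (the picked case is removed).
changes = []
for picked in board:
    remaining = [v for v in board if v != picked]
    changes.append(sum(remaining) / len(remaining) - current_mean)

print([round(c) for c in changes])  # three gains of ~83K each, one loss of ~250K
print(round(sum(changes) / len(changes)))  # 0 -- the expected change is zero
```

The three small gains sum to exactly the size of the one big loss, so continuing has zero expected effect on the mean, just as Luke concludes.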


gemma January 11, 2007

omg watching deal or no deal on 10.01.07
he had 1 red and 4 blues and he won a blue but i dont know which one


Josh January 11, 2007


Shortly after I wrote those posts, I noticed the same thing: that the real expected value of the upcoming means is equal to the current mean. It’s true that there might be a 75% chance of the mean increasing, but as you pointed out, the expected mean doesn’t change. This, of course, brings up a whole new issue. If you want to consider taking a deal based on upcoming offers but have to conclude that the mean is going to stay the same, what do you do? You can do what the original post said and “try to beat the mean”, but it seems like this might never happen and, if it does, there is no guarantee that this will be the right choice.

I think what it comes down to is understanding the banker, which no one really seems to be able to do. I know there is a **general** trend for the offers to represent successively greater portions of the mean (the first offer might be 15% of the mean, then 26%, then 49%…). In fact, I’ve seen the offer go up even after the mean decreases because of this “increasing portion” trend. Nevertheless, it isn’t always true. I can say with certainty that the ratio of offer to mean is much better near the end of the game than at the beginning, though. I can’t remember where I saw it, but another site had a listing from several people showing what portion of the mean was offered in each round. It seemed to start between 10-20% and occasionally peak as high as 115%, which was near the end but not necessarily at the last round.

So here’s my thinking. Statistically, we can show that the mean we can expect on the next round is equal to the mean we have now. The decision to continue vs. “deal” should be based solely on whether it is likely that continuing will increase the sum we will take home. Until the last 2-3 rounds, I think you can expect the offers to be too low to be worthwhile. In the last rounds, though, there doesn’t seem to be a perfect moment that we can guess ahead of time. So basically, my plan would be to “no deal” until there are 2-3 cases left on the board. From there, I’ll do whatever seems good. If I have $1, $750k, and $1mil and I am offered $500k, I’ll probably deal. Not because I am unlikely to have more in my case (67% chance of that, right?), but let’s think about it. If the $mil is opened, my next offer will be really low and I won’t make as much. If the $750k is opened, the next offer will probably be a little lower and I’d take that deal, because I’d rather take $450k than take a 50% chance at $1mil (not based on stats, but based on how I could use that money). If the $1 is opened, I’ll definitely go home with more, though. In 2 of the 3 options, I’ll end up with less than that offer, even though there’s a 2/3 chance that my case holds more. Something that seems to be overlooked is the fact that you have to consider not only what the numbers will do, but also what you’ll do when faced with them.


Chris P. January 12, 2007

Jake — I actually link to the official version of the Flash game at the end of the post. Check it out for the American version of the show.


JasonM January 19, 2007

We have had DOND in Australia for a couple of years now but they are definitely not as generous as the American version.

Top case is worth $200,000 and a case holder with a correct guess receives $500 (it used to be $1000).


Mike S. January 21, 2007

Lots of interesting theories regarding when to “Deal” or “No Deal”. But to change the direction a little bit – does it bother anyone else that most of the contestants do not seem to have any system for selecting the cases? The $ amounts are randomly assigned to the cases and most contestants appear to “randomly” pick the cases, which seems to compound the problem. It would seem to make sense to apply some sort of logic to the selection process. Any thoughts?


Josh January 22, 2007


If the values are, in fact, randomly distributed (which I think we all believe is the truth), then a method for choosing a case doesn’t really matter. Deciding whether or not to deal can be based on a system, but randomly choosing from random cases is the same as systematically choosing from random cases. Having a method for choosing wouldn’t matter, as you can’t get more random than completely random.


Alex January 22, 2007

Mike S.,

People seem to show up without a clue, purely relying on “luck”. One thing we can be certain of is that the banker, and probably Howie himself, knows the EXACT amount in each case. The banker makes an offer according not just to the mean of the remaining cases but also to the value of the case the contestant picked to keep at the very beginning. The amounts placed in the cases are NOT random at all; in fact, I think they have a system. Unfortunately, I can’t actually prove this, but I think that in certain shows there is a tendency for large (or low) amounts to be on even numbers and vice versa. I have also noticed that sometimes high (or low) amounts are placed side by side. Sometimes I fail to find a correlation at all. I’m certain that they have many more combinations that they have come up with, and I think they constantly keep switching between those combinations so the contestant won’t figure them out, which is why it is very difficult as a contestant to determine the strategy they are using. Also, at the beginning of each round the models are moved from their original positions and bunched up together. This is done to create an illusion. If the models stayed in their original positions, there would be gaps and holes between the numbers. The producers of the show want to prevent the contestant from noticing a trend, so they move the models closer to make it seem as if in each round you are starting from scratch.


Nikki January 22, 2007

I have read almost all of the comments. I am no mathematician or engineer; I just love the show for the pure rush of it. To the person who made the comment about IQ: go on feeling superior, I have no words for such nonsense. I think some of you are way too analytical. More importantly, when you arrive at the game you have no money at stake. It isn’t your money until after you leave, so calling someone an idiot because they didn’t take an offer is really quite cruel in my opinion.

However, I do think some of you have missed a few points. Has anyone compared episodes to see whether the banker offers the same amount during a certain round if the values of the remaining cases happen to be the same from one contestant to another? If so, then yes, the banker uses a formula. If not, then the banker uses some random element in calculating how much he will offer at any given point.

Also, most of you have not been paying attention to the show. I have seen many an episode where the contestant comes on with a specific dollar amount they want to leave with. I recall one particular 21-year-old who came on wanting enough money to buy herself a lime green Escalade. The banker offered her a lime green Escalade valued at approximately $83,000 and she took it, despite her family and friends telling her no deal. Sorry, I do not recall what values she had left on the board, but since that was her goal, she didn’t care about the money. Last night I watched a taped episode where the wife wanted to open an animal shelter. She needed a certain amount of money (she didn’t specify), and she went for it despite her husband telling her to take the deal; her father told her to go for it, she went with her father, and she lost because she opened the last large amount on the board at that point.

My point is that you have to take into account what the personal goals of the contestants are as well. Most couldn’t care less about mathematics. They see a dollar figure; some are risk takers and others are not. Some have a specific dollar amount they want to leave with and won’t risk going further once they have gotten such an offer, no matter how good the chance looks that they may indeed end up with the million. Others came to see if they could end up with the million. During premiere week here in the US there were values higher than a million to be had. I think if the banker already knew how much was in the case he would never offer such large amounts; therefore, I really don’t think he knows any more than we do. I guess my point is: don’t analyze the show so much, just enjoy it for what it is, a game of chance. Because none of us knows what is behind case #(?)


Alex January 22, 2007

Only a fool would think that the banker doesn’t know the exact amount in each case. By the way, who do you think puts the dollar amounts in the cases? A ghost? Or could it be magic? I don’t think so. There are a few instances in which the banker offers an unusual amount, but that is because that too is part of a strategy. People who play poker have to be good at bluffing. The banker on occasion likes to bluff and toy with the contestant according to the math that he is doing as the game progresses.


Josh January 22, 2007

That’s a bold statement, Alex. It’s remarkable you’d make it without having anything to back it up. As far as you know, some tech in the back slaps values into cases and shoves them into the hands of the girls. You maintain that the banker knows where the different values are and adjusts his offer accordingly. I think you’re missing something.

When someone ends the game early, they do the whole "let's see what you would have earned" bit. If you notice, there's no awkward delay while the banker works out a number; they immediately have the calculated offer. Now, I'm not saying that the banker doesn't know and never manipulates this, but they clearly have some underlying formula available. They might know where values are, but I seriously doubt there is some master plan to trick people into picking certain cases. Regardless, calling people foolish for not believing your utterly unfounded conspiracy theory is a little preposterous.


Josh January 22, 2007

And in response to: “Also, at the beginning of each round the models are moved from their original positions and bunched up together. This is done to create an illusion. If the models stayed in their original position, then that would mean that there would be gaps and holes between numbers. The producers of the show want to prevent the contestant from noticing a trend, so therefore they move the models closer to make it seems as if in each round you are starting from scratch. ”

… you do realize that the gaps (the "pattern"?) they would see would only be based on which cases they have already chosen. Are you trying to say that the producers are trying to hide a pattern from the very people who produce it? Sounds like foolishness.


Alex January 22, 2007


You don't get it. The pattern is not established by the contestant; it is predetermined by the producers, and it is up to the contestant to find it. In order to prevent that from happening too easily, the models are moved closer together. We're in 2007, my friend. Computers are extremely fast: the math is done in a second (maybe even less) and bam, you've got the next offer. The neat thing about all this is that the computer has already calculated all the possible outcomes, so when "let's see what you would have earned" comes up, the math has already been done. You're right, I don't have evidence to back my statement up; however, the next time you watch the show, try to keep track of the cases and their worth. I'm not saying there will be a pattern that we will all be able to figure out every single time, but in some shows you might be surprised. It's not a "conspiracy theory." In fact, I think what they do is OK, because the banker's objective is to send the contestant home with as little money as possible. If they made it easy to win the million, the show would have gone out of business long ago.


Robbie January 22, 2007

I was playing the game by the numbers and kept saying no deal because the offer was well below the mean of the remaining cases, and it turns out that I won $1,000,000. So it just shows that Howie knows what he is saying when he says that it takes guts and timing. A player can have the guts, but if they stay in too long their offer could go down; likewise, a player could be holding the million and take an offer just because it sounds good.


Josh January 24, 2007


You said that the girls are moved close together after each round so contestants won't be able to see a pattern. You also say this pattern is set up by the producers, so I assume you are referring to a "pattern" that determines where on stage different values are going to fall. Let's just assume for a minute that there is some sort of pattern. How might a person figure it out? For most rounds, the contestant opens more than just one case at a time. If they wanted to find some sort of correlation between the case numbers and their values, they'd have to remember which value came from which case. If they are even capable of remembering this, why would they get confused by bunching up the girls? If figuring out the pattern requires remembering every case and its associated value, then it'd be just as easy to remember the original number of the case as the original location.

Now, more importantly, the fact that you are saying that they bunch the girls up is telling about what you believe the nature of this pattern to be. Since there is no reason for the producers to make a pattern and then mess it up on purpose, you must think the pattern falls not with the locations but with the case numbers. After all, if you bunched up the girls to hide your pattern when the pattern is based on the placement of the girls, then you no longer have a pattern and might as well have not created one in the first place. So I’ll trust you’re assuming a pattern associated with the numbers themselves. High-low neighboring numbers, all the high values being prime, that sort of thing. If that is true, then, again, the contestant must remember all of the case numbers and their values. If they are keeping track of the numbers, why would messing up the location change anything? If you tell me cup 1 holds the ball and then scramble up the cups, I’m still going to know that cup 1 holds the ball. Basically, if the pattern is based on numbers, then moving the girls would be useless. If the pattern is based on location, then moving the girls would eliminate the pattern.

About the computer thing: you made my point for me. You talk about the banker knowing where the values are and using deception and tricks to keep people from making the big money, but then you admit that the banker isn't actually the one that produces the offer. As you said, the computer calculates the offer. Perhaps you meant that the COMPUTER knows where all the different values are. That doesn't square with your saying "By the way, who do you think puts the dollar amounts in the cases? A ghost? Or could it be magic?", but either way, saying that the banker knows doesn't make a difference if you also say that the computer makes the offer. You really need to clarify who you think is actually picking the offer. It can't be the banker, because the offers are produced too quickly at the end for him to be up there thinking "okay, so what would I probably offer her?" It's obviously the computer that calculates the offers, and you freely admitted this. So why does it matter if the banker knows where the values are? Even if the locations are known, they'd have to be worked into the computer program, and then the banker is still irrelevant.

Finally, the fact that you keep finding different patterns, or no pattern at all, shows that any such pattern is completely useless. If the pattern keeps changing and sometimes isn't there, you know what that's called? It's called not having a pattern. How many combinations are possible that could be misconstrued as a pattern? If you decide you think there's a pattern, you'll probably find one. Think about it! You say something like "sometimes they put high and low numbers next to each other." Isn't it likely that that would happen by accident?! You'd have to make a conscious effort to make sure that you DON'T do that. If you have a low value on the board, isn't there a good chance there would be a high value next to it? And really, what's the point anyway? Knowing a pattern is only good if you can predict upcoming values. Your "pattern" is always different and sometimes not even there. If you can't extrapolate based on it, it's not a pattern, and it DEFINITELY isn't useful.

So what’s this all mean? It means that there is no pattern. It means that if there were a pattern, it’d be so convoluted that you couldn’t use it. It means that if there were a pattern and you could use it, then “bunching up the girls” wouldn’t make it any harder to figure out. Sounds like there is a problem somewhere in your thinking.

And by the way, the goal of the banker is probably not to send the contestant home with as little money as possible. It’s more likely that the banker’s goal is the same as the show’s goal – to make sure the show earns as much as possible. I’ll bet ratings are more important than sending one person home with $10,000 less.


Mike S. January 24, 2007

Josh, I totally disagree that “randomly” selecting cases is the best that one can hope for. I think we would all agree that even though it’s possible that all of the cases could “randomly” line up in either ascending or descending order (Case 1 = $1,000,000; Case 2 = $750,000; Case 3 = $500,000; etc.), it is highly improbable that this would happen. This being the case, would it not make sense to try to systemize this knowledge to some degree?

Let’s say that a contestant opens Case 1 and it contains the $1,000,000 (bad luck, but get over it and move on). The odds of Case 2 holding the $750,000 are pretty low, so it would seem prudent to pick Case 2. Depending on the value of Case 2, you could then make some assumptions about Case 3, etc. It’s not a perfect system, but to me seems to make more sense than just picking numbers for no apparent reason. Comments?


Josh January 24, 2007

Assuming we're at the start of the game, it is true that the chances that the $750K is in Case 2 are small. What are those chances? They are 1/25. What are the chances it's in Case 3? 1/25. Case 4? 1/25. If you don't know where the values are, then every remaining value has an equal chance of being in every remaining case. Applying a selection system to a random set still produces random results.

I don't know a good way to explain this because the concept is a fundamental one. Try looking at it statistically. Unless you assume that there is some sort of pattern to where they put the values, you have to assume they are random, right? So if you open a case and find a value, it doesn't really tell you anything about any other cases. You can't determine that the next case is high or low or medium. You can make no determination about other cases based on a case you just opened if they are random.

Consider another system of selection and random chance. I have a die and ask you to pick a number between 1 and 6. You do, and I try to roll your number. If I don't get the right number, should you pick a different number the 2nd time? No. The die has no memory. In a random system, previous events don't give any insight into future events. Finding out that I didn't roll the correct number the first time doesn't tell you that I won't roll that number the second time. Similarly, you can't assume that the results of one case tell you anything about the values in other cases unless you presume to know some system, which only Alex presumes. If you're saying that you don't think the banker would put 2 high values together, then what you're saying is fine (except I think the cases are random). If you're saying that they are random but you should still apply a system, I don't think it makes a difference. Use a system to approach an organized set of results; when approaching random possibilities, the results will be random no matter what you do.

One last example: computers don't really generate truly random numbers. They rely on pseudo-random generators (or, in the old days, long pre-computed tables of random digits) to supply "random" numbers for applications. If you had such a long list of numbers, do you think it would make a difference if you looked at every number versus every 2nd number or every 3rd number? If the list is truly random, your results will always be random.

Ok, I lied… one more last example. If you walked up to a table and a man told you that a dollar coin is under one of 100 cups and you can have it if you find it, would it matter which order you picked the cups up in? Does finding one empty tell you that the surrounding cups are less likely to be empty than those farther away? Bettering your odds after a round relies on you being able to learn something new about the system of cases. If there is no system, there is nothing new to learn, and thus nothing you can hope to achieve by picking particular cases.
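Josh's "no system helps" claim is easy to check empirically. Here is a quick Monte Carlo sketch (with stand-in values, nothing from the show): whether you open cases in numeric order or in a freshly scrambled order, the chance that your first pick hits the top value comes out the same, about 1/26.

```python
import random

def first_pick_hits_top(order):
    # Shuffle 26 stand-in values into the cases, then check whether
    # the first case opened (per the given order) holds the top value.
    values = list(range(26))
    random.shuffle(values)
    return values[order[0]] == 25

random.seed(1)
N = 100_000
sequential = sum(first_pick_hits_top(list(range(26))) for _ in range(N)) / N
scrambled = sum(first_pick_hits_top(random.sample(range(26), 26)) for _ in range(N)) / N
print(sequential, scrambled)  # both hover around 1/26 ≈ 0.0385
```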


Alex January 24, 2007


Obviously we're at odds on this issue. You can prove yourself wrong by doing the following things before you watch the next DOND show:

1. Get a paper and pen.

2. Draw 26 squares, making sure that they are set up the same way they appear on the show
(4 rows: 1st row with cases 1-6, 2nd row 7-13, 3rd row 14-20, and finally the 4th with 21-26).
Now label the boxes from #1-26, leaving space to insert the $ amount at the bottom of each square as soon as it is revealed when the contestant picks it.

3. At the bottom of the paper, leave space for the case the contestant keeps for him/herself, space to keep track of the offers, and the numbers of the cases picked in every round.

You might be surprised to find that THERE ARE PATTERNS. The contestants just have no idea, no system, and rely on luck. Yesterday's show was interesting: EVERY SINGLE CASE IN THE FIRST ROW HAD $25,000 OR MORE (including the million). As usual, the contestant failed to notice the pattern because she was not keeping track of the cases and their values. She eventually left with $83,000, which was a very good move on her part; her case had $50,000. If she had noticed the pattern, it would have helped her get better offers, because the high amounts would have been left on the board. But then again, it depends on the person and whether they're thinking or caught up in the moment. There's always the human element.


Josh January 25, 2007

Alright… maybe we're having a terminology issue. If I were to flip a coin 5 times and get H-T-H-T-H, you could technically call this a "pattern" because the order repeats. In a statistical sense, however, I don't think this can be called a pattern. Why? Because in our context, you are saying that contestants should not only notice the "pattern," but also act as though they can expect it to continue. Even though I got H-T-H-T-H, I still have only a 50% chance of getting heads on the next flip. The point I'm trying to make is not that you can't find anything that looks like a pattern, but rather that the "pattern" is just random chance that your mind transforms into a pattern.

Think about stellar constellations. How many different "patterns" have been found in the stars? Big and Little Dippers, warriors, bulls, and much more. You can point and say, "See! These fit together to make XXX, so it's a pattern." However, this does not mean that someone designed the stars just so those shapes would exist. Now, if you said, "I'm going to make perfect squares with the stars," and you were then able to do so again and again with different areas, then you could assert some sort of real pattern. But if you simply said, "I'm going to see if I can find a pattern in the stars," and then found a dipper in one place, a bull in another, and a warrior in a third, could you truly say you had found a pattern? Sure, your brain can extrapolate shapes out of what you see, but can it use those shapes to predict other shapes? No, it can't. The same is true of the coins: although you recognize a "pattern" in the first 5 flips, it doesn't help you determine what will happen next.

Again, I'm not saying you can't find things that look like patterns. What I'm saying is that it would be difficult for you NOT to find a "pattern" if you were looking for one. Think about how many things might make you think that you're seeing a pattern. And you're not even using absolutes like "all of the top 6 cases are in the first row," but rather saying "the first row has 6 of the 10 cases above $25K." You would call it a pattern if you saw 6 of the 10 lowest cases on a row, too. Or most of the high cases being even. Or most of the low cases being next to large cases. And you say it's always different. And sometimes not even there?! Would you call it a pattern if the 5 biggest cases and 1 low one were on one row? I'm telling you that the chances of finding something that appears to be a pattern in any given round are very high. It would make sense, then, that most rounds have "patterns."

Besides the fact that you think there is a pattern, you also think it was put there deliberately, which raises the question: WHY? Are they trying to keep the game interesting for viewers? No, because then they'd just show the viewers the pattern and we could all wait to see if the contestant figures it out. Are they trying to make sure the contestant gets less money? No, because if contestants don't know there is a pattern, they'll just choose randomly, which defeats the purpose. Are they trying to make sure the contestant gets more money? No, because the pattern would have to be useful to a contestant, which these clearly are not. Why make a pattern if it will have no impact on the game?


JK January 25, 2007
This guy created an “Ultimate Decision Maker.” It’s somewhat accurate. Obviously the banker doesn’t follow a set pattern in his offers with respect to the expected return. But this excel program gets you pretty close to the actual offers.


David B. January 28, 2007

“Risk aside, accepting a “deal” for less than the mean should generally be regarded as a gutless, weak decision, and the contestant should be ridiculed accordingly. ”

But it can't be "risk aside." Risk is a real part of the game and has to enter into the analysis. Everyday people don't make purely mean-driven decisions. And it's not because they're irrational — people have risk aversion, and standard decision-theory tools (e.g., utility theory) model that behavior quite well.

Do you buy insurance at all? Ridiculing someone here for settling for less than the mean would imply you should ridicule anyone who buys insurance as well.

Suggesting that expected values are the end-all, be-all of the story here is a gross misrepresentation of reality, and, frankly, rational decision-making.


Steve January 29, 2007

Thought I would share some of my chicken-scratch calculations. I decided to ask “assuming someone plays straight through, what is their probability of eliminating all of the top 7 amounts in round X? What about their probability of eliminating the top 3 amounts?”

You cannot eliminate all 7 cases in Round 1 (because you only pick 6); but I got that the probability, to the nearest tenth of a percent, of eliminating all 7 six-figure cases after a particular subsequent round is as follows: (r2) 0.05%, (r3) 1%, (r4) 4.8%, (r5) 11.8%, (r6) 17.7%, (r7) 25.9%, (r8) 37.2%, (r9) 52.6%

Formula: 19C(n-7) / 26Cn , where n is the total cases selected and xCy = x!/(y! * (x-y)!) Reasoning: you have 26Cn ways to select n cases. If you, for example, have selected 11 cases, if the top 7 are all among those that leaves 4 others you have to pick. There are 19 other cases you could pick those four from, so there are 19C4 ways to eliminate all 7 cases when you pick 11 from the board that include those 7. Those 19C4 ways of picking a set that includes all 7 top cases are divided by the total combinations of 11 cases to get the probability of ‘total annihilation.’

I thought this got an interesting result: that only a little more than half the time you play through will you eliminate all of the top 7 cases. Intuitively, I thought it would be closer to 75% because 7/26 ~ 27% — but then again, I was always really bad in probability class. :)

The chances I got for knocking out just the top 3 amounts, using the formula 23C(n-3) / 26Cn:

(r1) 0.8%, (r2) 6.3%, (r3) 17.5%, (r4) 31.4%, (r5) 43.8%, (r6) 51.2%, (r7) 59.2%, (r8) 68.1%, (r9) 77.8%.

A tougher question I am trying to figure out: how many of the “top dollar” cases can one expect to see still on the board after round X (for both the top-3 and top-7 varieties).

Also, if someone wants to answer that last question, or verify my math (it’s been a while!) that’d be spiffy. :D
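Steve's formula can be double-checked in a few lines of Python, and his follow-up question has a neat answer under the same model he used (every pick uniformly random): by linearity of expectation, each top case survives n picks with probability (26 − n)/26, so the expected number of top-k cases still on the board is k(26 − n)/26.

```python
from math import comb

# Cumulative cases opened after rounds 1-9 in the US format (6,5,4,3,2,1,1,1,1)
picks = [6, 11, 15, 18, 20, 21, 22, 23, 24]

def p_all_top_gone(k, n, total=26):
    """P(all k top-value cases are among n randomly opened): C(total-k, n-k) / C(total, n)."""
    return comb(total - k, n - k) / comb(total, n) if n >= k else 0.0

def expected_top_left(k, n, total=26):
    """Expected number of top-k cases still unopened after n picks (linearity)."""
    return k * (total - n) / total

for rnd, n in enumerate(picks, start=1):
    print(f"r{rnd}: top-7 gone {100 * p_all_top_gone(7, n):.1f}%, "
          f"top-3 gone {100 * p_all_top_gone(3, n):.1f}%, "
          f"E[top-7 left] {expected_top_left(7, n):.2f}")
```

Running it reproduces the top-7 percentages above; the top-3 column comes out 0.8, 6.3, 17.5, 31.4, 43.8, 51.2, 59.2, 68.1, and 77.8 percent to one decimal place.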


Jarod February 5, 2007

I had 2 cases left on the web game: $50,000 and the $1,000,000. I was offered $340,000 to stop. I considered it, but the chance to be the first person to win the whole million, plus the fact that $50,000 would still clear all my debts if it were real money, made me decide to open my case over the $340,000. I became a millionaire. :D

BTW, I noticed a flaw: one time I played, I had the $400,000 and $300,000 cases left, and it offered me $248,000 to stop playing. I was thinking, WTF? No way I'd settle for less than the lowest amount on the board.


koby February 7, 2007

I want to know how the banker comes up with the deal, and how you know if you should take it. This is for a school project, so if anyone knows, help me!


Marcia Schandevel February 8, 2007

Has anyone considered leveling the playing field for actually winning the $1,000,000 by alternating the million in a case with even numbers then odd numbers with each new contestant? As it is, you will probably never have to give up a million dollars. Is that the plan?


John Hill February 8, 2007

“Risk aside, accepting a “deal” for less than the mean should generally be regarded as a gutless, weak decision, and the contestant should be ridiculed accordingly. ”

The Banker almost always offers less than the mean, and he is able to do this because he has a very significant advantage over the players: for the Banker, the element of luck is entirely removed – because he plays the game again and again the luck evens out. So for the Banker it is a game of pure skill.

The players, on the other hand, only play once, and for them the randomness and uncertainty of a single game represent a huge disadvantage (in the UK game, players' winnings are about 30% less than the average amount in the cases).

So the players will almost always be in the position of receiving offers less than the mean, and this is a consequence of the structure of the game rather than any 'gutlessness or weakness.' In popular parlance, you would say the Banker is able to 'play the percentages' whereas the players are not.

koby: The Banker should use a combination of theories including Utility Theory and Game Theory, both of which are college level (but Game Theory is quite easy to understand). You can quote this article if you like, which has some background information and simple statistics.



Tammy Seitter February 11, 2007

I would like to tell you about a conversation that my 3-year-old granddaughter and I had on 2/5/07 while we were watching Deal or No Deal. A commercial came on and she said to me, "Maw Maw, I am going to be on TV when I get older." I said to her, "Really?" She said, "Yes, he is going to say, 'McKenzie, open the case.'" I said, "So you are going to be one of Howie's girls?" She said, "Yes, number two." I told her that was a good goal in life to have. She just turned 3 in November.


Gene February 13, 2007

Does anyone know how many calls Deal or No Deal receives during the text-messaging portion of the show? I e-mailed NBC and they sent no response. Thanks a lot!


Douglas February 13, 2007

How do you go about getting on the show?


Hans from Germany February 14, 2007

That’s a pretty neat article. Covers the expected value approach, but also considers the role of the banker (it’s a 2-person game, with cost and payoffs for both), as well as alternative measures of central tendency, especially the median (what’s the value of cases in the middle of the range 1-26?).

Also, has some limited data as to when the banker starts offering payouts in excess of the expected value.

Thanks for providing this entertaining discussion!


Charlie V February 18, 2007

“The Banker almost always offers less than the mean, and he is able to do this because he has a very significant advantage over the players: for the Banker, the element of luck is entirely removed – because he plays the game again and again the luck evens out. So for the Banker it is a game of pure skill.”

Poppycock! And it is a conventional-wisdom bit of poppycock. For your assertion to be true, "luck" has to know what happened last game, what will happen this game, and what will happen next game. "Luck" is, well, dumb.

Any student of Statistics 101 will tell you that each trial can only be evaluated on its own. Case in point:

You are flipping a quarter. You just flipped heads 4 times in a row. What is the probability that the next flip will be heads? It is 50/50. No matter how many times you flip a coin, and regardless of the previous result, the previous ten results, or the previous 100 results, the next flip is a 50/50 chance.
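Charlie's coin-flip claim is easy to simulate: condition on four heads in a row, and the fifth flip is still 50/50 (a toy sketch, nothing show-specific).

```python
import random

random.seed(0)
fifth_flips = []
while len(fifth_flips) < 20_000:
    flips = [random.random() < 0.5 for _ in range(5)]
    if all(flips[:4]):                # four heads in a row...
        fifth_flips.append(flips[4])  # ...record what the fifth flip did

heads_rate = sum(fifth_flips) / len(fifth_flips)
print(heads_rate)  # hovers around 0.5: the streak tells you nothing
```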


Bill February 18, 2007

I think all of you are missing the most important part of the game to the producers. The goal of the game is to maximize the number of viewers thereby keeping the game on the air and earning money.

I'm going to digress a moment. Have you ever wondered why all of the shows are adding a text-in-and-win portion? They offer $10K and charge $1.00 per message. That alone probably pays for the whole show.

Back to the game. I suspect that when they get a vivacious player, the banker offers less than the average to keep the game going. When the player is a dud, they offer more to get the player off the air so they can get to the next person.

This game is about the dollars that go into the producers' vaults, nothing more or less. They want to keep it on the air, which means they need to court viewers.

I submit that the banker offers directly reflect the entertainment value of the contestant.


Bryan February 19, 2007

I didn't read through all the posts on here because there are so many, but can someone explain to me why the Monty Hall Problem does not apply to Deal or No Deal? The Monty Hall Problem is complex, and rather than try to explain it here, if anyone is interested, just do a Google search for it. In its simplest form, it would suggest that the contestant has a 1 in 26 chance of winning the million dollars, and that even if 2 cases are left with one containing the million, the contestant's odds of winning it are still 1 in 26. I know it makes no sense. If anyone has the answer, I would like to hear it.


gary February 20, 2007

I am also in your situation — I just found this site and have only skimmed its contents. However, the very brief answer to your question is that in the Monty Hall problem the door is not opened at random. If it were, then the odds of the contestant sticking with their original choice and winning the car would be 1/2 after one door is opened, not the 1/3 chance that it was and still is. I also wondered about this when I first saw the show, but the bottom line is that the cases are all opened at random, and if there are only $0.01 and $1,000,000 cases left, the odds that the contestant has the million is 1/2, not 1/26.

And as an aside, I don't think Howie knows where the money is, but the banker probably (though not necessarily) does. I think I heard at the beginning of a show that the money is distributed by a third party, whatever that means. The biggest question I have: after the deal is done, Howie asks what their next choice would have been, and then shows what the bank offer would have been. That implies that there is a definite algorithm to determine the offers, while I always thought that the banker made offers to "work" the contestant. If the offers go down as the game progresses, this would encourage the contestant to keep going and prolong the game, because there wouldn't be as much to lose.
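gary's distinction (random opening versus an informed host) can be simulated. In this sketch, 24 of the other 25 cases are opened uniformly at random; among the games where the million happens to survive, the player holds it about half the time, not 1 time in 26.

```python
import random

def dond_trial():
    """Open 24 of the 25 non-player cases at random; return whether the
    player holds the million, or None if the million got opened."""
    million = random.randrange(26)
    player = random.randrange(26)
    others = [c for c in range(26) if c != player]
    opened = random.sample(others, 24)
    if million in opened:
        return None  # round spoiled: the million was revealed
    return million == player

random.seed(2)
results = [r for r in (dond_trial() for _ in range(200_000)) if r is not None]
holds_million = sum(results) / len(results)
print(holds_million)  # ≈ 0.5, not 1/26
```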


Lena Bellon February 25, 2007

I think the show is great for people to actually dream about large amounts of money. I'd sure love to get on the show just to see if it really goes down as it looks on our exciting Monday or Wednesday nights. I know our family watches it faithfully.


Bob Munson February 26, 2007

A probability question from college might be of interest here. What should be the strategy of a College graduate accepting a job offer? Say the person could expect 10 offers and had to decide on each one before seeing the next offer. The person also had no preconceived target salary.

The answer was to turn down the first X offers and then to accept the first subsequent offer that was better than the best of the first X.

This applies to Deal or No Deal in that the sequence of offers is a random walk. One should expect that in the course of that random walk, the offers have a good chance of rising, at least once, above what the offer was presumed to be at the beginning.
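Bob is describing the classic "secretary problem," where the optimal number of offers to skip is about n/e. A sketch with hypothetical random salary draws (for n = 10 offers, skipping the first 3 is optimal and lands the single best offer roughly 40% of the time):

```python
import random

def success_rate(n=10, skip=3, trials=100_000):
    """Reject the first `skip` offers, then take the first later offer that
    beats them (or the last offer if none does). Returns how often this
    strategy lands the single best offer."""
    wins = 0
    for _ in range(trials):
        offers = random.sample(range(1000), n)  # distinct random salaries
        best_seen = max(offers[:skip])
        pick = next((o for o in offers[skip:] if o > best_seen), offers[-1])
        wins += pick == max(offers)
    return wins / trials

random.seed(3)
rate = success_rate()
print(rate)  # near the theoretical optimum of ~0.399 for n = 10
```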


Darcy February 26, 2007

Doesn't the player's selected case stay essentially "out of play" for the entire game, until there is just one case with the lady and one with the player? When there are three cases left to choose from plus the one they have already selected, doesn't the player really have a 33% chance per pick rather than a 25% chance? There are only three to choose from, not four. Someone please explain!


Lena Bellon March 1, 2007

I'm submitting my entry, and Howie, I plan on taking the Banker's money. I have 50 children, 21 of which are great-grandchildren, and they need help to go to college. I am one determined grandmother, as you will see when you meet me. I sure look forward to meeting you and the Banker. Lena Bellon


Jason F March 5, 2007

It almost seems that a good way to play the game would be to note that the average winnings are $131,477.54 and play solely on the basis of "if I ever get an offer higher than that, I accept it; if not, I keep going until the end." $131K is the expected outcome if everyone blindly rejects every offer and relies only on the value of the case they originally picked, so any offer above roughly $132K puts you ahead of that baseline. And honestly, $132K is nothing to scoff at.
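Jason's $131,477.54 figure checks out as the mean of the 26 amounts on the standard US board (values listed from memory, so treat the list as an assumption):

```python
# Standard US Deal or No Deal board values (assumed)
values = [0.01, 1, 5, 10, 25, 50, 75, 100, 200, 300, 400, 500, 750,
          1000, 5000, 10000, 25000, 50000, 75000, 100000, 200000,
          300000, 400000, 500000, 750000, 1000000]

mean = sum(values) / len(values)
print(f"${mean:,.2f}")  # $131,477.54
```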

On the other hand, sometimes it is better to take it on a per-offer basis. The last show I watched had the woman rejecting each offer (each of which was below the mean until the one she accepted), until she accepted the $403K offer, when all she had left in play were the $400, $400K, and $750K cases, which average less than $400K anyway. In my opinion the woman played perfectly to the end and was rewarded for it, especially since she turned out to be holding the $400 case.

I honestly believe that there is true randomness in the game in terms of where everything is placed, and I doubt that either the girls or the banker knows any of the amounts' locations.

The theory that I came up with last night is that the banker's offer is the mean multiplied by a percentage that goes up as the number of cases decreases. That would explain how the offers can keep going up in general, declining only when the largest amounts are removed. After all, in the show I mentioned, when the offer was $180K and she removed the $300K with only 5-6 other cases remaining, including the $400K and $750K, how did the offer decline only to $175K? Each time, the banker's offer goes closer and closer to the mean, perhaps going above it when there are very few cases remaining. I do give the banker some say in it, but I find it hard to believe he is really trying to "beat" the contestant rather than adding the drama and rush that make the show worth airing. I believe his goal is to keep them there as long as possible without going all the way.


John Hill March 7, 2007

“That would explain how the offers can keep going up in general..”

A similar idea to the ‘time value’ of derivatives.


Nareg March 7, 2007

Hey, I was wondering: how do you know what the bank offer is going to be? Is there an equation?
I'm doing a project on this and I was wondering if you could help.


JRS March 11, 2007

The formula for calculating the offer is relatively easy, but if you are only looking at the mean you will never figure it out. Hint: it has to do with the remaining statistical values and the number of cases remaining, along with the picks required. I have run the formula, and other than rounding up to the nearest $1,000, it is right on.


Nikki March 14, 2007

This message board has some of the most cynical people I think I have ever come in contact with, with the exception of my co-worker (I think he takes the cake). Why is it so impossible to think that a third-party company, not the Banker, randomly places the values in each case? Once the values are placed, the contestants randomly choose a case. This game is so simple it is not funny. As for the amount that is offered, it does depend on which cases were opened and which ones still remain on the board, so I am sure it is more about statistics than secret formulas; besides, the producer of the show said as much in an interview. The banker's offers are based on the probability of you getting the higher case in a given round, and the banker's sole goal is to make you (the contestant) leave with as little money as possible. I would agree with the formulas if I hadn't watched almost all the eppys and realized the show is sometimes customized for the contestant. If the banker knows there is something specific the contestant wants, then depending on how the contestant does by a certain round, the offer can include the item that the person wants, such as cars, tickets for specific games, etc. When this happens, the value is sometimes well below what the person has on the board, but the temptation is now higher for them to take the deal, which is the sole reason the Banker exists.


Jessie March 17, 2007

My friend and I played the online game last night for the first time…and surprisingly we had the $1,000,000 case! It was actually really interesting because we managed to keep the $0.01 in the rest of the time and we went to the very end! We were screaming and stuff and then we realized that we hadn't actually won any money…too bad. :(


Treetop March 22, 2007

Playing on-line last night, in one game all I had left was $400,000 and $500,000. It offered me less than $400,000, which seemed downright silly.

Gee, what should I do? Take a guaranteed $350,000, or a 50/50 chance at $400,000 or $500,000?

I’m curious, JRS. Do the TV show and on-line game use the same algorithm?


Kent March 25, 2007

Did anyone notice how beautiful the girls are?


Alex Irvin March 27, 2007

Hello All,
Ok – I've got a question for anyone who is good at statistics, probabilities, etc.

We play the online version of this game for real money – with one person being the banker and one person being the contestant. We basically took the average of all the cases ($131,000) and took off four digits to come up with an entry fee of $131.

So the contestant pays this $131 and then selects case after case until the banker offers deals. The contestant can take the offer or not, just like the show. So if the offer is $155,000, the contestant can take the deal and make $24.

Make sense?

The question is – Is that $131 a fair amount? We’ve read that if you go all the way through to the end, on average, the case value will be $131,000, but isn’t the problem that the banker makes deals before the end – so you can’t just use the average for a fair entry fee?

Basically, we’re looking for a fair entry fee that if you played the game a million times – the banker and the contestant would break even.

Any thoughts other than “You guys are compulsive gamblers who need help” ?



Nareg March 29, 2007

JRS, I don't get what you mean. I tried the equation and it did not work. I need help, and I'm hoping someone can give it to me. I tried it with one of the shows and it did not work, so someone please explain it to me.


Mark April 2, 2007


Is it possible to make this formula in Excel? How?

Can I use this formula in other programs (Flash, Java)?



Lena Bellon April 2, 2007

I want to get on Deal or No Deal so bad, even if I can't act crazy at my age, but if I did get some good offers I'm sure I'd get excited even at 71 years young.


Max L April 8, 2007

Hey. I just watched the show and I am really confused about how the banker gets his value. Somebody posted that there is a formula, but did not post the formula itself. If you would be so kind as to post this formula, that would be great. Thanks.


Gene April 11, 2007

A rough Excel version based on discussions above can be found here.

Any corrections/redactions/suggestions welcome!


Alex Thorn April 14, 2007

I have not put a significant amount of thought into actually calculating the expected value of DoND, though I can assure everybody that it is greater than $131,477.54, assuming that it is ever optimal for a rational risk-neutral contestant to accept a deal (as it seems to sometimes be, and which appears to be generally accepted within the postings on this site).

Consider a game where the contestant is never given the option to “make a deal,” and instead is forced to accept the value of the case which was initially chosen. Here, the expected value would indeed be $131,477.54, the average of all the cases. However, by giving the contestants the option to accept a deal at some point within the course of the game OR accept the value of the initially chosen case at the end, the expected value would necessarily (within the confines of my previous assumption) be increased.

As far as I can tell, a rational risk-neutral player would only accept an offer when it is greater than the sum of the average value of the remaining cases combined with the expected “value” of the remaining decisions.

Calculating the expected value of the remaining decisions is difficult, as it depends upon the strategy that the “Banker” uses in determining the deal value. This strategy that the Banker uses seems to be strongly dependent upon the average value of the remaining cases and the number of cases left to open. Other factors that could potentially play into these deal values that would be endogenous to the game could be previously offered deals, some derivative of the average values remaining (e.g., if the average value is higher, the deal value may be a lower percentage of the average value than if the average value were lower, all else held equal), etc. As nobody has posted an exact algorithm for calculating this, it is even possible that the strategy includes some amount of randomness/discretion. However, assuming that the strategy of the “Banker” does not change from contestant to contestant, the randomness/discretion could be measured.

Theoretically, all of this could be combined together to create an optimal strategy for the rational risk-neutral player, and an expected value for DoND could be determined. The calculation would only have to be based on rounds where the Banker might offer more than the average value of the remaining cases (as many have noted, the Banker rarely offers anywhere close to 100% of the average value in the early rounds). Supposing that we did have an accurate algorithm, that the Banker would never offer more than 100% until there are 5 or fewer cases left, and that the deal values are based only upon what is remaining rather than the order in which cases were previously chosen, there would be 328,900 possibilities (26 nCr 5 multiplied by 5, since one of the five cases is the initially chosen case) for remaining cases. For each possibility, the expected value of the sub-game could be calculated, and the average of the expected values for these 328,900 possibilities would yield the expected value of DoND. The calculation of the expected value of this sub-game would not be difficult, as a tree-like structure of decisions and results could be created having a total of 24 nodes (4 x 3 x 2).

Thus, it is computationally possible, assuming we had an accurate algorithm for predicting deal values, to find the expected value of playing DoND. Unfortunately, figuring out this algorithm that the Banker uses in determining deal values has proved to be very difficult.

Does anybody know of a site that posts the results of DoND, so as to save those of us interested in deducing the Banker's strategy the hassle of actually watching every show? It is interesting to watch, but gets redundant after a while as its slow-paced nature becomes unbearable.
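
Alex Thorn's counting above is easy to verify. A quick sketch (assuming the standard 26-value US board) confirms both the $131,477.54 overall mean and the 328,900 endgame configurations:

```python
import math

# The 26 case values on the US board (assumed standard lineup).
VALUES = [0.01, 1, 5, 10, 25, 50, 75, 100, 200, 300, 400, 500,
          750, 1000, 5000, 10000, 25000, 50000, 75000, 100000,
          200000, 300000, 400000, 500000, 750000, 1000000]

# Overall mean of the board: the expected value of a game with no deals,
# where you are forced to keep your initially chosen case.
overall_mean = sum(VALUES) / len(VALUES)

# Distinct 5-case endgames: choose which 5 values survive, times 5 ways
# to mark one of the survivors as the initially chosen case.
endgames = math.comb(26, 5) * 5

print(round(overall_mean, 2))  # 131477.54
print(endgames)                # 328900
```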


Anonimous April 14, 2007

How much is $100,000 worth?

If you are Bill Gates, it's chump change. However, if you compare this amount to the average salary in the USA of about $40,000, a decent car costing about $25,000, or the average house at $150,000, then the $100,000 number takes on a different dimension.

Also, let's say that a lucky DOND contestant by the middle of the game has removed ALL of the low-value briefcases from the game, ranging from $0.01 to $750.00. The expected value of this combination is about $262,000, which we would expect the banker to offer to the contestant.

Will the contestant choose $262,000? This amount represents about 6 years of salary, 10 decent cars, or paying off his mortgage and still having $100,000 left in cash (assuming no taxes are in the contestant's mind at the time of the decision).

And keep in mind that this is one VERY LUCKY contestant that has taken ALL The low-value briefcases out of the game. If I was the executive producer of the game, I would expect that most of the time the final award will range between $100,000 and $260,000 because people are people and make decisions based on THEIR expected value.

By asking for the contestant’s occupation and making sure that the game does not include stock brokers, CEOs, or mathematicians, the producers of DOND are able to keep the awards down.

Your thoughts?
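
The $262,000 figure above checks out. Assuming the standard US board values, the thirteen cases from $1,000 up average out to roughly $262,000:

```python
# Board values (assumed standard US lineup); the "lucky" scenario above
# removes everything from $0.01 through $750, leaving the 13 high cases.
VALUES = [0.01, 1, 5, 10, 25, 50, 75, 100, 200, 300, 400, 500,
          750, 1000, 5000, 10000, 25000, 50000, 75000, 100000,
          200000, 300000, 400000, 500000, 750000, 1000000]

high = [v for v in VALUES if v >= 1000]
mean_high = sum(high) / len(high)

print(len(high))          # 13 cases survive
print(round(mean_high))   # 262769, i.e. about $262,000
```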


Rich April 16, 2007

It is easy to make an Excel file which solves Deal or No Deal solutions. It took me 10 minutes to do…check it out.


Nellie April 17, 2007

This is basically just a comment to Chris, the author – I got so darn distracted by all the arguing in the other comments that I can’t respond to them –

Have you ever heard of risk aversion / risk neutrality? It’s an economic concept where the “player’s” attitude toward uncertainty affects the amount they would trade their decision-problem for, or what they would accept as an offer. This amount – the offer – is the certain equivalent, and the mean IS the CE for a risk-neutral person.

The banker probably assumes that the contestant is risk-averse, and that they would rather have a guaranteed payment than a “shot” at a larger number. So you use a different function to find the CE.

(Most of us are probably risk-averse, and interestingly, I don’t really think the math is defined for people who are “risk-favorable” – from what my class discussed, it just seems to be undefined. That’s how some of the people on Deal or No Deal seem to act, though!)

Anyway, the square root function represents risk aversion, and the exponential function is constant absolute risk aversion. Some risk aversion functions can depend on wealth, and there are factors that are adjusted in each of them (especially the exponential) to suit each person. You have to solve for CE. It's pretty cool stuff. It also explains why the banker's offers are so far below the mean in the beginning – he's assuming some function of risk aversion for them – that they'd pay a "risk premium" to avoid uncertainty.

I was actually fascinated by your discussion because it’s basically the same idea – and for a person playing the game, yours is WAY more practical. Very interesting! Great post (even though it’s old)! Definitely belongs on your “must-read” list. ;-)
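
Nellie's square-root certain equivalent can be made concrete using the scenario from the original post. This is only an illustrative sketch: the square-root function is one assumed model of risk aversion, not the banker's actual formula:

```python
import math

# The original post's scenario: five cases left, $80,000 offer.
cases = [100, 400, 1000, 50000, 300000]

# Risk-neutral certain equivalent: the plain mean.
mean = sum(cases) / len(cases)

# Square-root utility: u(x) = sqrt(x). The certain equivalent CE solves
# sqrt(CE) = E[sqrt(X)], so CE = (E[sqrt(X)])**2.
eu = sum(math.sqrt(c) for c in cases) / len(cases)
ce_sqrt = eu ** 2

print(round(mean))     # 70300
print(round(ce_sqrt))  # 27752
```

Under this assumed utility the certain equivalent is only about $28K, far below the $70,300 mean, so a square-root-utility contestant would happily take the $80,000 offer — consistent with Nellie's point that offers below the mean can still be rational to accept.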


Math/Stat/Econ Major May 5, 2007

Whether or not you should accept the offer should not be a matter of whether the offer is greater than the expected cash payment. Rather, the utility of the offer should be greater than the expected utility if you were to continue playing the game. Furthermore, utility functions will be different for each person who may play the game. I recommend you look up the concept of expected utility in an economics text. It is closely related to the reason people buy insurance. You pay more than the expected loss amount for insurance because you are risk averse.


ME May 8, 2007

I think you are right in most cases, but if, for example, there were just the $1,000,000 and the $100 left and the offer was $400,000, would you take it?
The offer is about $100,000 less than the mean, but I would still take it.

I would take any offer over $200,000 if I am missing a "safety net" — that is, if I pick the highest case, am I ****ed?


Jason May 8, 2007

Guess what, guys? I've got a good chance to be on DOND – as a supporter, that is. My mother-in-law got through the first two rounds of interviews and we're awaiting a call back from the producer. I'm actually a math teacher myself, so the probability analysis makes sense. Other than bringing a graphing calculator (probably against the rules), what do you think I should take to the game with me? I just started looking at various blogs today; we found out about our good luck yesterday.

Any ideas? Thanks.


bitguru May 10, 2007

There are too many comments here for me to read them all, so sorry if I’m duplicating.

Anyway, in all the times I've watched the show (which is only a half-dozen times, half of which were the Spanish-language version), I've never seen the banker offer more than the mean.

This makes sense, I guess, because it's in the banker's best interest (or at least the show's) to keep the tension going. When I put myself in the contestant's shoes, I find myself frustrated that the banker doesn't offer me better deals, but eventually I decide that I must accept one of the bad deals (based on my personal utility function).

So I'm not sure what was going on that the banker did offer more than the mean in your example. I agree that the player should probably have accepted the deal, but let's presume she continues. There is a 20% chance the next offer will be much lower. (In this case $21K is surprisingly generous, but I guess the show has two motivations: [1] get the next contestant on the stage, because the audience no longer cares how much she will win, and [2] they have a vested interest in not coming off as mean to their contestants.) But in the 80% case where the offer increases, how much will it increase by? The mean will go up by only 25% or so (from ~70K to ~88K). I wouldn't expect the banker to continue to be generous, but what do I know?
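
bitguru's "~70K to ~88K" figures can be checked directly from the original post's five-case scenario by computing the mean after each possible next pick:

```python
# The scenario from the original post: five cases left, $80K offer.
cases = [100, 400, 1000, 50000, 300000]
mean_now = sum(cases) / len(cases)   # 70300

# Mean of the four remaining cases after each possible next pick.
next_means = {}
for removed in cases:
    rest = [c for c in cases if c != removed]
    next_means[removed] = sum(rest) / len(rest)

print(round(mean_now))               # 70300
for removed in cases:
    print(removed, round(next_means[removed]))
```

Four of the five possible picks (an 80% chance) push the mean into the $75K–$88K range; opening the $300K case crashes it to $12,875, which is what makes the subsequent $21K offer look generous, just as bitguru notes.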


The Big Booper May 12, 2007

When I watch DOND, I ballpark the % of the mean by just adding up in my head the value of the largest cases still in play. It is obvious that the % value keeps climbing the deeper into the game the contestant goes. The only times the % drops is when the player has a bad round, knocking out two or more of the high value cases. I have seen a couple of games where the offer is larger than the arithmetic mean, usually towards the end when there are 3 or 4 cases left.

It would be interesting to record all the shows and then go back and see if the random number generator does follow patterns. The contestant would then take the first 2 rounds to try to see if the opened cases followed a past pattern. It would be up to the rooters section to track this since the contestant is “on” and there is too much going on to concentrate on establishing a correlation to historic spreads of the case values.


themetal May 14, 2007

I've been watching the show for a while now. While I do believe there is math involved in the later rounds, in the first few rounds the banker truly just throws a random number out there.
It became apparent after I saw three different shows with fairly similar openings (a bunch of low amounts gone, and the $400K case gone) – the banker's offers in the three scenarios ranged from a whopping $41K down to a ridiculous $9K (I am fairly certain the third show was a $23K offer, but not too clear on that).

Based on what I've seen, you can't make a reasonable estimate of the banker's offer mathematically for the first two rounds (offer ≈ Math.rnd()).

After that, the easy way to come close to the banker's offer isn't by working with the numbers you have; it's by working with the two numbers you will have in the best- and worst-case scenarios (the mean after the x lowest cases are eliminated, and the mean after the x highest cases are eliminated). That formula works until there are 4 cases left (including the one you're holding); once you're at that point, the closest formula seems to be mean + (random 1% to 20% of the mean, depending on the sob story of the contestant).

Anyway, just some observations to keep in mind that there might be 3 different formulas used in the game, rather than the 1 perfect one everyone is looking for…


Rob May 17, 2007

a couple notes.

first, the banker offer in each round is strongly correlated with the myopic expected value of the remaining cases… in general, the r-square is about 90% (which means 90% of the variance in the dependent variable – the offer – is explained by the variance in the independent variable – the expected value). the ratio of offer / ev does climb substantially in later rounds, and is frequently > 1 in round 9.

second, the accurate calculation of the expected value of any "no deal" decision is, in fact, far more complex than the simple average of the unopened briefcases. whereas the "deal" decision ends the game, the "no deal" decision continues the game. thus, the value of the "no deal" decision must include the expected value of ALL FUTURE deal/no deal decisions. this means the true expected value of each no deal decision must include the value of all possible paths through all possible remaining rounds. naturally, this requires some estimation of future offers.

finally, although making a deal/no deal decision based on risk-neutral expected value is optimal over a large number of games, it is not necessarily the best way for each individual contestant to make their decisions. most people are risk-averse, which means that they will accept an offer that is lower than the mean (however you calculate it) of the remaining boxes. this is not irrational, and not necessarily suboptimal: depending on attitudes toward money, starting wealth, and the like, some people are far better off accepting $50,000 than they are accepting a 50/50 chance to win $100 or $200,000. this type of decision theory is generally modeled using risk-utility functions.

i hope i haven’t repeated too much of what’s already been posted.
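
Rob's point that the no-deal value must fold in all future decisions can be sketched recursively. The banker rule below is purely an assumption for illustration (90% of the mean, rising above the mean on the final offer, echoing Rob's round-9 observation); the show's real formula is unknown:

```python
from functools import lru_cache

def offer(cases):
    # ASSUMED banker rule (illustrative only, not the show's real formula):
    # 90% of the current mean, rising above the mean for the final offer.
    frac = 1.05 if len(cases) == 2 else 0.9
    return frac * sum(cases) / len(cases)

@lru_cache(maxsize=None)
def game_value(cases):
    """Risk-neutral value of a tuple of live case values (one of them the
    contestant's), opening one case per round, with an offer each round."""
    if len(cases) == 1:
        return cases[0]  # forced to open your own case
    # "No deal": a uniformly random case is revealed, then play continues,
    # so the no-deal value already folds in every future deal/no-deal option.
    cont = sum(game_value(cases[:j] + cases[j + 1:])
               for j in range(len(cases))) / len(cases)
    return max(offer(cases), cont)

endgame = (100, 400, 1000, 50000, 300000)  # the original post's scenario
print(round(game_value(endgame)))          # 73815
```

With this assumed rule the game is worth about $73,815 — more than the $70,300 myopic mean — precisely because the future above-mean final offer is folded into every earlier no-deal decision.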


Lauren May 23, 2007

I’ve been trying to simplify the “deal or no deal” game to illustrate some basic conditional probability concepts. Suppose there are only six cases and only one is a winner. You choose a case as “yours” in the beginning of the game. Then you randomly select from the other five cases to open. Suppose also, as many of the actual contestants are, that you are driven to continue to open cases until you bust or are down to two cases. So, you either lose or get to choose from the last two cases, yours and the one remaining. Is this similar to the monty hall problem? Are you better off to switch? What are the associated probabilities?
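
Lauren's question can be settled by simulation. Unlike Monty Hall, the cases here are opened blindly (nobody with knowledge steers which cases get revealed), so conditional on surviving to the final two, staying and switching should be equally good. A Monte Carlo sketch of her six-case game:

```python
import random

def play(switch, trials=100_000, rng=random.Random(1)):
    """Lauren's toy game: 6 cases, 1 winner. Pick yours, blindly open 4 of
    the other 5; if you survive, optionally switch. Returns the win rate
    among only those games that reach the final two cases."""
    survived = wins = 0
    for _ in range(trials):
        winner = rng.randrange(6)
        yours = rng.randrange(6)
        others = [c for c in range(6) if c != yours]
        rng.shuffle(others)
        opened, last = others[:4], others[4]
        if winner in opened:
            continue  # busted before the final two
        survived += 1
        final = last if switch else yours
        wins += (final == winner)
    return wins / survived

print(play(switch=False))  # ≈ 0.5
print(play(switch=True))   # ≈ 0.5 — switching doesn't help here
```

The analytic version: P(win & survive) is 1/6 whether the winner is in your case or in the last gallery case, so the conditional probability is 1/2 either way. Monty Hall's 1/3–2/3 split only arises because the host knowingly avoids the prize.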


Harris Deitch June 3, 2007

Saw the show recently with the NYC subway hero. He was down to 3 cases: $1,000,000, $20,000, and a small case. The offer was $305K. I heard him say to his sister that should he pick anything but the mil, the next offer would be around $500K. Correct. However, he did not do the math. He was risking over $300K to win another $200K. A bad bet. He should have taken the $305K. Instead he went one more case, picked the mil, and wound up with zip, though Chrysler gave him a new Jeep. No one does the math. It's always one more case, then one more. Or the contestant deserves so much more. Or pay for my house, car, school, and everything else. Greed sinks in. If there is only one big case left, take the offer and get out of there. If there are two big cases left, but one is much higher, take the offer and get out of there. The game is not what's in your case, but what's on the board and the offers.


Anonymous pussy who didn't leave an email address June 8, 2007

This was one of the more entertaining comment threads I’ve read in a while. Of course, it also became one of the more personally offensive I’ve read. I’d offer my ideas on the show, but since my degree in computer science is only from a state school, it couldn’t possibly have any value.

For what it’s worth (which isn’t much because I didn’t pay 30K/yr to go to school), I don’t think either Josh or Chris are 100% correct, in that neither of them provided a bulletproof theory.

Chris has argued a few different things about the mean (always take over, never take under, etc.), but at one point boiled his argument down (in bold) to "never take an offer under the mean". Of course, a few comments later he said he would do just that in a certain case… So you can see, for all his fire and brimstone, he hasn't even convinced himself he's correct.

Josh seems to be considering only the probability that the offer will go up in the next round. This is a simple theory, and easy to apply on the fly. Unfortunately it doesn’t seem to consider the risk/reward; that is, how much is the offer likely to go up versus how much do I stand to lose in the worst case scenario?

While I haven’t seen a precise mathematical formula from Blake, he seems to be the only one taking into account the mean, the probability of an increasing offer, AND the opportunity cost. If I ever go on this show, I’ll be looking to take Blake with me as one of the 3 people on stage.


Mozart June 9, 2007

Thnx for the great statistics… it really helped me while doing my project on Deal or No Deal for probability.


jo July 10, 2007

so i can beat the banker


pDale Campbell July 20, 2007

‘Tis a beautiful thing! I found this page while looking for the real banker’s formula. It is great because, assuming a strategy that can be stated by hard-and-fast rules (anything else is really just relying on luck):

(1) Someone who actually knows statistics (unlike me and most of the people commenting) could prove which strategy is optimal.

(2) Someone who can program (yep, that’s me) could “statistically” prove which strategy is optimal.

(3) In the end, statistics should not be 100% of the decision for someone on the show because you only get to play once (unless you go home with only $1 ;) ).


Dave July 26, 2007

I have a question:
do I need a different approach when I have to open more than 1 suitcase if I don’t accept the deal
(like earlier in the game)?


Caligari August 16, 2007

there is much more here than just the simple mean analysis offered by the author. you should check the risk theory and utility functions to learn more about it.

there even exists a neat counter-example showing why the mean value is sometimes completely wrong.


Adam September 17, 2007

lol, after seeing this, I went to play the DOND game, and just out of sheer luck, I picked the million-dollar case off the start


james October 11, 2007

Your math is wrong, take a game theory class.


Matt October 19, 2007

Interesting post. I’ve been wondering if there’s a significant probability that two values close to each other on the prize scale are not adjacent to each other in the suitcase number layout. For example, if you knock out the 750K on suitcase 5, and you accept the producers’ assertion that the values are randomly situated, is there a higher level of probability that you can pick suitcase 4 or 6 without knocking out the 500K or 1M?


blake October 19, 2007

well, i haven't watched the show in a while, so i watched tonight. it was interesting because it was another test of my above strategy…it would have been a little hard for me to stick to my rules tonight, i thought, you know, greed kicks in. here was her situation when i turned it on: she was being offered 80k, she had to remove 3 cases if she said no deal, and 1 mil, 750k, and 200k were still up, and i think 25k was the next and then a bunch on the left side of the board. i said it would be hard to walk away with the chance to make a whole lot, but if i want to make sure i walk with a lot…i gotta stick to my rule and take the deal and walk away. well, she listened to bad advice and her picks revealed 25k, which she was assured wasn't bad, then bam, 1 million, and before she could recover, bam, 750k. and then the offer was 18k, and she had to play the odds a couple rounds and hope the 200k didn't come. she worked back up to 49k til she couldn't stomach it anymore. the point is your strategy has to be valid at each point a choice has to be made. in this game the critical choice was when she continued to play knowing she had to remove 3 cases when she only had 2 very high values left. i know some of you will still say, "well, for every time this happens, someone else does the same thing and walks with a huge amount." and that would be great if they all shared the winnings (law of averages, huh?), but they don't. luck sets the ceiling; the strategy is that you walk with the "guaranteed", for-sure, highest amount. beyond that you are just gambling that highest safe amount against lady luck. seems my strategy is still holding up quite well.


Leo October 26, 2007

Now, don't jump all over me for writing this… At the start of the game the contestant picks 1 case out of 26. No matter what is revealed along the way, the odds of having the mil in the chosen case never change. It remains 1 in 26. So, if towards the end there are 4 cases left, Howie will LIE to the player by saying, "There is a 25% chance that your case holds the mil!" WRONG, HOWIE – it's 1 in 26. When a player gets down to the last few cases, they should weigh THAT against the offer and the value of the remaining cases.
BTW – the Monty Hall Problem [mentioned in a post above] does not apply here, since Howie has no knowledge of what each case holds, nor does he alter his "deal" because of what's left.
Oh… one more thought. It becomes harder and harder to resist hitting the DEAL button as cases are eliminated, but let's face it: if you're going for the mil, you will have to open ALL the other big-money cases without being scared off.


andrew October 30, 2007

that is COMPLETELY ABSURD. there is not a very big chance that you’ll have 1 million among four cases, but if you don’t, you won’t be making that decision! so when it DOES happen, you DO have a 25% chance of having it.

i’m telling you right now, that i have steadily refined the statistics of this game and i have a full system that explains when you should turn chicken and when you shouldn’t, depending on the numbers, the cases, your feelings about the cases, and the realities of winning that money in life.


andrew November 2, 2007

see, the major problem with that analysis is: there may be only a 1/26 chance that it was in there to start, but if it’s NOT in there, then it’s in the last case in the gallery – and you have a 1/26 chance of leaving that one until last! so they both cancel out and that leaves you with a 50-50 chance of having the bigger number.
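
andrew's cancellation argument can be made exact with fractions: the million is equally likely to be in the contestant's case or to be the last surviving gallery case, so conditional on reaching the final two with it still alive, the split is 50-50:

```python
from fractions import Fraction

# P(million is in your case) = 1/26.
p_yours = Fraction(1, 26)

# P(million is in the gallery AND survives to be the last gallery case)
# = (25/26) * (1/25) = 1/26 — exactly the cancellation andrew describes.
p_last_gallery = Fraction(25, 26) * Fraction(1, 25)

# Conditional on reaching the final two with the million still in play:
p_cond = p_yours / (p_yours + p_last_gallery)
print(p_cond)  # 1/2
```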


leo November 2, 2007

I "take back" my "1 in 26" comment. Sure, the game starts with a 1/26 shot at the mil, but along the journey, 24/26 of contestants won't "make it" to the final 2. The VERY lucky one who does will indeed have a 50-50 shot. No wonder I almost flunked Probability & Statistics!


Brian November 9, 2007

The easiest rule of thumb I think to follow as far as when to take the deal is generally:

"Take the banker's offer if it is larger than the second-biggest case still in play, as long as you still have two or more big cases in play."

So as long as you have at least two big cases still in play, you still have a safety net if you’re only opening one case at a time so you should keep going.

If you’re lucky enough to get to the end with only cases that have big amounts, then you’ve really got nothing to lose. You’re going home with a big number no matter what.

But if you ever get down to only one big case and several smaller cases, yes your odds are good of picking one of the smaller ones and increasing the banker’s offer the next time, but if you pick the big one you are screwed.

So I always look at the second-biggest case still in play, and if the banker's offer is over that, then the only chance you'll beat that number is by picking a case besides those top two, and I don't think the odds of trying to get that top case or increase the banker's offer are worth the risk.

So many people seem to play until they’ve only got one big case left and then they face the dilemma of “just one more” and never know when to stop.

I say stop while you’ve still got two big cases left if the banker’s offer is above your second biggest case.

No statistical analysis to back up this theory, but as someone who watches the show a fair amount, it seems to hold true.

So many people start well initially and get some big banker’s offers, but they take it too far, open some big cases, and then the offers start going down instead of up and they keep trying to get back up to those big offers and rarely make it. Like a gambler trying to win back his losses by increasing his bets.

Yes some will do better by being aggressive. And some will do much worse.

With this rule of thumb, the only way to get big money is to be lucky enough to keep at least two big amounts fairly late into the game, but so many times I see someone with only one big case left open “just one more case” to try to get a bigger banker offer and those people seem to lose more often than win.


andrew b November 10, 2007

you do know that by following this credo, you have a 0.074% chance (1/26 that the $1M is one of the last three × 3/26 that $400K or more is also one of the last three (so that your offer won't exceed it) × 1/3 that you'll remove the smaller amount × 1/2 that you have the million) of winning the $1M!
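
Multiplying out andrew b's factors confirms his figure: the product is about 1 in 1,352, i.e. 0.074% when expressed as a percentage:

```python
# andrew b's chain of probabilities, multiplied out.
p = (1 / 26) * (3 / 26) * (1 / 3) * (1 / 2)

print(f"{p:.6f}")         # 0.000740 — about 1 in 1,352
print(f"{p * 100:.3f}%")  # 0.074%
```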



leo November 10, 2007

to andrew b…
1 – If I had a 0.074% chance of winning anything, I would go home! Do you mean 7.4%? That's what I get by multiplying all your numbers.
2 – I'm way too tired to follow this logic, but it doesn't seem that the percentages are calculated correctly. What "credo" are you replying to?
3 – Maybe tomorrow morning, with a clear head, I'll understand the stats. Phewww – I'm tired.


Matt November 13, 2007

The online game is broken. There were two cases left – one was $500,000 and one was $750,000 – and the bank offered $440,000.

I obviously said no deal, and opened my case, which had $500,000.


daniel November 22, 2007


It seems to me that her choice was not that misguided: she had only a 20% chance of holding the $300,000 briefcase in her hand, true, but she also had only the same 20% chance of opening the $300,000 case on her next pick.

Thus, she had an 80% chance of opening $50,000 or less on her next pick, thereby increasing the next offer to be presented to her. Her choice may not have been the choice of the so-called "rational player," but the rational player plays an infinite number of times and can expect no more than the average outcome, whereas the human player plays only once. Obtaining exactly the average result in a random experiment done only once is not guaranteed; in fact, it is unlikely. Furthermore, considering the smallness of the initial investment, the actual risk one exposes oneself to is immaterial. Besides, one also plays for fun.


Nancy December 8, 2007

Interesting… I was playing the online game, and I had two amounts still in play, $750,000 and $1,000,000. The offer was $568,875. Does that make ANY sense at all? Who would take that? I now know the algorithm used for the online game has to be different from the TV show's; they could/would never do that. On another note, do you think the banker on the show knows what is in the player's suitcase and makes his offers accordingly?
Thanks for any insight.


Sal December 11, 2007

OK Everyone is forgetting the most important variable….

How much $$$ will be enough to change one’s life 50K, 100K, 500K??

This variable should be taken into consideration when deciding to take a risk to get to the next round.

Most contestants forget about this as well.


pDale Campbell December 11, 2007

“How much $$$ will be enough to change one’s life?”

Now there's an interesting question. It reminds me of the German proverb, "To change, and to change for the better, are two different things."

I've heard many stories of how big lottery wins have destroyed people's lives. I wonder how big-money game-show contestants have fared?


leo December 11, 2007

Sal, not everybody forgets the variable you mentioned. I don’t.
However, "playing" along with the contestant in front of your TV set at home is much different from actually making the decisions on the spot, under the lights, being stared down by Howie. Add to that the director, who will "stop tape" to coach the contestant into acting jubilant. YES – they DO THAT! The typical contestant cannot think through the odds, or make rational decisions about a "life-changing amount of $$$," in a measly 10 seconds.


Gary Huynh December 17, 2007

This contestant I’m going to tell you about makes the one in your post look smart. The contestant was offered 110,000 and she chose no deal to opt for a 50% chance of winning 200,000. That’s despite everyone in the audience and her whole family telling her to take the deal. I wouldn’t take it even if I had a 99% chance of getting the 200k case.

Some people…sigh


leo December 17, 2007

It's strange. Contestants feel that they've come with nothing, so what-the-hell, go for the BIG money. The thing is… they only get ONE SHOT at playing. There are no "do-overs"!! They take the risk… maybe getting VERY lucky for a few rounds, then bombing out and going home with $10. Reason flies out the window; perhaps greed takes over. Then they are all "smiley" on camera, but I wonder how they feel when they get home? Hide that butcher knife!!
After the "bomb," their loved ones ALWAYS say, "It's okay, it's okay." Well, it's NOT okay… stupid, stupid, stupid people!!!!!


andrew b December 17, 2007

yes, the book was overwhelmingly in favor of DEAL….

but until then, how many of 170 contestants had the chutzpah to play it? ZIPPO!

in terms of suspense, compare if she said DEAL to if she said NO DEAL, and i think the results are startling!

the only way to win a large case is to either risk it in this situation, which only 2 people have done, or to get two large numbers down to the final 2, which is very unlikely!

if you actually had a 99% chance of getting the 200K, then it means the book says NO DEAL! really, what is the chance of picking a number from 1 through 100 right on the nose?


Jessica Prewitt January 7, 2008

Well I have never been on Deal or No Deal but I want to!!


mary wallace January 9, 2008

I watch Deal or No Deal all the time. I love this show and I hope and pray one day I can be on it..


Tim January 10, 2008

The fundamental flaw of the position the author takes is the incorrect assertion that the goal is to beat the mean. In many scenarios, that would be a good strategy, but this is not one of them.


Jeff January 10, 2008

I just spent the last 4 hours building a spreadsheet and a mock-up DonD board…I got issues, I guess. Then I did a search to see what other people came up with; I got about the same thing the poster did, in addition to the increasing percentages of the mean. But they do interviews too, so I am guessing statistical six-sigma types don’t make it on…


shirl January 13, 2008

I’ve read all the comments and nobody has mentioned the thing that really bothers me. How come one of the last two amounts is always in the case the contestant picks? Surely you can put all your mathematical skills to work to bring up an astronomical percentage against that happening every single show!


Howie January 13, 2008

Yeah Shirl… Since I’m also from the planet Twilo, I totally understand what you mean!!! ;-}


Shirl January 14, 2008

Did I really say that? I promise I’ll never leave Twilo again!


Nathan January 14, 2008

So I just played the game and got down to two suitcases, 500,000 and 1,000,000. The bank offered me 413,000. Glitch in the game/calculation?

Also, does anyone think that the show knows what is in the case ahead of time?


andrew b January 14, 2008

i think something is seriously wrong with the show nowadays…..

the theoretical probability of both millions disappearing from the game out of eight cases is 1/28…and it happened to both players who encountered it.

tonight, six cases and two millions. two rounds, two millions. 1/15 odds overridden.

the chances of this continuing any longer are astronomical!


blake January 22, 2008

the name of the show should be greed. so i watched today again on cnbc, this one with the motocross contestant. this time it started with 4 $1 million cases. he was down to 8 cases left and had to remove 2 if he said no deal. he still had 2 safety $1,000,000 cases left on the board, then $75K, then low amounts. the offer was $166K. the wife said deal. he said: i’m a risk taker, no deal. boom, 1 mil gone; boom, the other mil gone. new offer: $9,000. $166,000 was well below the mean, so the odds were in his favor, but he didn’t play it conservative, and he doesn’t get another chance to prove to us the law of large numbers would have made him rich. oh well. just proves again what i’ve been saying. (weird: it seems like every time i check the show and people risk it with no back-ups, they’ve been getting burned more than the statistical average. guess that proves the law of small numbers.) btw, it was nice of the banker to then offer him $27,000, above the mean, with 4 cases totaling just over $100,000 left on the board. this game isn’t rocket science.


The Big Booper January 22, 2008

I also saw the motocross guy’s show tonight on the second CNBC rerun. When he gets down to 8 cases, the odds of opening both million dollar cases are (2/8 x 1/7), or 1/28. That is a pretty good reason to continue. Of course, if he only exposes 1 million dollar case, then the offer should be $1,075,000 / 6 and then a roughly 65% factor, or around $115,000 to $120,000. If both biggies are not exposed, then we are looking at $2,000,000 / 6 with a factor of say 75%-80%; that is real money, say $270,000 – $280,000. Definitely worth the risk.


The Big Booper January 23, 2008

While scrolling down past the black noise bars to get to the end of the thread to post my answer, I noticed an earlier post that commented on the desired type of contestant, not too swift upstairs, likeable and excitable.

The only members of Mensa on camera are up on the pyramid, standing in high heels and holding cases on the pedestals: Aubrie, and Laura from last year.


andrew b January 23, 2008

i might add that in five games, four people had 8 cases and 2M. they ALL lost both 1M cases. plus we have margarita, who had 6 cases and 2M; she also lost both over 2 rounds. the total chance is almost 10,000,000 to 1 against. against such odds, there is no such thing as greed!


Ursula January 23, 2008

To “andrew b,”
Could you explain how you calculated your “10,000,000 to 1 against” probability? In my head I figured it’s about a 1 in 140 shot of this happening. That’s still a rarity, but not nearly as astronomically scarce as your stats.


andrew b January 23, 2008

(1/28)^4 * (1/15) = 1/9,219,840
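For anyone who wants to verify that arithmetic, the exact value drops out of a few lines with Python’s `fractions` module (a sketch; the 1/28 and 1/15 figures are the per-game odds discussed in the comments above):

```python
from fractions import Fraction

# Opening both $1M cases when 2 of 8 remain and 2 cases are opened:
p_eight = Fraction(2, 8) * Fraction(1, 7)   # = 1/28

# Opening both $1M cases when 2 of 6 remain and 2 cases are opened:
p_six = Fraction(2, 6) * Fraction(1, 5)     # = 1/15

# Four 8-case games plus the 6-case game, all independent:
combined = p_eight ** 4 * p_six
print(combined)  # 1/9219840
```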


Nancy Sauvageau January 23, 2008

My name is Allie Sauvageau. I am 15 years old, and my family and I live in the small town of Three Forks, Montana. There is something about your show that makes my mom obsessed, but she loves it! She has watched every episode. It’s a dream of hers. Can you make it happen?


The Big Booper January 23, 2008

Allie –

The people posting here don’t run the show, we are just viewers discussing the show. If your mom wants to get on, have her contact the producers.


Layla January 24, 2008

Hi everyone! Just bumped into this website while doing some research on Deal or No Deal for my Math Statistics class. My husband and I love the show and have considered trying to get him on it. I haven’t read through all the postings, but here’s a thought…anyone ever considered the probability of even getting up there on that stage?? This is my first Statistics class, and I’m having quite a time figuring it all out, so I really don’t have the knowledge (or patience, unfortunately) to work it out!


Morris T. Katt January 28, 2008

Layla – The probability of getting on the show is not something that math can figure out. It has more to do with:
– Having all your teeth
– Being absolutely CRAZY [or faking it]
– Displaying a totally over-the-top personality
– Having an obsession, like watermelons, so the banker can spit 3 million seeds at you
– Memorizing and screaming the phrase “That’s what I’m talkin’ about”
– Having family members console you saying “That’s alright,” after you blow EVERY large amount on the board


Stats. Stan January 28, 2008

Just a quicky…
All the mathematical analysis in the world cannot account for human behavior, especially when that human is hyped-to-win, and rational behavior has flown out-the-window.
Consider that at the beginning, the chances of picking the 1Mil$ are 1 in 26. As more and more cases are opened, if the Mil$ case is not opened, it becomes more and more likely that THIS contestant is that lucky ONE in TWENTY-SIX who has actually chosen the big-prize. The excitement builds. Logic falls. If the contestant is VERY lucky, “he” will get all the way to the final offer, where 3 cases are left. At this point the odds are 1 in 3. It NEVER gets any more probable than that [since that’s the last offer and the last opportunity to choose for the contestant]. NEVER… 33 1/3% chance.
…Replies please. My twisted logic could be missing something.


The Big Booper January 28, 2008

Just remember, if the contestant gets down to two cases, the final offer is…


talk about an opportunity to second guess oneself.

And when they had the 2 hour special a few weeks ago where 3 contestants played, one after the other, they all were playing to get the highest score for the night, because the winner took home the amount in all the cases. This totally changed the dynamics of the game as all three went to the final 2, not knowing what the other 2 had done.


andrew b January 28, 2008

you see, the problem is, people don’t know about it.

they go on to the end believing it is in there because they think that they are taking home their case, and thus don’t switch, because that would violate their own reason for dealing.


Brett January 28, 2008

i’m having a deal or no deal game show night at my high school this friday and i need to tell my banker what formula to use to calculate how much to offer!!
can someone please post a formula that they have found to work or send me to a website that has it??



andrew b January 28, 2008

the banker usually starts at about 25 to 40%, then increases it to about 70 to 80% by six cases, then usually gets closer to 100% as it continues, but it depends on:

the cases opened recently (recent big amounts often drop an offer)
the safety net, or lack of one (the skewing can go quite low without a net)
the need for money (the offer can be set up to balance the contestant’s need with wanting them to either crash or go home)
the banker’s attitude (confidence, fear, disdain, etc…)
the banker’s, or contestant’s, instinct about a case’s contents (a contestant can be offered more for a case they think has a big one)
the contestant’s risk tolerance (riskier contestants get tougher to remove, so the only way to do that is to offer more)
a specific number important to the contestant (often added to the end of the offer)


Brett January 28, 2008

so are you telling me there is no actual formula to determine the number?

earlier posts said that there was?

im confused


The Big Booper January 28, 2008

I agree with Andrew B.’s analysis.

The starting point is the arithmetic mean of all the cases left (total value of all cases / number of cases). From there, the banker adjusts up or down based on:

1. What round it is (the % goes up as the game goes on). I have seen games with 2 biggies left out of 4 where the % is at 100% of the mean or even higher. If the player flames out and has the $25,000 and two small ones, for instance, the banker has offered 110% to encourage the player to walk with some dignity and some cash.

2. If the player has a bad round, say with 11 cases left and 3 biggies, and exposes 2 out of the 3 biggies, then the % offered will be lower than usual since the result is a downer. Conversely, if the player exposes none of the 3 biggies at the 11 case level in getting to 8 left, then the % offered will be higher than usual, at that level.

3. If there is a big gap between the highest and next highest, the offer will be reduced for that reason, such as $1 million and $100,000. If the player has the $400,000 and $500,000 left, that should bring an offer closer to the mean ( I am assuming the 6 or 8 case level here).

4. When there is a trick prize, such as the lime green Cadillac Escalade on last year’s show, then that will be held until the contestant is close to it, I think. That was offered to the girl when she had 1 big case left at the 6 case level, when most rational contestants walk (Brooks Leach aside) so she took the offer. Since Cadillac must have paid a fee to get the car on the show, if she had had a bad first 2 rounds, it might have come out earlier when the producers knew she couldn’t get back to the $70,000 value. And they had to get the car onstage, even if it was priced too high for the offer.

BTW, the executive producers make the offers, not the actor in the booth with the headset. Wikipedia has named the actors playing the banker in an earlier article which I can’t locate anymore; the article listed most of the contestants and what they had won. I discovered this thread while googling, trying to get back to that thread, which may have been “poofed.”

But you get the drift, the mean is the starting point and then it is adjusted up and down by various factors. Watch the show on a recording where you can stop the tape / disc and enter the info in a spreadsheet and you will see.
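The recipe described above (the mean as a baseline, a percentage that ramps up by round, then situational adjustments) can be sketched in a few lines of Python. The 25%-to-100% linear ramp and the round numbering are illustrative assumptions, not the show’s actual formula:

```python
def banker_offer(cases, round_no, total_rounds=9):
    """Sketch of the offer logic described above: start from the
    arithmetic mean of the remaining cases, then scale by a percentage
    that grows as the game goes on.  The 25%->100% ramp is an
    illustrative guess, not the show's real (and secret) formula."""
    mean = sum(cases) / len(cases)
    # Linear ramp from ~25% of the mean at the first offer toward ~100% at the last.
    pct = 0.25 + 0.75 * (round_no - 1) / (total_rounds - 1)
    return round(mean * pct)

# The original post's scenario: $100, $400, $1,000, $50,000, $300,000 left,
# which (assuming the standard 9-offer structure) is around the 6th offer.
print(banker_offer([100, 400, 1000, 50000, 300000], round_no=6))
```

For the post’s board this simple ramp gives roughly $50K, while the real banker offered $80,000 against a $70,300 mean; that gap fits the point above that late-round percentages can meet or even exceed 100% of the mean.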


Brett January 29, 2008

Thank you very much for the help!


mdoerr February 4, 2008

I’m seeing a couple of things that the contestants are missing. Forgive me if these have been brought up before.
First is a misunderstanding of the odds. The odds are 1 in 26 of selecting the $1M case. The odds do not change just because the other possibilities are exposed. When there are 4 cases left and the $1M has not been revealed, the odds of it being in the contestant’s case are still 1 in 26, not 1 in 4, regardless of what Howie says.
Second, nobody is looking at the downside risk; everybody is always looking at the upside. Few contestants apparently think about what happens if a high value case is selected. The results can be devastating. If I have the $1M and $75K cases left, along with a $25K and $5K case, and I pull one of the big cases, the offer drops drastically, and it’s ruinous. I frequently put myself in this scenario: I have just won $250,000 at a casino and am walking out the door with my winnings when a $1M game catches my attention, a game of war…would I bet $250K on a one-shot, two-shot, or three-shot chance at a mil? Of course not. I’m gonna find the best dinner I can and enjoy the rest of the evening.


nick February 5, 2008


That same conclusion about the probability has been reached before for the show “Let’s Make a Deal”; that version of the problem always involves 3 doors (a, b, and c), one of which is your door (we’ll say door a). Your logic yields that if c is revealed as not the prize, then the probability of a being the prize is 1/3 and the probability of b is 2/3. But if you consider from the initial condition that the probability of it not being a is 2/3 and the probability of it not being b is also 2/3, you get one of two paradoxes:

a) the chance of b must remain 1/3 and the probability of a is 2/3 which contradicts the initial statement.

b) the probability of not a remains 2/3 and the probability of not b remains 2/3 so you will have a chance of 4/3 that a or b is the prize since a and b are mutually exclusive. Since the probability of a or b is 1 and 4/3 is not equal to 1 the probabilities do not work.

One cannot apply the logic to one term but not the other, so the probability of any case being any value is always 1/(the number of cases remaining), provided the value is not already eliminated. I’m sorry for referencing another show, but I am too lazy to deal with the larger numbers of “Deal or No Deal”. Your second point, however, is sound (in Game Theory it is known as the “Nash Equilibrium”).


andrew b February 6, 2008

this is the problem, mdoerr…

what is the chance of picking the 1M right out? 1/26.

what is the chance of picking something else and leaving the 1M as the last case in the gallery? 25/26 * 1/25 = ALSO 1/26.

what is the chance of removing the 1M if it was chosen in the first 24? NADA.

so, since the 24/26 hasn’t happened, it’s a 50-50 shot between 1/26s.


grey.ghost February 11, 2008

ha ha – I just played it online the first time & it came down to the million dollar case & $1… the offer was in the $300Ks… but i totally kick butt and ended up picking the million dollar case (sweet)

my girlie then played – the first case out was the million dollar case.

i think the lesson here is that i’m much cooler than she is. and you’ve helped me mathematically prove it. ha! i rock!


Com2 February 12, 2008

You flip a quarter 9 times & it lands heads 9 times.
Statistics tell you that the 10th time will be heads because it’s been heads every time before. The odds will tell you it is a 50/50 shot of being heads. Common sense tells you it’s bound to be tails because it wasn’t 9 times in a row.


Shawn February 12, 2008

Com2: Statistics tells you only that the expected result will follow the probability density function of the coin toss. ASSUMING it’s a fair coin, that means that the probability of heads is 50/50. If you don’t know it’s a fair coin, then the long run of heads may give you reason to believe it’s not fair.

As for common sense, when it tells you that nine heads in a row means the next one must be tails, it’s lying. Assuming the coin toss is fair, the fact that an event with probability 1/512 (nine consecutive heads) has already occurred has no effect at all on the next toss.
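Shawn’s point is easy to demonstrate numerically. A minimal sketch (assuming a fair coin): simulate ten-flip runs, keep only those that open with nine heads, and check the tenth flip:

```python
import random

random.seed(1)

# Among sequences that start with nine heads, check how often the tenth
# flip is also heads.  For a fair coin the streak tells you nothing.
qualifying = 0
tenth_heads = 0
for _ in range(1_000_000):
    if all(random.random() < 0.5 for _ in range(9)):  # nine heads in a row
        qualifying += 1
        tenth_heads += random.random() < 0.5          # the tenth toss
print(tenth_heads / qualifying)  # ≈ 0.5
```

Roughly one run in 512 qualifies, and among those the tenth toss comes up heads about half the time, exactly as the probability density function says it should.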


rod s. February 12, 2008

Last comment [hopefully] about a coin-toss…
Each toss is a unique, isolated EVENT when considered separately. Of course the odds for coming-up heads is 50-50 [with an unbiased coin].
Over the long haul [a large sample of tosses] it is PROBABLE, or LIKELY that the law-of-averages will kick-in to produce heads coming up as many times as tails. People tend to wrongly analyse the results by not using a large enough sample of events. It’s flawed human logic.
STATISTICAL TRENDS would state that if heads has ALWAYS come up, it will continue to come up. I don’t think TRENDS is a valid analysis for arbitrary coin tossing, right?
Then, of course there is that one-in-a-million shot that the coin will land on its edge [standing up on its side], which according to that Twilight Zone episode, would allow Dick York to read people’s minds all day!


andrew b February 12, 2008

the chance that someone has picked a 1M by now is circa 95%….it’ll be up to 97% with this game.
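andrew b’s running tally is plausible. Assuming one $1M case among 26 per game and somewhere around 76 games played by then (my guess at the count, not a sourced figure; Million Dollar Mission weeks, with several $1M cases, would differ), the chance that some contestant has held the million works out like this:

```python
# Chance that at least one contestant has chosen the $1M case after
# n independent games, assuming a single $1M case among 26 per game.
def p_someone_picked_million(n_games):
    return 1 - (25 / 26) ** n_games

print(round(p_someone_picked_million(76), 3))  # ≈ 0.95 after ~76 games
```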


J. J. New Eyes February 15, 2008

Ahhhhh! The program is getting tired. It’s turning into a freak-show. It makes me happy that I’m not a member of “this” human race. It’s all beginning to sound “scripted”… and the contestants are not what you’d call seasoned actors!
BTW, Last week I was flabbergasted that kissing a frog would have given the woman OVER 1/2 a million bucks… and she said “NO DEAL!”


Matt February 15, 2008

I wouldn’t be surprised if somewhere between the time a person first decides to try out for the show and the time that same person is standing on the stage they take a test like the kind so often used in job applications. I’m talking about the kind of test with over 100 questions that go through situations and ask you to fill in (1) for definitely agree, (2) for somewhat agree and so on. Just as those job tests wring you around, sometimes asking the same questions multiple times at various locations, seeking to ferret out true responses as to how honest a prospective employee is, I can imagine a test existing which filters out people who are psychologically pragmatic and therefore less likely to turn down $100,000+ while they act like a fool on national TV.


andrew b February 15, 2008

J.J., you don’t get it: his wording may have suggested giving her $532K, but what likely would have happened is the banker bringing out another card with the FONT size of “$266,000” doubled.


Fred February 16, 2008

I just played 5 games. I was NEVER offered more than the mean, usually far below it. The game seems to be set up to sucker the contestant to stay in looking for a fair offer, based on the odds, which never materializes.

BTW, the real game gives the contestant, when the game is down to two cases, the chance to exchange his chosen case for the one left in the field. The online game gives the contestant no such choice. Seems to me this is significant, since the original choice has a 1 in 26 chance of holding the million. This means that the field has a 25 in 26 chance of holding the million. The odds are that the million is still in the field, no matter how many cases have been opened.


J. J. New Eyes February 16, 2008

andrew b – Even if the banker made a “trick” statement, still.. $266K ain’t too shabby an offer, especially when she plainly stated that her “goal” was to walk away with ~$150K.


leo February 16, 2008

Fred – regarding your “BTW” statement….. The “switch case” analysis is not as straight-forward as it appears. Sure the odds are 1/26 at the start, but as more and more cases are opened [without revealing the $1M] the odds get better that the initial case contains the million. By the time that only 2 cases are left, each case has a 50/50 shot at being the million dollar case.
I too originally thought that the odds remain 1/26 [for $1M in the original case], but many people have pointed out to me that my logic is flawed.


Fred February 16, 2008

Not so, Leo. At the start of the game, the odds are 1 in 26 that the contestant picks the mill. That means the field has a 25 out of 26 chance that the mill is still out there. Those odds don’t change just because a few cases get opened. You’d be right about the 50/50 thing if the contestant had to choose a new case every turn, but that’s not the situation.


andrew b February 16, 2008

here’s why it’s 50-50, fred..

what is the chance of picking the 1M immediately? 1/26.

what is the chance of picking something other than the 1M and leaving the 1M as the last case in the gallery? 25/26 * 1/25 = 1/26.


Fred February 16, 2008

Again, I disagree. What if there were 100,000 cases? The contestant picks one. What are the odds that he picks the mill? 1 in 100,000. That means the overwhelming odds are that the mill is still in the field. The million dollar case is NOT going to magically move, no matter how many cases are opened.


Josh February 16, 2008

Fred, read this.

I posted it halfway up the page a long time ago. It explains why it’s 1/2.


leo February 16, 2008

Fred [at 6:14pm] – Whenever I want to illustrate a point I too exaggerate the issue to make it very plain exactly what’s goin’ on.
So… there ARE 100,000 cases and indeed, the one you pick has an ULTRA-SLIM chance of holding $1M. That’s absolutely true.
BUT – you then open 99,998 more cases and STILL the $1M case has not appeared. {That’s incredibly rare; in fact, it will only happen to about 1 in 50,000 contestants.} The “new” odds of the FINAL two cases are 50/50. Switching your case choice will not give you any advantage. If you could set up a computer program to play many-many millions of games and then filter out the scenario I just described, you’d see that 1/2 the time the first case picked has the $1M, and the other 1/2 of the time, it’s the last case left in the field.
The analysis is NOT the same as “the Monty Hall Problem,” where Monty has “inside info,” and shows only what he “is allowed” to show [making better odds for the contestant by switching “doors”].
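The distinction leo draws (Monty’s inside information versus a blind, random reveal) is exactly what a simulation shows. A sketch of both three-door games, with the trial count chosen arbitrarily:

```python
import random

random.seed(2)

def monty_hall_switch_wins(trials=100_000):
    """Host KNOWS where the prize is and always opens a losing door:
    switching should win about 2/3 of the time."""
    wins = 0
    for _ in range(trials):
        prize = random.randrange(3)
        pick = random.randrange(3)
        # Host opens a door that is neither the pick nor the prize.
        # (Which one he opens when pick == prize doesn't affect the stat.)
        opened = next(d for d in range(3) if d != pick and d != prize)
        switched = next(d for d in range(3) if d != pick and d != opened)
        wins += switched == prize
    return wins / trials

def dond_switch_wins(trials=100_000):
    """A RANDOM case is opened; games that reveal the prize are thrown
    out (as on the show).  Conditioned on surviving, switching should
    win only about 1/2 the time."""
    wins = kept = 0
    while kept < trials:
        prize = random.randrange(3)
        pick = random.randrange(3)
        opened = random.choice([d for d in range(3) if d != pick])
        if opened == prize:
            continue  # prize revealed -- this game never reaches a final 2
        kept += 1
        switched = next(d for d in range(3) if d != pick and d != opened)
        wins += switched == prize
    return wins / trials

print(monty_hall_switch_wins())  # ≈ 0.667
print(dond_switch_wins())        # ≈ 0.5
```

Same final picture on stage, two cases left, but only the informed host makes switching worthwhile.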


Fred February 16, 2008

Josh, your link makes some pretty weird claims. I particularly like this one: “For example, if I asked you the chance of rolling a 3 with a die, your answer would be 1/6. If I asked the chance of rolling a 3, picking it up, and rolling another 3, the probability drops to 1/36 (1/6 times 1/6).”
Since the die doesn’t have a memory, how does it know how many times it has been rolled?? The fact is that every time you roll a die, you have the exact same odds as the previous roll.

When the contestant picks a case, he has a 1 in 26 chance of picking the mill. Unless he gets to make new selections, those odds don’t change.


Shawn February 16, 2008

This confusion is all due to the common, and incorrect, justification of the outcome of the Monty Hall problem. This idea of probabilities as values affixed to objects, and, even worse, values that “move” or “collapse” may make the Hall problem conceptually easier, but it really just creates confusion.

In the case of Monty Hall, the reason switching works is because Monty has given the player ADDITIONAL INFORMATION. To properly understand why this works, you need to apply Bayes’ theorem but that’s too technical for most audiences, so most expositions use this intuitive explanation that just confuses and causes these sorts of misunderstandings.

The simple truth in the DOND situation is that no matter how many cases remain, the probability that any unfound value is in any particular case is (1/# of cases left). No information has been provided to allow the player to favor any case over any other. Unless the producers of the show lie, no one could provide this information, because no one has it.

For a proper explanation of the Monty Hall problem, look at this page:


Josh February 16, 2008

Fred (8:40 2/16):

My claim was not that the second roll would have a 1/36 chance. It was that the overall chance of rolling a 3 twice in a row would be 1/36. It was just laying the groundwork for the upcoming analysis.


money-holic February 18, 2008

have any of you ever seen somebody get one million dollars on the real deal or no deal show?


Ursula February 18, 2008

MONEY-HOLIC – No, I haven’t ever seen anyone win the 1 million… but that could change within the next 11 minutes!!!


Ursula February 18, 2008

AHHHHHHH – Katie wimped-out!! So, to answer MONEY-HOLIC… No! I have never seen ANYBODY ever win the million! Tonight would have been the first time that even “Conservative-Ursula” [that’s me!] would have gone “all the way.”


andrew b February 18, 2008

well, NO, moneyholic..

otherwise why would the MDM be active?


LB February 20, 2008

Awesome. I did a quick test with Excel and noticed something while playing the online game (which is somewhat in error; I’ll get to that). After each offer, I compared it to the mean and noticed that it started at 20% of the mean and crept up to 90%+. In one game, I actually got lucky and had successively larger offers through the game. The error in the online game is with the final offer. Twice I had 2 adjoining values, $300K and $400K, and then $400K and $500K. In both instances, the offer was less than the lowest amount. I would love to see how the REAL banker handles that.


aprince February 21, 2008

I wonder if Fred is actually right about the odds favoring the field case containing the mill when down to 2 cases with the mill still not revealed. Isn’t this really the same thing as the Monty Hall problem?

Think of it this way: Assume the DOND game is played with only 3 cases ($1, $1 and $1,000,000). You pick a case. One of the two remaining cases is opened to reveal $1. You are given the choice to switch your case with the remaining unopened case. What to do? I think you should always switch. It’s the same as the Monty Hall problem. The only significance of Monty knowing where the goats were is that he could ALWAYS reveal a goat. Had Monty not known and just happened to reveal a goat, the odds wouldn’t change. Neither the doors, nor the cases, nor the math care about how or why the goat or the $1 was revealed. The only important fact is that it WAS revealed…and it’s that new information that makes switching a good decision.

I’m with Fred on this one. The fact that nobody knows where the mill is only serves to make this a very rare scenario (i.e. down to 2 cases with the mill still in play). Likewise, the fact that Monty knew where the car was only served to make the scenario very common.

I would agree that getting down to 2 cases with the mill still in play would be rare, but once that scenario occurred, the odds favor switching. I think if one ran 1,000,000 trials and filtered out only that scenario, the data would show that the field case would contain the mill more often than the originally picked case. In fact, the mill should be found in the field case 25 out of 26 times and NOT 1 out of 2 times.

…..or am I missing something?


Fred February 25, 2008

You’re not missing a thing. The issue isn’t the odds of making it to a final choice between the mill and the contestant’s choice. The issue is what to do if the contestant actually gets to that choice.


Erroneous Geek February 25, 2008

Fred & aprince – I think you both ARE missing something.
Let’s do a very-scaled-down analysis. Bear with me. This is complicated to put into writing:
Suppose that the game is played with 3 initial cases. So, from what you’ve stated, you guys hypothesize that when 2 cases remain [the initial pick + 1 field case], the odds would be 2/3 that switching cases will yield a $1M win, right?
Let’s consider 6 contestants [called A, B, C, D, E & F]. Arbitrarily, the $1M is hidden in case 3.
Contestants A&B choose to keep case 1, contestants C&D choose 2, & contestants E&F choose 3… So far so good…
-Contestant A now chooses case 2, leaving case 3.
-Contestant B chooses case 3, leaving case 2. **OOPs – the $1M case is revealed. This person is out-of-the-running.
-Contestant C now chooses case 1, leaving case 3.
-Contestant D now chooses case 3. **OOPS – out.
-Contestant E now chooses case 1, leaving case 2.
-Contestant F now chooses case 2, leaving case 1.
So, that only leaves contestants A, C, E, & F capable of winning the $1M.
–WITHOUT switching cases, contestants E & F have won the $1M.
–WITH switching cases, contestants E & F have lost.
–WITHOUT switching cases, contestants A & C have lost.
–WITH switching cases, contestants A & C have won.
Conclusion… It’s a 50/50 shot!

I figured-out this wacky analysis on a spreadsheet.
Please.. ANYBODY – prove me right or prove me wrong; I don’t care which. I just would like to know the true [switching] odds!!!
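The six-contestant table above can be checked by brute force. Here is a sketch that enumerates every equally likely (prize location, initial pick, first field case opened) combination of the 3-case game, discarding the games where the $1M is revealed:

```python
from itertools import product

# Enumerate every equally likely 3-case game: where the $1M hides, which
# case the contestant keeps, and which field case gets opened.
keep_wins = switch_wins = 0
for prize, pick in product(range(3), repeat=2):
    for opened in (c for c in range(3) if c != pick):
        if opened == prize:
            continue  # $1M revealed; contestant is out, like rows B and D
        last_field = next(c for c in range(3) if c not in (pick, opened))
        keep_wins += pick == prize
        switch_wins += last_field == prize

print(keep_wins, switch_wins)  # 6 6 -- keeping and switching win equally often
```

Out of the 18 equally likely triples, 12 survive the reveal, and they split evenly: a 50/50 shot, exactly as the table concludes.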


Fred February 25, 2008

I think you’re making this overly complicated. When the contestant makes his initial choice, the odds that he picked the mill are 1 in 26. The odds that the mill is still in the field are 25 in 26. When (and if) you get down to 2 cases, no matter what odds you beat to get there, the initial odds are still in effect.


Jim Nayzium February 25, 2008

Actually, Fred, you are a bit mistaken. Trust me, I have had an ongoing discussion of this very topic, and my brother (who made an 800 on the math SAT, mind you…) has finally convinced me that the actual odds on the last two cases are indeed 50/50, not 1/26 like at the start…

Here are few key points to remember —-

1. The only way to indeed win the million dollars is to go to the end…so your odds of winning the million from the onset never change…they are indeed 1/26 (assuming only 1 mill case)…

BUT —-

2. The small sub-set of times that people actually make it down to two cases…and the million is still on the table…50 percent of THOSE times it’s in your case, and 50 percent of THOSE times it’s in the other case.

Switching is not relevant because of the random nature of the picking of the boxes to begin with.

IF the “house” knows, and gets to look and leave the cases every time the way the Monty Hall deal works…then you should switch…

AND here is the kicker – the way my brother finally explained it in such a way that it makes sense.

Let’s pretend that 2 guys get to play instead of one.

Guy A: and Guy B:

They both pick boxes at the beginning….

Guy A: picks box 1
Guy B: picks box 26

Both are picked randomly….

NOW we have to assume that after picking the other 24 boxes, the million has not been revealed, or else this is NOT one of the sub-set of times where the million is still left.

So the million is either in Guy A’s box or Guy B’s box…right?

So if you would always switch — which guy is more likely to have the million dollar box???? Seriously…

This illustration is mathematically the same as the real game once the game is really down to two boxes.

It is different from what you are outlining…because it does indeed matter what odds you beat to get there…


Erroneous Geek February 25, 2008

Fred – I’m not over-complicating the issue. I’m stating exactly what would happen [for anyone interested, see my post a few up from here]. When I first started watching DOND, I thought like you; in fact, I would scream into the TV set [when 4 cases were remaining], “Howie, you’re lying.. The odds are NOT one-in-four, they’re one-in-twenty-six!!” Since those days of ranting at an innocent TV, I’ve realized that the probability of actually getting to THAT point in the game must be factored into the equation. I eventually conceded: Howie is NOT a liar!
The “one-in-twenty-six” thinking is an over-simplification.. and erroneous.


Erroneous Geek February 25, 2008

{continued from my last post}
There’s one sure-fire way to settle this analytical battle. I’d do it, but it’s beyond my computer-skill-level:
Simulate 13000 games, then take the roughly 1,000 contestants who will make it to “just 2 cases left” with the $1M still in play. Now, look at how many times those contestants have chosen the $1M and how many times it is in the “other” [field] case. That will once-and-for-all settle the “argument.” Will it be 50/50? Will it be 25/26? The anticipation of a solution to the problem is exciting me!!!


Eddie February 25, 2008

For those with the misguided notion that the odds remain 1 in 26 throughout the life of the game-
Every time a contestant picks a case and does NOT reveal the $1,000,000, the odds become more likely that the $1,000,000 is in a remaining case [duh]. Here’s the key element – the odds become EQUALLY more likely that the money is in a remaining case INCLUDING the initial case chosen. Those very few but fortunate contestants who make it down to the final 2 cases have an EQUAL chance that the fortune lies in their case [as opposed to the other case].


aprince February 25, 2008

Fred, I think I’m starting to see the light. Erroneous Geek’s example can’t be dismissed too easily. I’ve been looking at it, fully expecting to find an error, but I just don’t see anything wrong. I’m seriously starting to see that the odds that were beaten to get to the final choice actually do matter. It seems as though the process of filtering down to the sub-set of trials that end with the mill still in play removes the bias towards the mill being in the field. I wish it had been obvious to me, but it wasn’t.

I like using the 3 case version of the DOND game to simplify things, and I don’t think it changes the concept.

Considering the 3 case version of the game:
1) There’s a 1/3 chance of picking the mill
2) There’s a 2/3 chance of leaving the mill in the field
3) There’s a 1/2 chance of revealing the mill on the next pick when the mill is still in the field.
4) Thus, only 1/2 of the 2/3 times the mill is left in the field are significant.
5) Thus, the odds of getting down to 2 cases with the mill in the field case is really only 1/2*2/3 = 1/3.
6) Thus, after beating the odds and getting down to 2 cases, the odds of the mill being in the field case are equal to the odds of the mill being in the originally picked case.

Here's another way to look at it:
1) The odds of picking the mill in the beginning are very small (1/26), but the odds of getting down to 2 cases when this happens are very large (100%).
2) The odds of leaving the mill in the field are very large (25/26), but the odds of getting down to 2 cases when *this* happens are very small (1/25).

We really have to consider the initial odds in combination with the odds of making it to the last 2 cases. It actually *does* matter. DOH!
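aprince's six steps can be checked by brute force: the 3-case game is small enough to enumerate every equally-likely outcome exactly. A sketch, assuming uniformly random picks and reveals:

```python
from fractions import Fraction
from itertools import permutations

# Enumerate every equally-likely version of the 3-case game:
# where the million is, which case the player picks, and the
# order the two field cases would be opened in.
survive_in_pick = 0    # reach the final 2 with the mill in the picked case
survive_in_field = 0   # reach the final 2 with the mill in the field case
total = 0

for mill in range(3):
    for pick in range(3):
        field = [c for c in range(3) if c != pick]
        for order in permutations(field):
            total += 1
            if order[0] == mill:
                continue  # mill revealed early; not one of the surviving games
            if mill == pick:
                survive_in_pick += 1
            else:
                survive_in_field += 1

print(Fraction(survive_in_pick, total))   # 1/3
print(Fraction(survive_in_field, total))  # 1/3 -- equal, i.e. 50/50 at the end
```

Both paths come out to exactly 1/3 of all games, matching steps 5) and 6) above.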

I was wr…wro…wrong…
I know it’s hard to admit when you’re wrong. It bugs me too…really! I don’t feel too bad though. It’s not an easy thing to see. I’m impressed by those that figured it out so much faster.

Sorry for making you explain it so many times guys….

Very Warm Wishes,


Jim Nayzium February 25, 2008

Here is the easiest way to see the 50-50 of the last two cases….

Pretend the contestant pre-picks 2 cases instead of 1.

Then each time the contestant gets down to 2 cases with the million still in play what are the odds on either case having the million?

50/50 and here is why…

Case 1 == 1/26
Case 2 == 1/26

The fact that the odds were the same at the onset of the two cases…and those are the only two left — then equal odds equals 50/50…


aprince February 25, 2008

Well Jim, I can’t say that I disagree with anything you’re saying. I’m not sure I agree that it’s the easiest way to see it though. (:

I think that explanation is too simplified and leaves too much information out. We’ll see. If it convinces Fred, I’ll once again admit that I was wr…wro…wrong…(-;


Jim Nayzium February 25, 2008

Maybe not the easiest way to see it — but this is what finally brought me around…More specifically the example that brought me around was if 2 players played against each other at the start — and they each picked one case…

Then they each eliminated down to the two cases with the million being left…one turn at a time.

What would the odds be in Vegas on each player?

50-50 guaranteed…


aprince February 25, 2008

What are the odds of getting down to 2 cases with the mill still in play? I’m thinking there are two ways to do it and each of those ways has a 1/26 chance of happening, so is it 1/13?


Jim Nayzium February 25, 2008

Easiest way to figure odds of something happening is to figure the odds of it not happening…

Yes you are right …

1 / 13th of the time that a person gets down to the 2 cases — the million is still left in the mix.

Half of those times — it’s in his box … the other half it’s in the other box…


aprince February 25, 2008

How about a fuzzy math explanation:
1) Picking the mill in the beginning is HARD and then getting down to 2 cases becomes EASY.
2) Leaving the mill in the field in the beginning is EASY and then getting down to 2 cases becomes HARD.
3) Thus, it’s 50/50


Erroneous Geek February 25, 2008

REGARDING FUZZY MATHEMATICS.. Don’t forget to factor-in the odds of a person like ME ever getting to be a contestant. … It = HARD to the 20th! Thus, winning a million [as they say in the MC commercial], PRICELESS!! ;-}
BTW – I’m glad that my A-B-C-D-E-F contestant scenario swayed you into becoming a “convert.”


aprince February 25, 2008

Well Geek,
Regarding the fuzzy math. Maybe it’s too fuzzy. I probably should have used products instead of sums. Also, the EASY’s are not exactly the same and the HARD’s are not exactly the same, but the combinations are really the same.

Anyway, I thought it might help to illustrate the basic concept that seems to be easy to miss…..especially when you’re familiar with the Monty Hall problem.

Thanks for helping to “convert” me. (:


Matt Jamieson February 25, 2008

I Think that they win the million…


Matt Jamieson February 25, 2008

or…. not


Eddie February 25, 2008

Hmmm. Howie just announced, “The million dollar mission is over.” But why? Way long ago they started the mission and Howie announced, and I quote, “We’re not stopping until someone walks out of here with the one million.” After SIX contestants did not win [with seven $Mil cases], Howie then said that “this part” of the mission is over. EEEK!! Then a month or so ago, they continued the mission. Did they start with EIGHT cases? NO! They went back to the beginning with TWO… and as of last week, increased it up to THIRTEEN cases. Still no winner. Tonight Howie said that they were repeating the thirteen case mission, even though the mission was supposed to end. End? Repeat? Why? Well, as of ten minutes ago, THIS LATEST PAIR of contestants ALSO did not win the $Mil. Do the producers think that the viewers are DOPES? I remember the “rules.” What – they can make up the rules as they go? I recorded the show where the original statement was made. Yup – “MISSION WILL NOT END”… But, it has ended. LIAR-LIAR-PANTS-ON-FIRE!
Well, that’s my rant… So, what do ya think?


andrew b February 25, 2008



aprince February 25, 2008

The 209K seemed to be right at that critical decision point for those people. I too would really like to know how they came up with that offer.


max February 25, 2008

i think the game is rigged. they always just screw them out of the million


Fred February 25, 2008

Let’s look at this from another angle. Suppose the contestant gets down to the last two cases and the mill is still in play. The contestant decides to switch cases. Now the ONLY way he can lose the mill is if he chose it initially. We all agree that the odds of choosing the mill at the beginning is 1 in 26, so to lose it in this manner would mean that he has to beat odds of 26-1. I sure don’t see any 50-50 there.


aprince February 26, 2008

Okay, I actually took the time to simulate this.
I wrote a windows application using C++ (it’s the language I’m most proficient with). I did this exercise to validate my own understanding. I’m posting this because I thought some others would like to hear about the results. Let me know if you want the program and/or the source code.

I wrote the program to do the following:
1) randomly assign the dollar amounts to the 26 cases
2) randomly select a case to be the picked case
3) randomly select a field case to be eliminated
4) repeat step 3 until only two cases are left (1 field and 1 picked)
5) keep count of the number of times the mill was in one of the last two cases
6) keep count of the number of times the mill was in one of the last two cases AND the picked case
7) keep count of the number of times the mill was in one of the last two cases AND the field case

After only 1000 trials, the answer became pretty obvious.
Want to see the results?
I’ll show you……………..right after this short break…………………..

Just kidding (here are the results):
Games Played = 1000
Games Making it to 2 cases w/ Million Still in Play = 75 (~1/13)
Games Down to 2 cases w/ Million in Picked Case = 36 (~48%)
Games Down to 2 cases w/ Million in Field Case = 39 (~52%)

Running more trials really didn’t change much. Here are the results
after 20,000 trials:
Games Played = 20000
Games Making it to 2 cases w/ Million Still in Play = 1471 (~1/13)
Games Down to 2 cases w/ Million in Picked Case = 740 (~50.3%)
Games Down to 2 cases w/ Million in Field Case = 731 (~49.7%)
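aprince's actual C++ program isn't posted here, but the seven steps he lists are easy to re-sketch. This is an assumed reconstruction in Python, not his code:

```python
import random

def play_game(n_cases=26):
    """One DOND-style game. Returns (mill_survived, mill_in_picked_case)."""
    mill = random.randrange(n_cases)          # 1) which case holds the million
    pick = random.randrange(n_cases)          # 2) the player's case
    field = [c for c in range(n_cases) if c != pick]
    random.shuffle(field)                     # 3)-4) open field cases randomly
    opened, last_field = field[:-1], field[-1]
    survived = mill not in opened             # 5) million alive at 2 cases?
    return survived, survived and mill == pick

trials = 20000
n_survived = n_in_pick = 0
for _ in range(trials):
    s, p = play_game()
    n_survived += s                           # 5) count: mill in the final two
    n_in_pick += p                            # 6) count: ...and in the picked case

# 7) the field-case count is just the difference
print(f"to 2 cases with $1M alive: {n_survived}/{trials} (expect ~1/13)")
print(f"  of those, $1M in picked case: {n_in_pick} (expect ~50%)")
print(f"  of those, $1M in field case:  {n_survived - n_in_pick}")
```

Run it a few times and the counts land where aprince's did: about 1/13 of games survive, split roughly evenly between the picked case and the field case.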

Warm Wishes,


Erroneous Geek February 26, 2008

Well aprince, I asked for a computer simulation… and you gave it to me. Your analysis plus my 3-case scenario both prove BEYOND A SHADOW OF A DOUBT that the odds are 50/50 [where the $1M is hiding]. As far as I’m concerned….. CASE CLOSED. Based on your computer findings, I don’t see how anyone can continue to argue the point… but I’m sure some will!! ;-{
The reason that the odds are not a “simple” 1/26 is because the “sampling” being considered [the sub-set, some call it] is chosen based on previous information [if the $1M case is revealed along the journey], which alters the results.
Example.. If I instructed a group of people to each flip a coin 10 times with the constraint that ONLY those people who turned-up TAILS the first 9 times could continue to try that 10th flip, the probability-analysis [for that 10th toss] of the very-limited-sample-group is WAY different than if all the people were included in the sample, even though simple-common-logic tells you that the 10th flip still has 50/50 odds. WOW – even *I* do not understand what I just wrote!!!! I’ll have to think about THIS for a while.
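Geek's coin-flip setup simulates nicely, and it behaves exactly as common logic says: the surviving group is tiny, but their 10th flip is still 50/50. A quick sketch:

```python
import random

flippers = 200_000
survivors = tenth_heads = 0
for _ in range(flippers):
    # only flippers whose first 9 tosses are ALL tails get a 10th toss
    if all(random.random() < 0.5 for _ in range(9)):
        survivors += 1
        tenth_heads += random.random() < 0.5   # the 10th toss

print(f"survivors: {survivors} of {flippers} (expect ~{flippers // 512})")
print(f"heads rate on toss 10: {tenth_heads / survivors:.3f} (expect ~0.500)")
```

Conditioning on the first 9 tosses shrinks the sample to about 1 in 512, but it tells you nothing about toss 10; that is the same filtering at work in DOND's final two cases.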
ANYWAY – Thank you for a GREAT computer analysis!


Erroneous Geek February 26, 2008

{MODERATOR – feel free to delete this post if it is too off-the-topic..}
There has been a lot of discussion regarding this 50/50 last-case issue lately. Words and logic can twist thinking, and it’s not always easy to “see” the true outcome.
Here’s a story where the words lead you in the wrong direction:
3 guys take a $30 hotel room. [They each pay $10.] Later, the clerk discovers that the room-rate was only $25, so he sends the bellhop to the room to return $5. Along the way, the greedy bellhop pockets $2 of the $5 [leaving $3], so at the room, he gives $1 back to each man.
NOW – each guy has paid $9 for the room. So, $9 times 3 = $27….. PLUS the bellhop’s $2 = $29 total. . . . . . . . What happened to the thirtieth dollar???
The above is a cute story, but how does it relate to the past posts? Well, the moral is – it’s not always immediately apparent exactly what’s goin’ on, mathematically, when being steered in the wrong direction!! {You finish the moral.. I’m really bad at writing fables.}
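For anyone still chasing that thirtieth dollar, the bookkeeping (not spelled out in the riddle itself) is that the $27 the guests paid already contains the bellhop's $2, so adding the $2 on top double-counts it. A quick sanity check:

```python
paid = 3 * 9       # what the guests paid in total after the $1-each refund
room = 25          # what the hotel kept
bellhop = 2        # what the bellhop pocketed
refunded = 3 * 1   # what came back to the guests

# The $27 splits into the room plus the bellhop's cut -- nothing is missing.
assert paid == room + bellhop        # 27 = 25 + 2
assert paid + refunded == 30         # the original $30, fully accounted for
print("The 'missing' dollar is an artifact of adding the $2 to the $27.")
```

The riddle's "$27 + $2 = $29" sums two numbers that were never supposed to be added, which is exactly the kind of word-steering Geek is warning about.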


Jim Nayzium February 26, 2008

Here is another simple way to understand the logic behind it all…

In the Monty Hall example — Meaning when it is RIGHT to switch —

100% of the time it will be narrowed down to two cases…one with the Mill and one without —

If after you chose — the banker says — ok — I guarantee you that one of the two cases you picked is now in the sub-set of two we just narrowed it down to….then you should switch immediately…and the 1/26 vs 25/26 would be in play.

The difference is the only examples that matter are the small sub-set of times when it comes down to the two cases.

If the game were:

1. You pick a case.
2. House eliminates all cases but one other and guarantees you the million is still in play…

…then switching would be the right move every time. Otherwise — as is, it is 50-50

Here is a valid way to look at it also —

VEGAS is never at a disadvantage….THEY WOULD NOT let you switch if it slanted the odds that far in your favor!!!!


Rebeca Lopez February 27, 2008

Deal or no Deal is the best show ever!!!!!!!!!!!!!!!! and I hope to be in the show some day.


Anna February 29, 2008

my mom loves deal or no deal and she really wants to be on the show! would you know of a way to get on the show?


Radames Candelaria March 2, 2008

i would like to participate on deal or no deal. it has been difficult for me and i would really like to turn my world around and be able to make a difference. i would like to propose to my girlfriend, cheri and be the lucky one to win the million dollars. please consider me for the show, it will change my life forever


andrew b March 2, 2008

you know, this isn’t where you sign up.


Anna March 4, 2008

yeah i know, andrew b. i was just asking if you know how to sign up. anyone, let me know please!


banker March 4, 2008

bank offer= mean of the board – % deduction assigned for the round +or- the player’s psychology
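The banker's one-liner can be turned into a sketch. To be clear, everything below is a guess: the round-by-round schedule, the cap, and the psychology fudge factor are invented numbers, since the show has never published its formula:

```python
def banker_offer(amounts_left, round_no, psych_adjust=0.0):
    """Speculative banker offer: a fraction of the board's mean that grows
    each round and can exceed 100% late in the game. Every constant here
    is a guess -- nothing official."""
    mean = sum(amounts_left) / len(amounts_left)
    fraction = min(1.2, 0.4 + 0.1 * round_no)   # assumed deduction schedule
    return round(mean * fraction * (1 + psych_adjust))

# The scenario from the article: $100, $400, $1000, $50,000, $300,000 left,
# where the real offer was $80,000 (the mean of the board is $70,300)
print(banker_offer([100, 400, 1000, 50000, 300000], round_no=7))
```

With these made-up constants a late-round offer comes out a bit above the board mean, which is at least consistent with the $80,000 offer in the article; the real banker clearly also folds in the player's behavior.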


andrew b March 4, 2008

go to and there’s a tab that allows you to begin the signup process..

meanwhile, BAD GARRETT!


Anna March 4, 2008

thxs-andrew b!!!!


Kim March 5, 2008

I have always wondered if the $ amount is placed into a case with a number on the outside, or if the numbers are attached to closed cases. In other words, is it truly random or assigned?


That Guy March 11, 2008

DUDE! this is the coolest website EVER!


Erik March 11, 2008

One other statistic revelation having to do with Deal or No Deal.

Once you pick your first case, there is a 1/26 chance you have the $1 million. This means that the other field has a 25/26 chance of having the $1 million, or if you knock that out, the next highest value. As you decrease the amount of cases in that field, the statistical probability of that field remains the exact same, or 25/26.

This means that at the end of the game, YOU SHOULD ALWAYS DECIDE TO SWITCH CASES. For example, if the two values left on the board are $1 million and 1 cent, there is a 1/26 chance that you picked the $1 million on your first try, and a 25/26 chance that you got it wrong on your first go. It is 25 times more likely to be in the case you didn’t pick. Switch the cases if you know what’s good for you.


Shawn March 11, 2008


This has been discussed, read the posts above.

The fact is that it doesn’t matter whether you switch or not. This is not the Monty Hall problem (and the logic behind your argument may work for Monty Hall, but it’s not correct there, either. It gives the correct answer, but it’s still wrong).

If you need to convince yourself, read about the results of aprince’s simulation.


Jim Nayzium March 12, 2008

Erik’s response and logic are in fact correct for the Monty Hall scenario. However, they are incorrect for Deal or No Deal. The very fact that the ‘HOUSE’ has the knowledge of the correct case in the Monty Hall scenario changes the odds.

If you could CERTIFY that every time you were to play the game that that house would indeed narrow it down to 2 cases — one with the million — and one without — then the odds would always remain 25/26 and you would ALWAYS want to switch.

HOWEVER — deal or no deal — you can CERTIFY that EVERY time the two left will have the million….

SO – Mathematically — (I promise!!) — The sub-set of times that it does indeed come down to two cases left — HALF of those times it’s in your case — and half of those times it’s in the other case.

If you have a hard time picturing the odds — pretend a Husband and Wife team were allowed to Pick 2 cases…each got a case.

This very time — it came down to two cases…and ONE had the Million….Either the husband’s case had the million or the wife’s case had the million.

Which would be more likely to have the million??


Jim Nayzium March 12, 2008

Actually there is a typo above — sorry ….

Please re-read this one

HOWEVER — deal or no deal — you CANNOT CERTIFY that EVERY time the two left will have the million….


Shawn March 12, 2008


The logic described for the Monty Hall problem isn’t really correct. It describes probabilities as being “stuck” to objects and “shifting” when decisions are made. That’s not how the math works, and it is easy for people to be misled by it and erroneously apply similar processes to other situations. It works fine as a sort of an informal justification of the correct answer to the Monty Hall problem, but the correct way to work it out, the way that will *always* work, and not mislead you sometimes, is to look at it in terms of conditional probabilities and apply Bayes’ Theorem.

An example of the right way to solve the Monty Hall problem is at

That’s too mathematical for most, so we fall back on the more intuitive explanation, which works but is rife with pitfalls and easy to erroneously apply where it doesn’t work. Perhaps I overstate my case a little to say that the intuitive logic is wrong; it’s actually the way the logic is usually described that is wrong, and the fundamental observation that Monty is giving the player information by his choice of the goat door to open is usually glossed over, making errors like Erik’s seem entirely reasonable.
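Shawn's conditional-probability point can be made concrete with a small Bayes computation. A sketch using 3 doors/cases; the likelihood terms are the entire difference between the two games:

```python
from fractions import Fraction

prior = Fraction(1, 3)  # P(prize is in the picked door/case); 2/3 it's elsewhere

# Monty Hall: the host KNOWS where the prize is and always opens an empty door,
# so P(no prize revealed) = 1 whether or not you picked right.
monty_stick = (prior * 1) / (prior * 1 + (1 - prior) * 1)
print("Monty Hall, sticking wins:", monty_stick)   # 1/3 -> switching wins 2/3

# DOND: the opened case is random. If the prize is in the field (prob 2/3),
# a random reveal misses it only half the time.
dond_stick = (prior * 1) / (prior * 1 + (1 - prior) * Fraction(1, 2))
print("DOND, the picked case wins:", dond_stick)   # 1/2 -> switching gains nothing
```

Identical priors, different likelihoods: Monty's informed reveal leaves the prior untouched, while a lucky random reveal raises the picked case's posterior to exactly 1/2.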


Jim Nayzium March 12, 2008

I didn’t take the time to actually read the Monty Hall logic thread — as I have had this discussion with my GENIUS (freaky weird GENIUS) brother for over a year.

Basically what I was saying was that the Vegas odds in the Monty Hall scenario actually improve by switching…and in DOND they in fact do not.

Does that make sense?

Shawn — do we agree on that?

DOND — 2 cases left — odds are 50/50

Monty Hall 2 Curtains left — your curtain = 1/3, the other curtain = 2/3


Jone See March 16, 2008

How much do the girls that hold the numbers make on Deal or no Deal?


Luke Mansfield March 17, 2008

My dad and I got into a discussion while watching the show about the way that it starts. Do they make you pick 6 cases because it statistically will lower the mean, or just to speed up the game?


Nancy March 23, 2008

Does anyone know how much the girl from Bayonne won last night? I had to go and pick up my daughter and couldn’t find a blank tape to tape it. When I left, she still had the top 6 or 7 amounts in play. It might have been a repeat, but I hadn’t seen it. It was the show with the 70’s and 80’s theme, and she was playing during the 80’s segment. She had her daughter there, who they had up with the models. If anyone remembers, I would love to know.


andrew b March 23, 2008

it was a repeat on CNBC..

the last thing you’d expect her to win: $55,000!

she had 1K, 5K, 50K and 200K.

eventually, she had 1K, but not before seeing an offer of over $100,000.


I pwn all! March 23, 2008

I won $2,500,000 on the online version of Deal or No Deal: Meal or No Meal!!!!!


andrew b March 25, 2008

anyone reading this needs to BOYCOTT DEAL OR NO DEAL after that DESPICABLE piece of filth they aired last night!


sankajam March 31, 2008

hello, how much did todd win tonight?


aprince March 31, 2008

I think he accepted an offer for $198,000 with 3 cases left, including $750,000. Not a bad deal really…


sankajam March 31, 2008

we have TiVo and we paused the darn TV to answer the phone and it switched, so he left with that. I guess that was a smart move. He can at least go to school. thank you very much, I searched all over to find that answer. ty, ty, ty


Matt K April 3, 2008

My wife and I (we both attended MIT) had a “discussion” about this the other night. I took the position that the odds of having the million dollar case don’t change from the beginning and she said they changed every round.

After reading many of the posts, I too have changed my mind about the probability of having the million dollar case when there are only 2 cases at the end – it is 1/2, not 1/26. Here’s my explanation:

Let’s use a lottery example with 30 numbers in all. Let’s say you need to pick 6 correct numbers (in any sequence) to win the megamillions prize. You know the chance of picking the correct 6 numbers is one in several million. The first five balls come out and you’ve gotten all 5 correct. What’s your chance of getting the grand prize then? Is it still one in several million as it was in the beginning, or is it simply 1 in 25 (the remaining number of balls)? I would contend it is now 1 in 25 that you will win the grand prize, because you’ve already beaten the odds and gotten 5 balls correct.

In this scenario as in the show, just getting down to the final 2 cases without revealing the million dollars, you’ve beaten the odds. I suggest that those people who get to the final 2 cases have a 50/50 chance of having the million dollar case, not 1/26.

Another way to look at this is if the million dollar case is revealed halfway through the game, then your odds of having it are zero, so the odds must change along the way, based on the information that is revealed.


Patrick April 6, 2008

One problem with your calculations is that you said that the woman had an 80% chance of walking away with less money than her offer. My guess is that you said this because you figured that she had only a 20% chance of holding the $300,000 case. This is wrong because the woman actually had an 80% chance of making more money, because there was an 80% chance that she would choose a case other than the $300,000 one, which would cause her offer to go up. There was only a 1/5 chance that she would make less money than she was offered; unfortunately she hit that case. The one thing to remember is that this show isn’t based on what case you end up with; it may just be how much you sell your case for.
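Patrick's figures are easy to verify. A quick sketch of the board math for the scenario in the article (the mean of the remaining amounts, and the chance the very next pick knocks out the $300K):

```python
from fractions import Fraction

amounts = [100, 400, 1000, 50000, 300000]  # the five cases from the article

mean = Fraction(sum(amounts), len(amounts))
print("mean of the board:", mean)  # 70300 -- so the $80,000 offer beat the mean

# The $300K is equally likely to be in any of the 5 cases. She can only open
# it if it's in the field (4/5), and then she has to hit it first try (1/4):
p_bust = Fraction(4, 5) * Fraction(1, 4)
print("chance the next pick reveals the $300K:", p_bust)     # 1/5
print("chance the next offer goes up instead:", 1 - p_bust)  # 4/5
```

So the offer was already above the expected value of her case, but 4 times out of 5 the next pick would have pushed it higher still, which is Patrick's point.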


glynda April 12, 2008

I think the young lady who held on to the $1000 case through three rounds, almost losing it all as the offers fell to under $30K, after her mother and three friends begged her to take the offer, is a classic case of insanity or greed


andrew b April 12, 2008

yes, BUT…

how many people have taken home a six-figure case (other than 100K)? ONE.

how about the people who deal for a six-figure case? 89!!

do you not understand why we need those kinds of people?!


Patrick April 12, 2008

What are you trying to say andrew?


Doyle April 12, 2008

Anyone play Texas Holdem? If you go all in with pocket 5s against pocket aces, your odds of winning are somewhere around 17-20%, right? Now what if a 5 hits on the flop — would you say that you still have a 17-20% chance of winning? Of course not. The same thing with DOND: once more information is revealed, the odds of our starting hand (or chosen case) change as information is presented. It’s foolish to think that the same odds exist throughout the entire game.


andrew b April 12, 2008

also, by the same principle, putting $50K into a $750K pot that you have a 20% chance of winning is worthwhile. marybeth didn’t do that when she looked at that 341K offer. the show needs that person, but if even THEY will not play for the million, the devil take it!

and what i am saying is that we need more people to take home bigger cases, so demetres was a welcome entry even if she was a bit insane.


Patrick April 12, 2008

Who’s marybeth or demetres? The girl in the story was offered $80,000, not 341


andrew b April 12, 2008

you obviously don’t watch it enough to know about this!


Chad S April 20, 2008

I have read a lot of the posts and just wanted to put this out there. Some have talked about switching the cases at the end. 1M left and .01 left: do you switch the cases? The answer is Yes, every single time. Before you get all upset, hear me out. At the start you have a 1/26th chance of picking the 1M case. Everyone agrees with that. 25/26 you did not. So if they asked you to pick a case, and after you picked they said would you like the 25 on stage or the one you are holding, you would obviously say I would take the 25 on stage. That is what you are doing when there are two cases left. You are choosing the 25 that were left on stage. And if it is a 1M case and a .01 case you should be damn happy because you are most likely to win a Million bucks!
You are not resetting your odds from the beginning choice unless you get to choose again. No matter what info you learned, you have to choose again to reset your odds. So if they took your case and the one left on stage and randomized them and then had you choose a case between the 1M and the .01 case, then it would be a 50% chance of choosing the 1M case. But when you choose at the beginning you have a 1/26 chance, and that will never change no matter the cases that are shown or anything else: it is 1/26. You can prove it to yourself if you want. Take 25 red cards and 1 black card. Mix them up, choose a card and set it aside. Now to simulate the scenario of the 1M case left (black card) and your card (not the million), look at the cards and leave only the black card. Now if you switch you would win 25 out of 26 times. If you keep your card you win 1 out of 26. It does not matter if the removal of the cards is random. Give it a try. If you think it does matter if the removal of the cards is random, I suggest only using 5 cards, or three. Good luck
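Chad's card experiment is worth actually running, with one crucial correction: when the removal is blind, the trials where the black card gets flipped over along the way must be thrown out, because those games never reach "final two with the million still in play." A sketch:

```python
import random

def trial(n=26, informed=True):
    """Set one card aside, strip the field down to a single card.
    Returns True if the kept card is the black ('million') card, or
    None if a blind removal flipped the black card (trial discarded)."""
    black = random.randrange(n)
    pick = random.randrange(n)
    field = [c for c in range(n) if c != pick]
    if informed:
        # Monty Hall-style: the remover KNOWS and never flips the black card,
        # so every trial reaches the final two
        return pick == black
    last = random.choice(field)            # blind removal: a random survivor
    if black != pick and black != last:
        return None                        # black card was flipped along the way
    return pick == black

def keep_win_rate(informed, trials=20_000):
    kept = [r for r in (trial(informed=informed) for _ in range(trials))
            if r is not None]
    return sum(kept) / len(kept)

print("informed removal, keeping wins:", keep_win_rate(True))   # ~1/26: switch!
print("blind removal,    keeping wins:", keep_win_rate(False))  # ~0.50: no edge
```

The informed remover reproduces Chad's 25-out-of-26 result; the blind remover, once the spoiled trials are discarded, lands at 50/50, which is where Chad's setup and the real show part ways.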


Erroneous Geek April 20, 2008

Chad.. and the other non-believers – Switching cases will not better your odds of winning, period. It was proven in a post a while back where a computer simulation was run. It was logically shown by me also, around the same posting-time.
Why, and I hope this is the last time writing this, is because.. in order to GET down to the last 2 cases, the contestant must NOT open the $1M along the way. This will only occur for 2 of 26 players. The fact that 24 people bomb-out must be included in the equation. NOT opening the $1M early continually changes the odds that the $1M is in the “remaining” 2 cases, of course, until it’s a 50/50 crap shoot!!
Think of this – suppose that 26 contestants were required to open 24 cases in the first round. How many would STILL be playing for the $1M after that round? TWO! What amounts are still left? Right.. a $1M case and another case. So it’s 50/50, 50/50, 50/50, 50/50… ahhh – – FIFTY – FIFTY.
{I love to rant!!!}


Patrick April 20, 2008

Sorry chad, but you’re wrong. A good way to understand it would be doyle’s comment about poker, if you play at all. The difference between this and the monty hall scenario (which we all know you got the idea from) is that monty hall knew what cases to eliminate. When you guess and get rid of the cases, your odds change. What you don’t understand is that every case has a 1/26 chance of being the million, or any other case for that matter. Say you were trying to pick the penny case, would you still switch? If you think about it enough you’ll figure it out.


Erroneous Geek April 20, 2008

Rant – Part 2:
Here’s another way to look at it. Let’s alter the rules… Suppose that the player did NOT choose a case FIRST, but secretly thought of a case #, let’s say CASE 1. [The player will therefore leave CASE 1 in play] Now, this very-lucky contestant manages to avoid picking the $1M case and gets to the point where there are only 2 cases left, CASE 1 and CASE x.
Now scratch your heads…. Is either case more likely to hold the $1M than the other?
Hint: “NO,” it’s a 50/50 deal.


Chad S. April 21, 2008

I think I was told about the Monty Hall problem, and applied it incorrectly to DOND. In DOND the house is not making sure that the Mill is still out there. Unlike Monty Hall. Crap, I think I owe my brother 50 bucks. That sucks.


Morgan.G April 24, 2008

I don’t know if she thought of this, but here is my view on it. The banker quoted her a price of 80k, above the mean. You would take it based on probability. But the probability of choosing another case that is not 300k is 4/5. So you actually have a good chance of increasing the banker’s offer even more. If the banker offered like 170k when there are 2 cases left of $1 and 300k, then you would take the offer.


andrew b April 24, 2008

….or if the banker offered 341K with 2 cases of 1M and $25, then you wouldn’t! :P


Patrick April 27, 2008

I would take the 341 because there’s a 50% chance you end up with $25, versus a 100% chance you get 341k.


andrew b April 27, 2008

with a 50-50 coin flip, which is riskier: 341K to 1M or 480K to 1M?

and which is easier to endure: 480K to 25 or 341K to 25?

it’s not often you get to go for the million with this little risk, short of having another large number in play.


andrew b April 27, 2008

you say that you would still take it with the 341K in play. how low will it have to go? 250K? WELL, THAT WON’T HAPPEN. so take this chance!


Patrick April 27, 2008

But think of that 341k as your own money, would you bring all that money to a casino and try to win a million? Maybe you would, but I would be happy with 341k, but completely miserable with 25. Ya a million would be great, but I’d rather not risk it.

And what do you mean by how low will it have to go?


andrew b April 27, 2008

would you rather make that same decision with more money to lose and less money to gain?

would you be more miserable losing 340,975 than losing 499,975?

this is the game it came from:

is $75,000 not better than $25? sure it is. but is it an equivalent improvement against $1,000,000 over $75,000? NO! that’s why you don’t take it. the offer, though, isn’t going that low, so you have to settle pretty much for this situation because it isn’t getting any easier!

and if you look at the other grounds she had to go on, it’s not getting any easier there either!

this is what you are suggesting:

is $3 not better than $1? yes, it is. is $3 worse than $10? yep.



andrew b April 27, 2008

if i just come out of the blue and offer you $597,000, that would be great.

but now look at the board:

[$50,000] [$3,000,000]

that should change your decision-making process quite considerably (not necessarily the decision itself, though), and if it doesn’t change anything, don’t even consider signing up for this show!


Patrick April 27, 2008

if the 50,000 and 3,000,000 cases were left over and i was offered 597,000 then i would probably still take the 597,000, and i’ll tell you why. Obviously if this choice was made 100 times then it would be a wise decision to risk the 597,000 on a 50% chance that you get 3,000,000; that’s because over the course of those 100 times you would win considerably more money. But we only have one chance, and that 50% chance of losing the money is still there. Obviously the fact that the least i would walk away with is 50,000 makes picking the wrong case not as bad as winning $25. I’m not saying it wouldn’t affect my thinking, but i wouldn’t risk 600k on a 50/50 shot when i would be fine with taking that.


andrew b April 27, 2008


would you rather have to make that decision where winning yields $1,550,000 and losing loses $1,400,000 (an offer of $1,450,000) as compared to winning $2,403,000 and losing $547,000?

so why take $597,000?

…..especially when you say that you have it just before hitting the button. you know the danger, you said it, then you can’t skip out!


andrew b April 27, 2008

and that same principle of winning more overall, but having only one chance, and thus skipping out would challenge even the most rudimentary NO DEALs in the book, simply because of the CHANCE of losing money. AT ANY TIME.

even if you have the top two amounts to the end, you’re risking $125K on a coin flip!


andrew b April 27, 2008

have you seen players such as ryan cleghorn?

he quit at 97K with five cases, including the last of 12 1M cases. he had previously received five offers of just under or above $200K and one offer of $144K, which ALONE should sink the deal to any serious player.

for those who say “you’ll lose EVERYTHING if you hit the last 1M”, you actually have to LOOK at how much that everything is.

the last offer was $244,000. so you just took a BIGGER drop than you possibly could by losing “everything” on the next case, and maybe even the NEXT case as well if the banker can’t correct his calculator. that should also be a red flag to avoid this offer.

some people still say that 20% is a pretty risky proposition. well, every decision he has made up to this point has been met with a greater chance of running into a 1M case (because there were 12 at the beginning), so why are you not mentioning any of those as offers he “should have taken”?

the next small case brought the offer back past $200K! and the playout went all the way down to two amounts, seeing an offer of $488K (which, incidentally, comes up twice as often as losing the “everything” that happens the least often, and that loss is less than the previous one!) before his case was shown to have $500 anyway!


Patrick April 27, 2008

Because it’s not just about the odds. Would you rather take a million dollars, or flip a coin and go for 100 million but get pushed off a cliff if you lose? Would you risk it even though the odds are supposedly there money-wise? All I’m saying is that $597,000 is a good amount of money, and when you turn down that offer you are risking it on a coin flip. I would just take the 600k; there’s no right and wrong decision, I’d much rather get the guaranteed cash. A bird in the hand….


andrew b April 27, 2008

not when a factor outside of money is introduced into the game. that spoils all odds.

that is also why these “special opportunities” such as that comic book drawing should not be in the game. the banker has to pay out nothing extra while increasing the chance of getting a contestant out. only more dollars should be able to do that.

besides, have you ever heard of people who lose out on a bunch of money and get murdered?

i think that big money winners rather than losers are the ones that get shot.


Patrick April 27, 2008

“would you rather have to make that decision where winning yields $1,550,000 and losing loses $1,400,000 (an offer of $1,450,000) as compared to winning $2,403,000 and losing $547,000?”



Patrick April 27, 2008

But that’s the thing: with money comes other unmentioned factors. If someone had a lot of debt and that 600k could get them out of it, that guaranteed money would mean a lot more to them, which would influence the offer. It’s all about how you view the money. If the offer was 600 million and I had a 50/50 chance of winning 3,000,000,000 or $25, then I would just take the 600 million, because it’s already a significant amount of money and not worth risking.


Shawn April 27, 2008

If you got to play the game a zillion times, then the answer is clear: If the bank offer exceeds the statistical expected value, then take it. If not, don’t. The math is clean and simple, and it’s a winning strategy over the long run.

Since you only get to play the game once, though, you have to make a judgement call based on your own risk tolerance and goals.
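Shawn’s long-run rule is easy to state in code. Here’s a minimal sketch (function names are mine, not from any comment), applied to the five-case board from the original post, where the banker offered $80,000:

```python
# Shawn's long-run rule: take the offer only if it beats the
# expected value (the plain average) of the amounts still in play.
# Function names are illustrative, not from the show or the post.

def expected_value(amounts):
    """Mean of the remaining amounts; each case is equally likely."""
    return sum(amounts) / len(amounts)

def long_run_decision(amounts, offer):
    """'DEAL' if the offer exceeds the expected value, else 'NO DEAL'."""
    return "DEAL" if offer > expected_value(amounts) else "NO DEAL"

# The board from the original post: five cases left, $80,000 offered.
board = [100, 400, 1000, 50_000, 300_000]
print(expected_value(board))              # 70300.0
print(long_run_decision(board, 80_000))   # DEAL
```

By this rule the $80,000 offer beats the $70,300 expected value, which is exactly why the post calls the “no deal” a bonehead move; Shawn’s second paragraph is the caveat: with only one play, risk tolerance can legitimately override the average.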


andrew b April 28, 2008

that math, however, cannot be ignored when the offer starts getting ridiculous, as it did for marybeth.

most people in that situation would have to risk $500K to get another $500K. marybeth only has to risk $341K to get another $659K. the result is the same whichever case wins, but the gamble is a LOT less risky when the offer is $341K instead of $500K, and ending up with the $25 is a lot less painful when the offer is $341K instead of $500K.

as for the $600,000,000, that is still obviously worth taking.

and people who are in a lot of debt obviously aren’t going to draw in viewers precisely because that’s what will happen to their games!!


Patrick April 29, 2008

People in debt draw in the most viewers because people want to see them win.
I wouldn’t say that winning the $25 would be less painful at all when the offer is 341k instead of 500k. To me they would both be a lot of money lost on a coin flip. Think of it this way: would you sell your house and your car and put all the money on a coin flip to triple it all up? Those are good odds, but I doubt most people would do it.


andrew b April 29, 2008

that’s money they already had and then put at risk, and possibly dropping into negative. that doesn’t exist in DOND.

they may want to see them win, but compared to normal people, they will win a LOT less of the time!


Patrick April 29, 2008

But the offer is also money that they basically already have. That offer is money that can be theirs if they just take it. So in a way they are losing money.


andrew b April 29, 2008

there’s a difference. you come in with nothing, you build up $175,000 and possibly lose it. you’re no worse off overall.

as for the house gamble, you start with nothing, put yourself deep in the red, and hope to win enough to cover that deficit and then some. that’s different.

the game is warped and destroyed when:

* the money moves into negatives. who is going to risk leaving DOND with less money than before?! that’s why no such game shows exist now.
* extra contingencies worth no money are added to the offer. imagine being offered one chance to speak to your estranged father who you haven’t seen in 20 years! that would be worth a lot more than $1,000,000 to some people!
* additional items are added that are worth some money but important enough to get a contestant out a lot more often than simply adding that amount of cash on (e.g. arthur joseph’s miami heat package, which along with the 10K gets a lot more people out than simply offering $70,700)
* the money goes so high as to cause too many contestants to abandon the theoretical math, which unanimously calls for NO DEAL, and quit because of the massive gambling involved. e.g.: $1,000 and $5,000,000. is $1,300,000 enough to get out almost everybody despite the fact that it is barely half of the mean at this point?
* extremely rabid and insane supporters twist the contestant’s mind away from the proper decision. why do you think that we haven’t had any millionaires? probably because of marybeth’s fiance! he won’t let her trust her own instinct after being with her for seven years!


Heal or No Heal May 3, 2008

I just found two interesting scholarly articles on DOND, see


Amarya May 9, 2008

We’re doing a data project on Deal or No Deal and this site has been incredible. Could you explain how you were able to get the standard deviation?


KhAoZ May 12, 2008

You can get an “always switch” result if you change the problem slightly. Let’s say that at the beginning of the game you pick a prize. This can be _any_ prize. Then, let’s say that you eliminate all boxes except for the top prize. So then, basically, your picked box and the one remaining box hold either the top prize or another prize that’s lower than it.

In this case, because you only had a 1/26 (or whatever number of prizes there are) chance of picking the top prize at the beginning of the game, you would want to switch, because there’s a 25/26 chance the other box is the top prize.

That basically gives you the same scenario as the Monty Hall problem. The reason it didn’t work before was because we were limiting the problem to only 2 paths, where you either pick the lowest prize first or the highest prize first.

I’ve even run a computer simulation that proves this result.
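For what it’s worth, the forced-elimination variant KhAoZ describes can be sketched in a few lines (a hypothetical simulation of my own, not the applet discussed here; the key assumption is that the eliminations are *forced* to skip the top prize):

```python
import random

# In this variant the eliminations always skip the top prize, so the
# final pair is {your box, one other} with the top prize guaranteed
# present. Switching then wins exactly when your first pick missed it.

def switch_wins(n_boxes=26):
    """One game of the forced variant: True if switching wins the top prize."""
    your_box = random.randrange(n_boxes)   # say the top prize is in box 0
    return your_box != 0                   # forced elimination keeps box 0 in play

random.seed(1)
trials = 100_000
rate = sum(switch_wins() for _ in range(trials)) / trials
print(rate)  # close to 25/26, i.e. about 0.96
```

The forcing is the whole trick: with purely random eliminations (the actual show), conditioning on the top prize surviving gives 50/50 instead, which is the point the replies argue over.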


Erroneous Geek May 12, 2008

To KhAoZ – Are you replying to a particular posting? Well, no matter… Follow this logic [or nonlogic!]. At the start of the game the contestant picks a case. There is a 1/26 chance that it holds the top prize. EQUALLY, each of the remaining cases also has a 1/26 chance of containing that prize. As more and more cases are picked without revealing the top prize, the odds improve for EACH of the remaining cases AND the contestant’s case.. EQUALLY.
The Monty Hall Problem does not apply here.. totally different scenario.


KhAoZ May 12, 2008

Formatting messed up (guessing the < turned into HTML or something).

You can download the file here:


Jim Nayzium May 20, 2008

OK … Odds discussion continued for the non-believers. New example.

You are not on Deal or No Deal. You are a contestant on your own new show called — WTF is up with DOND?

OK — you sit – sequestered in a room watching a taping of the show.

Once Howie and the other moron pick a case you get to then place a bet on your own case. The one YOU think will have the million.

You have to wager your own money – so you bet 100 bucks. They say if you win (by having the million in your box) you get the prize money — which would be 2600 dollars.

So, on your show — it gets all the way down to the two boxes left. Yours and the real contestants. The million is still left.

AT that moment — Your show stops and you are posed with a question.

Would you like to increase your wager from 100 dollars sir? The payout if you win is 26 times your wager.

The two cases that are left now are the penny and the million.

RIGHT NOW — the “House” is paying 26 to 1.

Would you then up your wager or not?



and a better wrinkle —

What if the house says — OK — since your original box had a ‘tougher time’ being the million than the one left — We’ll let you switch. Do you think they would run different payouts on the switched box?

NO – because both had equal odds to begin with. IF they did, it would just be to sucker you into making a stupid play with your odds.

Bottom Line — 2 boxes left ====== 50 / 50


aprince May 22, 2008

Hi Mr. Nayzium,
Remember me — I did the simulation a while back that supported the 50/50?

Anyway, what the hey, are there still unbelievers about? …or are we now killing time by coming up with creative ways to explain the right answer? I’m guessing it’s a little of both.

That said, I think your recent explanation is brilliant. Maybe the best I’ve read so far! Love it! Mind if I put my own creative spin on it?

You’re on a new show called “Switch or no Switch”

step 1: You are presented with 26 cases as in DOND and asked to pick TWO cases at random….one at a time.

step 2: All the lights in the studio align to create a powerful light beam that vaporizes all other cases so that only the two that you picked are left on stage.

step 3: Howie slowly walks up the steps, peeks in both cases and tells you that the $1M prize is in one of them. In other words, you got VERY LUCKY!

step 4: Now, Howie asks you to choose one of those cases to be your very own.

Question: Which case will you choose?

Will you choose the one that you randomly picked FIRST in the beginning or the one that you randomly picked SECOND in the beginning?

It should be obvious that it doesn’t matter. You have a 50/50 shot at the $1mil.

Once you see that “Switch or no Switch” is statistically the same as DOND with all the fluff in the middle removed, you will also see the following fact:
IF (the all important IF) you make it to two cases with the $1mil still in play in DOND, the odds of either case having the big prize is 50/50.
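That “all important IF” can be checked by brute force. A sketch (my own hypothetical simulation, not aprince’s original): eliminate cases at random, keep only the games where the $1M survives to the final two, and count how often the contestant’s own case holds it:

```python
import random

def conditional_hold_rate(n_cases=26, trials=200_000, seed=42):
    """Among games where the $1M survives to the final two cases,
    return the fraction in which the contestant's own case holds it."""
    rng = random.Random(seed)
    survived = held = 0
    for _ in range(trials):
        million = rng.randrange(n_cases)          # where the $1M sits
        yours = rng.randrange(n_cases)            # the case you keep
        others = [c for c in range(n_cases) if c != yours]
        final_other = rng.choice(others)          # the one case never opened
        if million in (yours, final_other):       # the "all important IF"
            survived += 1
            held += (million == yours)
    return held / survived

print(conditional_hold_rate())  # close to 0.5
```

Only about 2/26 of the games survive the IF at all, but among those survivors the split is even, matching the 50/50 conclusion above.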


Jim Nayzium May 22, 2008

I agree —
AND thanks for the props!

Here is my question to the real mathematicians out there – because I am just a dude that likes discussing this stuff…and not as smart as I pretend to be…

In aprince’s example above …

Are the odds actually different on the two cases because of the following information — or is there a reason why the next stuff balances itself out…

We all agree that what we are discussing is the sub-set of times that we get down to two cases, one with the million….

SO —

That means that sometimes the case will have been picked first and sometimes it will have been picked second (the case = the one with the million).

So, the times the case was picked first — it was picked at 1/26 — and the times it was picked second, it was picked at 1/25…

I truly don’t know the answer here — but does that mean that the odds on the two cases when picked chronologically are different??

Or is it the same as literally picking the cases simultaneously?


aprince May 23, 2008

JN – It’s my understanding that if the two picks are made *randomly* it makes no difference whether they are picked one at a time or both at the same time. At least not in terms of the information KNOWN.

If some new information was introduced, after the first pick, that would influence the second pick then the story would be different….

For instance: If the contents of the first case picked were revealed before the second case was picked, the *known* information would change and thus the *known* odds would change. In other words, once new information is revealed, the problem space changes and thus the odds need to be recalculated. If you’re picking the two cases randomly, no new info is obtained.

This is different in the Monty Hall game, where someone who knows where the prize is located picks that second case for you AND does so such that the prize remains in play. In other words: some of the additional knowledge that the host has is given to the contestant before the second case is picked.

What’s different about DOND or SONS is that NOBODY knows where the $1 million is located until it’s actually revealed, and thus it’s very rare to actually make it down to 2 cases without eliminating the big prize.

Because there’s no knowledge of where the $1 million is located, it doesn’t matter *how* we get down to two cases: just randomly keeping 2 cases is the same as randomly eliminating 24 cases. Either way we are just guessing anyway!

Imagine if Howie let you pick a case, then brought you a second case and told you the $1 million was in one of those two. Now it’s no longer 50/50, because you have new knowledge. Howie has basically told you where the money is. Without that, you’re just guessing. It might seem like you’re gaining knowledge along the way as you eliminate cases, and that’s partially true. However, most of the time that new knowledge is “you’re not going to win $1 million”!

Because the elimination of cases is done randomly, eliminating 24 cases one-at-a-time is no different than randomly eliminating 24 cases all-at-once. Again, when doing either, MOST of the time the $1million gets eliminated.

So, pick two cases at RANDOM and either case will have an equal chance of containing the $1million (1/26). No matter what you do to the other 24 cases, that fact WILL NOT CHANGE.

That’s very different from picking one at random and then picking the other such that the $1 million is guaranteed to remain! That’s like Howie picking cases for you and helping you by NOT eliminating the million dollar case!


KhAoZ June 2, 2008

Okay, I’ve turned my code into an applet, which you can see here:

This is to prove my previous post:

” You can get an “always switch” result if you change the problem slightly. Let’s say that at the beginning of the game you pick a prize. This can be _any_ prize. Then, let’s say that you eliminate all boxes except for the top prize. So then, basically, your picked box and the one remaining box hold either the top prize or another prize that’s lower than it.

In this case, because you only had a 1/26 (or whatever number of prizes there are) chance of picking the top prize at the beginning of the game, you would want to switch, because there’s a 25/26 chance the other box is the top prize.

That basically gives you the same scenario as the Monty Hall problem. The reason it didn’t work before was because we were limiting the problem to only 2 paths, where you either pick the lowest prize first or the highest prize first.

I’ve even run a computer simulation that proves this result.”


Jim Nayzium June 3, 2008

No offense KhAoz, and I really appreciate your work on the applet — but maybe you want to take a little time and actually read this entire blog before you post an applet that is proven faulty just a few comments above.

The random nature of picking the 25 cases does indeed make the odds on the last two exactly 50-50.

You have failed to realize that we are only ever in the scenario where the last two cases have the main prize a very few times out of a hundred.

During the sub-set of times that we make it that far — the odds of the boxes having the big-prize are exactly the same.

seriously — don’t post again without referencing some of the well-thought-out threads above —


Jim Nayzium June 3, 2008

I would also like to clarify KhAoz,

Your applet didn’t load for me on the Mac I am running Safari on — but I read your first paragraph, and it is clear that you are only dealing with the sub-set of times that it actually comes down to the two boxes.

In each of these sub-sets, it will be evenly distributed between the two boxes. It will NOT be in the “other” box 25 out of 26 times. If your applet does prove that it will — then the issue is with your source code, not with the actual statistical math involved.


KhAoZ June 3, 2008

I assure you Jim Nayzium, I have read all the comments. Maybe you should also take the time to fully read mine? If you did, you would have realized that I actually have discussed the previous findings in these posts.

“You have failed to realize that we are only ever in the scenario where the last two cases have the main prize a very few times out of a hundred.”

Yes, that is very true, we are in that scenario only a very few times out of a 100, which is why my analysis is not necessarily going to help you in the game. What you have to understand is that I’m limiting my analysis to only the cases when the main prize _is_ still in play (however unlikely that may be). And in those very few cases, I go on to prove that it is actually beneficial to switch.

The thing is, I also built the applet to allow you to play the game interactively and see _why_ it happens. It’s very easy to see why it’s beneficial to switch when you play a few games. I’ve now built the Jar file for my program so you can open it by double clicking it. All you need to do is make sure Java is installed:

And remember, you can download and verify the source here:

Tell me if you still have any problems. And if you aren’t willing to read through all I have to say and actually get my applet working, please don’t bother commenting.


Jim Nayzium June 4, 2008

on my mac the .jar file does nothing but tell me that it won’t open and to check the console for more messages. I am not sure if I have Java running natively. Fairly certain it runs within a browser…can you post it online somewhere — I would be happy to take a look.

And again — if all we are ever talking about is the sub-set of times it gets narrowed down to two items — and you are told ahead of time it will be so — then this changes everything.

The random picking actually changes the odds.

The way you have ‘changed’ the rules of the game is the exact same math as saying — to two players — one gets to pick one case — the other player gets to pick 25 cases….

then before revealing which stack of cases has the million, you offer the one-case person the opportunity to switch.

It would, of course, always be advisable to switch at this point.

HOWEVER — this entire explanation is not relevant to DOND math at all — in anyway shape or form.

It’s like saying since Jeff Gordon can drive a car really fast maybe he would be a good person to fly the space-shuttle.


KhAoZ June 4, 2008

Jim Nayzium –

I actually have posted the applet online here. If you scroll down to the bottom, you should see it, and instructions on how to use it.

But I believe you mentioned it did not load, so I have no idea what the problem is. If you can show me the contents of the console, it might help me resolve the problem.

Anyways, I’m still not sure if we’re on the same page here for my explanation, so let me go at it one more time.

The applet really does help in explaining, but I will try and show you what I mean without it.

Let’s say our DOND game has 5 prizes: 1, 100, 200, 500, 1000.

At the beginning, you pick a random prize and keep it until the end (when you can make the switch). At this point, there are equal chances for all the prizes to be picked. In particular, each prize has a 1/5 chance of being picked. That means there is a 1/5 chance of picking the top prize (the 1000 prize).

Then, after you pick your prize, you get really lucky and eliminate prizes until only two boxes are left, with the top prize still among them. This is the important condition that will always hold.

Now, the question is: do you switch your prize? Well, the only time you would not want to switch is when you have the top prize already. But as we discussed previously, there’s only a 1/5 chance of that happening. But what’s really important, and what I think a lot of people are missing, is that the elimination part ALWAYS leaves the top prize in play. As in, that is my _condition_. Thus, since there’s only a 1/5 chance of having the top prize, and the top prize IS OUT THERE, it’s obvious you’d want to switch.

Obviously, you’d still have to be pretty lucky to not accidentally eliminate the top prize. In fact, if we take this into consideration, the game actually does balance out.

There are 4 prizes left after you pick yours. Thus, the chances of eliminating 3 of them except the top prize are:
3/4 * 2/3 * 1/2 = 6/24 = 1/4

So, normally, the chances of you picking the top prize are 1/5 (as always).

And the chances of you not picking the top prize, yet having it still remaining, are
4/5 * 1/4 = 1/5

So normally there’s no difference if you switch or not. But if you are lucky, and the top prize is still remaining, the best option is to switch.

This still has applications in DOND. It helps you make a better decision if you end up with the million dollar prize still left, however unlikely that may be anyway.
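The two probabilities in that comment can be verified exactly with Python’s `fractions` module (a quick check of the arithmetic as stated, nothing more; the 5-prize setup is as described above):

```python
from fractions import Fraction as F

# 5-prize game as described above: you pick one box, then 3 of the
# remaining 4 are eliminated at random, leaving one other box.
p_top_first = F(1, 5)                        # your pick IS the top prize
p_survives = F(3, 4) * F(2, 3) * F(1, 2)     # 3 random eliminations all miss it
p_top_other = F(4, 5) * p_survives           # you missed it AND it survived
print(p_top_first, p_survives, p_top_other)  # 1/5 1/4 1/5
```

Both final boxes carry the same unconditional 1/5, which is the figure the surrounding comments dispute.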


Jim Nayzium June 4, 2008

The whole tone of my post is going to sound mean — so please don’t take it that way. Without inflections and tones the internet leaves a lot to be desired in the way of pure-argumentation etc…

OK — Here is what I think / know.

After we do get on the same page, your statistics listed above will not hold up.

I will point out a few things first and then go into more detail —

First example of stat-math being flawed…
You wrote: “And the chances of you not picking the top prize, yet have it still remaining, is 4/5 * 1/4 = 1/5.”
The chances of you NOT picking the top prize in your example are not calculated this way.
They would actually be:
4/5 * 3/4 * 2/3 = 24/60
I will not go into more detail on that — but do the research on any poker-odds site on how to calculate odds.

NOW — Do me a favor —
Rewrite the game with MY conditions….

Instead of one person playing have two play.
5 cases as you outlined.

Have person one and two’s case assigned randomly by the program.

Then insert your condition that the game ALWAYS has to come down to those two cases having the big prize.

Now run your game a bazillion times and see which person is more likely to have the big prize.


Another way to conclude the correct math is to finally realize the simple truth….

Your case was picked at 4 to 1 odds (5 cases = 4 to 1 in Vegas terms)
ALL CASES at the beginning have 4 to 1 money on them if Vegas is laying numbers.

So the fact that it comes down to two cases just makes for interesting television….THE FACT THAT THEY all started with 4 to 1 means when there are two left you have two cases that had equal long odds.

It would not matter if there were a million cases and you did the same thing. The last two cases would then be 999,999 to 1.

Two cases to choose from — BOTH with equal odds — means 50/50.

I promise!
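The two-player test Jim proposes above is simple enough to sketch directly (my own hypothetical code, not KhAoZ’s program): assign each player a random case, keep only the trials where the big prize landed in one of those two cases, and see who holds it:

```python
import random

def two_player_test(n_cases=5, trials=200_000, seed=7):
    """Fraction of conditioned trials in which player one holds the big prize."""
    rng = random.Random(seed)
    kept = p1_holds = 0
    for _ in range(trials):
        big = rng.randrange(n_cases)               # where the big prize sits
        p1, p2 = rng.sample(range(n_cases), 2)     # the two players' cases
        if big in (p1, p2):                        # Jim's condition
            kept += 1
            p1_holds += (big == p1)
    return p1_holds / kept

print(two_player_test())  # close to 0.5: neither player is favored
```

Because both cases are assigned at random, conditioning on the big prize landing in one of them favors neither player, which is the 50/50 Jim predicts.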


Erroneous Geek June 4, 2008

The odds of winning by switching [or not switching] cases are 50-50, but it’s not intuitively obvious. Rather than using formulas and theories, I offer this stripped-down version of the game..
Let’s play with 4 cases total.. [Skeptics say that switching cases will yield a $1M win 3/4 of the time, but it’s not true!] Here are the rules for my mini-scenario..
I’ll assign each contestant a letter, A thru L, because there are only 12 possible combinations. Each contestant will choose a case to keep {I’ll write their choice in brackets next to their letter}, then they’ll open 2 more cases, leaving ONE in the field. I’ve hidden the $1M in case 4!! If the contestant reveals the $1M by opening case 4, I’ll write “OOPS-OUT.”
Contestant A, B, & C choose to keep case 1.
D, E, & F keep case 2.
G, H, & I keep case 3.
J, K, & L keep case 4.
{Ya with me so far?}
A[1] opens 2 & 3, leaving 4 [in the field]
B[1] opens 2 & 4, leaving 3 OOPS-OUT
C[1] opens 3 & 4, leaving 2 OOPS-OUT
D[2] opens 1 & 3, leaving 4
E[2] opens 1 & 4, leaving 3 OOPS-OUT
F[2] opens 3 & 4, leaving 1 OOPS-OUT
G[3] opens 1 & 2, leaving 4
H[3] opens 1 & 4, leaving 2 OOPS-OUT
I[3] opens 2 & 4, leaving 1 OOPS-OUT
—–still with me?—–
J[4] opens 1 & 2, leaving 3
K[4] opens 1 & 3, leaving 2
L[4] opens 2 & 3, leaving 1
Those (above) are ALL the possible combinations.
As can be plainly seen, only contestants A, D, G, J, K, & L are still in the game [and capable of winning the $1M].
So, if they DO NOT switch.. A, D, & G will LOSE and J, K, & L will WIN.
If they DO switch.. A, D, & G will WIN and J, K, & L. will LOSE.
Now scratch your skeptical heads and chant, “Holy-crap.. that’s 50-50 odds!!”
This mini-game may be repeated with every other case hiding the $1M, but the results will be the same, of course.
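Erroneous Geek’s 12-row table can also be reproduced mechanically. A short sketch that enumerates the same mini-game (case numbering as in the comment, $1M hidden in case 4):

```python
from itertools import combinations

# Enumerate every (kept case, pair of opened cases) combination in the
# 4-case mini-game. Games that open case 4 are OOPS-OUT; for the rest,
# count whether staying or switching would win the $1M.
still_in = stay_wins = switch_wins = 0
for kept in range(1, 5):
    rest = [c for c in range(1, 5) if c != kept]
    for opened in combinations(rest, 2):
        if 4 in opened:
            continue                             # OOPS-OUT: the $1M was revealed
        still_in += 1
        left = (set(rest) - set(opened)).pop()   # the case left in the field
        stay_wins += (kept == 4)
        switch_wins += (left == 4)
print(still_in, stay_wins, switch_wins)  # 6 3 3
```

Six of the twelve contestants survive, and they split 3-3 between staying and switching: the same 50-50 the table shows.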


KhAoZ June 4, 2008

Jim Nayzium –

Yes, sorry, my mistake in the probabilities. Just a simple mistake, I forgot we’re supposed to leave 2 prizes out there instead of one.

Also, not sure what revisions you want in the program. Is one person supposed to be always switching or something?

Anyway, if you guys don’t want theorems or math, here’s an example game. Even though I used my program to do this, it’s really only generating the log. It is doing no simulating whatsoever. I could have done this with pen and paper:

*Note: I’m choosing boxes randomly myself, but they’re already mixed periodically. If I try to choose a locked box, I get an error, which is simulating the chosen path we talked about before.

*There are 5 prizes: 1, 100, 200, 500, 1000

Welcome to Deal or No Deal Statistical Tester
Box #1 is yours.
Box #2 eliminated: 500
Box #3 eliminated: 100
ERROR: You have tried to choose a locked prize.
Box #5 eliminated: 1
Choosing rounds are over, 2 remaining boxes.
You chose the 200 prize.
The remaining box is: 1000
The best move is to switch boxes.

Box #5 is yours.
Box #2 eliminated: 200
Box #1 eliminated: 100
ERROR: You have tried to choose a locked prize.
Box #4 eliminated: 500
Choosing rounds are over, 2 remaining boxes.
You chose the 1 prize.
The remaining box is: 1000
The best move is to switch boxes.

Box #2 is yours.
Box #5 eliminated: 1
Box #4 eliminated: 100
Box #3 eliminated: 200
Choosing rounds are over, 2 remaining boxes.
You chose the 1000 prize.
The remaining box is: 500
The best move is to keep your box.

Box #5 is yours.
ERROR: You have tried to choose a locked prize.
Box #1 eliminated: 200
Box #3 eliminated: 500
Box #4 eliminated: 100
Choosing rounds are over, 2 remaining boxes.
You chose the 1 prize.
The remaining box is: 1000
The best move is to switch boxes.

Box #2 is yours.
ERROR: You have tried to choose a locked prize.
Box #5 eliminated: 1
Box #3 eliminated: 200
Box #4 eliminated: 500
Choosing rounds are over, 2 remaining boxes.
You chose the 100 prize.
The remaining box is: 1000
The best move is to switch boxes.

Box #1 is yours.
ERROR: You have tried to choose a locked prize.
Box #5 eliminated: 1
Box #4 eliminated: 500
Box #2 eliminated: 100
Choosing rounds are over, 2 remaining boxes.
You chose the 200 prize.
The remaining box is: 1000
The best move is to switch boxes.

Box #5 is yours.
Box #3 eliminated: 500
ERROR: You have tried to choose a locked prize.
Box #4 eliminated: 100
Box #1 eliminated: 200
Choosing rounds are over, 2 remaining boxes.
You chose the 1 prize.
The remaining box is: 1000
The best move is to switch boxes.

Overall, 7 games played, 6 switches, ~85% is better to switch

As you can see, unless I chose the top prize at first, the best move was to switch every time. Given that there’s a very low chance of picking the top prize at first, a switch was almost always beneficial.

You can argue with my math, but you cannot argue with a real game played.


Jim Nayzium June 4, 2008

hahahaha — this is very funny to me in general.

THERE are so many variables with your game/code/hypothesis in general it just strikes me funny now.

You are claiming “REAL” results on a small sampling you know.


with only 5 boxes — if the Monty Hall principle were in play here, you would only be increasing your odds a little bit, you realize, right?

Meaning you go from seemingly 4 to 1 to a refreshed 1 to 1 —

so you go from 20 percent of the time winning to 50 percent of the time winning…

RUN your game analysis over again with a million boxes if you can insert that into the code….

Then you would go from one out of a million to 50-50 and the difference should be obvious….

AND — also by the way — if the difference IS indeed obvious — it would then be conclusive that your code / game is flawed….

BECAUSE IT WILL BE FIFTY FIFTY – give or take a decimal point — even when switching, if you choose to use the math used by the likes of

Euclid, Newton, Einstein and others —– meaning the math where

24/60 does not equal one last box….

That was not a small error in your calculation — it was using the WRONG method to accidentally get close….

In order to calculate the odds of repeated events all avoiding something, you have to calculate the odds of avoiding it at each step and combine them —

Meaning…the odds of picking the NON-million box in your game are

4 / 5

then once you haven’t picked it —

the odds are now

3 / 4


2 / 3

and then finally

1 / 2

ALL four of these odds must come through for that to occur — THIS WILL NOT happen 1 out of five times….

SERIOUSLY — using real math actually matters in this one…i promise…

You imply that the math will be one fifth of the time you get down to the one box — WHICH IS CORRECT – coincidentally…

because the product of

.80 x .75 x .66 x .50 == will be —–

.8 x .75 = .60

.6 x .66 = .396

.396 x .5 = .198ish…

I am not really saying your error in the math is causing your issue with your game; I am implying that if you used fake-math to get one conclusion then you are willing to use fake-math to get the other conclusions….

I would be willing to bet you in person 1 million dollars versus your 38 dollar Outback Steakhouse gift certificate that 2 Stanford mathematicians could sit in a room for 10 minutes and conclude this was 50-50 every time that it gets down to two cases….

Unless of course _YOUR_CONDITION_ is that they get to use fun-math instead of REAL MATH!


KhAoZ June 4, 2008

Man, you are a complete moron. I’ve tried to explain this to you like 3 times now, yet you completely fail to comprehend it. And it’s so god damn simple.

My “condition” has nothing to do with “using fake math”. It’s simply restricting the cases. The condition is: the million is always out there. That means, when you’re down to two cases, you either have the million or the other prize is the million.

Now come on, how much further can I simplify this?

If you pick the million, you will not want to switch.
If you pick another prize, you will want to switch.

No matter what. Period. If you only use the cases where the million dollar prize is one of the last two prizes.

Thus, if there’s 25/26 chance of not picking the top prize, then most of the god damn time you’ll want to switch your prize. PERIOD. There’s no more odds at this point. Its either you pick the million, or you god damn don’t. Your decision will be the same EVERY TIME.

If you just simulate a DOND game with no prize restrictions (i.e., looking at all possible 2-prize cases) then obviously it’s going to be 50:50.

And for fuck’s sake, man, the game example I gave you has nothing to do with my program. Just pretend I wrote it all out by hand. I can give you a million games if you want; it will result in the same statistics. There’s no “coding error” or anything, it’s just me simply playing a DOND game in my head, where the top prize is always there in the end.

I’m not even going to bother trying to further explain it to you, given this is so friggin’ trivial. It doesn’t even have any application in DOND, since the chance of the million dollar case staying is small anyway. It’s merely a response to the previous 1-cent-and-million prize discussions, which were 50:50.


Com2 June 5, 2008

How hard is it? Out of 26 cases the odds of the million being in any one case are 1 in 26. If you eliminate 24 cases without finding the million, then it is 1 in 2 odds, or 50/50. No matter where the case is — the case doesn’t care, and neither do the odds.


Jim Nayzium June 5, 2008

KhAoZ — I am sorry to have misunderstood this whole time; you actually knew your game had no relevance to the actual odds of DOND.

You are correct in saying that I missed this point.

So you are basically saying that in the Monty Hall game you should switch cases — and in the DOND game you don’t increase your chances of winning by switching.

Are we agreed upon that?


KhAoZ June 5, 2008

Jim Nayzium –

Ok, I was discussing this with a friend, and he showed me a different perspective that has made me understand your arguments a little better.

When I did test games to see whether switches were beneficial or not, I essentially forced each game to leave the million-dollar prize uneliminated. Like, for example, in the posts above where I showed you that example series of games: whenever I tried to pick a locked box (i.e., through random picking), I essentially invalidated that move and continued to eliminate other prizes.

In retrospect, that does seem to be cheating. I then tried to do the same simulation, except whenever I accidentally picked a locked box, I invalidated that game altogether and started a new one. And, coincidentally, this results in switching winning 50% of the time overall.

I’m not sure if this was what you were trying to point out the entire time, but either way, you were right. I mean, I knew this would have little potential in DOND anyways, since most of the time, the million dollar prize will not be there anyway. What I didn’t realize immediately, was that forcing the million to stay in every case would mess everything up.

Either way, I’m sure we could have had a more polite discussion than what resulted, but heh, you know the Internet. Sorry for any rude remarks I made.


Jim Nayzium June 5, 2008

That is what I meant by the inflection will sound mean.

So you have made internet history from my perspective and totally come full-circle from ranting butt — to cool guy.

100 percent respect you for trying so hard to support your side of it!

The only reason I was so sure of the result was that I was you eight months ago, and ultimately lost a 50 dollar bet I made at a party — a bet I didn’t pay off for four months, because it took that long to come around.

So you beat me by three months and 28 days!!


If you ever need anything that we can help you with at
let us know! — it’s my company!


Fred June 8, 2008

If you think of the problem as dividing the show into two groups, it’s hard to counter this concept:

Group A (you) has a 1 in 26 chance of having the million.
Group B (the ladies) have a 25 in 26 chance of having the million.

Without being allowed to choose again, the odds for you shouldn’t change. When you get to the final two cases, with the million still in play, it seems that the logical thing to do is switch.

However, like others I wrote a program to play the game and see what really happens. My results:

After 500,000 games I got the following:

The game ends early, meaning the million gets picked before it gets down to the last two cases – 461634 times. (92.3268%)

The contestant has the million – 19672 times. (3.9344%)

The million is with the ladies – 18694 times. (3.7388%)

Unless I’ve made an error in my code, it seems definitive that it doesn’t improve the odds to switch at the end of the game. These numbers seem to fit, since the actual odds of the player picking the million are 1 in 26, or 3.84615% of the time.
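For anyone who wants to reproduce Fred’s experiment, here is a minimal sketch of that kind of simulation (the function name, seed, and structure are my own, not Fred’s actual code). By symmetry, both of the last two percentages should land near 1/26 ≈ 3.85%:

```python
import random

def play_endgame(trials=500_000, n_cases=26, seed=1):
    """The player keeps one case, then the other cases are opened in
    random order until two remain (the kept case plus one other)."""
    ends_early = contestant_has_it = other_has_it = 0
    rng = random.Random(seed)
    for _ in range(trials):
        million = rng.randrange(n_cases)   # which case holds the million
        kept = rng.randrange(n_cases)      # the case the player keeps
        others = [c for c in range(n_cases) if c != kept]
        rng.shuffle(others)
        last = others[-1]                  # the one case never opened
        if million == kept:
            contestant_has_it += 1
        elif million == last:
            other_has_it += 1
        else:
            ends_early += 1                # million revealed along the way
    return ends_early, contestant_has_it, other_has_it

e, c, o = play_endgame()
print(e / 500_000, c / 500_000, o / 500_000)
# the first fraction lands near 24/26; the last two each near 1/26
```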


Jim Nayzium June 8, 2008

This explanation is quite clear and concise to me. However, the two-group concept misses one large point.

While it is VERY true that your odds of having the million NEVER get better than the original 1 out of 26 — the actual thing that DOES change your odds of winning at any given time is the odds of the cases left having the million.

When there is only one other case left in the game and the million is still there — that means there are two cases that each originally had a 1 out of 26 chance to win — and since the two cases have the same long odds, the current chances of winning are 50-50.

Similarly, when there are three total cases left and the million is still out — EACH case had a 1 out of 26 chance —

so your case is 1 out of 3 to have the million now — because each case had equal long odds.


Doc June 26, 2008

It’s a shame that the “values” aren’t consistent. During the game the banker (shrewdly?) discounts his offers, especially when the contestant no longer has a safety net. Then, in the make-believe phase, the “and the banker would have offered” amounts are inflated as much as they were deflated when the banker had the contestant over a barrel.

All said, great show. One more reason for kids to take math.


BigBooper July 20, 2008

Good to see we have peace here. For all the fans here: what did you think of the 3 foreign versions shown in May during the sweeps period, in terms of production values, host, etc.? And how would you rate them overall? I’m talking about the Filipino, Estonian and South African versions.


DavivMcD July 23, 2008

Just to add to the (resolved) discussion relating the end choice to the Monty Hall problem:

The only reason the Monty Hall problem works the way it does is because it is NOT random. It is given that Monty will always open a door with a goat (that’s the way I learned it, your ‘junk’ prize may vary =P); he will make a conscious decision not to reveal the car (same disclaimer).

I have actually developed a proof by induction that, given any number of goats (g), and any number of cars (c), switching is still the best choice no matter how many doors there are at the beginning, so long as g > c. Since there is only 1 top prize, DOND seems to fit those parameters. However, the choices in DOND are truly random, and the Monty Hall case becomes irrelevant. Even in the rare case of ending up with the $1mil still on the board for the last 2 boxes, switching gains you no statistical advantage. You’re not comparing the chance you originally picked the $1mil (1/26) to the chance you didn’t originally pick the $1mil (25/26), you are actually comparing it to the chance you originally picked that specific value still left on the board (1/26). 1/26 vs. 1/26 = 50/50 odds.

Sorry for rehashing what others have already hashed to death, but I thought I would try to offer an explanation that makes sense intuitively, rather than using brute mathematical force.
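DavivMcD’s distinction — informed host versus random elimination — shows up clearly in a small simulation (a sketch of my own; the names and parameters are assumptions, not anyone’s posted program):

```python
import random

def switch_wins(informed_host, doors=26, trials=200_000, seed=2):
    """Fraction of games where switching wins, among games that reach a
    final pair still containing the prize."""
    wins = reached = 0
    rng = random.Random(seed)
    for _ in range(trials):
        prize = rng.randrange(doors)
        pick = rng.randrange(doors)
        others = [d for d in range(doors) if d != pick]
        if informed_host:
            # Monty-style: the host knowingly leaves the prize unopened
            remaining = prize if prize != pick else rng.choice(others)
        else:
            # DOND-style: elimination is random, so the prize is usually
            # revealed early and the game never reaches the final pair
            remaining = rng.choice(others)
            if prize != pick and prize != remaining:
                continue
        reached += 1
        if remaining == prize:
            wins += 1  # switching would have won this game
    return wins / reached

print(switch_wins(informed_host=True))   # near 25/26 ≈ 0.96 — switch!
print(switch_wins(informed_host=False))  # near 0.50 — switching is moot
```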


andrew b July 24, 2008

I’d like to add a little extra thing to this discussion.

This works with any amount on the board at all.

The original chance of 1/26 of any amount being in a case must be DIVIDED by the probability that the amount has not appeared yet.

The original chance is 1/26 that a specific amount is in, say, case 4. Now, let’s open case 15. If that amount is in case 15, it isn’t in case 4, and the odds drop straight to 0.

Any other amount (25/26), and the chances become 1/26 / 25/26 = 1/26 * 26/25 = 1/25.

And so on down the line with each case chosen.
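andrew b’s renormalization can be carried all the way down with exact fractions (a small sketch of my own; the loop is just his “divide by the survival chance” rule repeated):

```python
from fractions import Fraction

p = Fraction(1, 26)  # chance your chosen case holds a given amount
cases_left = 26
while cases_left > 2:
    # the amount survives an opened case with probability (n-1)/n
    survive = Fraction(cases_left - 1, cases_left)
    p /= survive        # andrew b's update: divide by the survival chance
    cases_left -= 1     # p is now exactly 1/cases_left

print(p)  # 1/2 once only two cases remain
```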


BigBooper September 1, 2008

This thread seems to have gone inactive. Anyway, good to see the contestant won the million tonight. Only wish NBC hadn’t promoed it — took all the suspense out. One thing that kept her going tonight: the Banker was lowballing the offers all the way through the show because of the sheer cliff after 4 of the 5 million-dollar cases went quickly.

2 weeks in a row, they went down to 2 cases.


Josh K. October 14, 2008

i have to do a research math paper on something called Monty’s Dilemma and i was hoping someone who really knew the math and statistics could help me out. i have been to like 20 different websites for data and info and couldn’t find any good info. if anyone knows the math can they please tell me?


george October 15, 2008

josh k. the math is easy


Josh K. October 15, 2008

the whole math? not for me. actually nvm, im gonna change around the rules of deal or no deal for my project: im gonna make 25 cases with $1, and 1 case with 1 million dollars


Bob October 21, 2008

@Jim with regard to the probabilities…

Interesting discussion. I have to admit, I’m a geek for calculations. It’s been a little while since I’ve visited them but without beating the issue into the ground I was hoping to add one small thing…

If you ignore the probability of getting down to two cases where one case has the mil and the other one doesn’t, the 50/50 probability holds. Since there are only two cases left, there is a 1/2 chance that one has the mil.

However, you can’t ignore the probability of actually realizing that scenario. The probability of _actually_ reaching that scenario is quite different and IMHO should be the real focus, since it’s a lot smaller than 50/50.

Hence, there are two cases to consider (no pun intended):

1. Pick the winning case
2. Pick 24 non-winning cases (leaving a non-winning case left over)

(1/26 * 24!) / 25! = (24!/26) * (1/25!) = 1/(26 * 25) = 1/650 ~= .1538%

1. Pick a non-winning case
2. Pick 24 more non-winning cases (leaving the winning case left over)

((25/26) * 24!) / 25! = (25 * 24!) / (26 * 25!) = 1/26 ~= 3.8461%

Adding the probabilities we get ~= 4.1599% chance of reaching the last two cases where one has a mil and the other doesn’t.


Bob October 21, 2008

Hah, let me revisit remedial math: .1538 + 3.8461 = 3.9999% :D


Patrick October 21, 2008

Actually there is a 7.69% chance that one of the last cases will have a million.

100/26 = 3.84% chance that you initially chose the million $ case.

3.84% chance that the last case you choose will be the million

3.84 + 3.84 = 7.69% chance that one of those cases will be a million.


Patrick October 21, 2008

This is based on the method of choosing a case to keep, then picking 24 cases without opening them. (if they were opened your odds could change from 0% to 100% depending on whether you opened the million)
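Patrick’s figure checks out exactly (a quick check of my own, in exact fractions):

```python
from fractions import Fraction

n = 26
p_kept = Fraction(1, n)                           # your case has the million
p_last = Fraction(n - 1, n) * Fraction(1, n - 1)  # it survives as the last case
total = p_kept + p_last

print(total)         # 1/13
print(float(total))  # ≈ 0.0769 — Patrick's 7.69%
```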


? October 22, 2008

but what happens in other DoND versions where the banker doesn’t use math, only his whims? like when Php50, Php1,000 and Php500,000 are still in play, and in the previous round the player knocked off Php1M, so the banker only offers Php100. but yeah ur only talking about the US version! silly me!


Josh K. October 22, 2008

what im doing for my project is relating it to the Monty Hall Paradox, but in a fashion where you have multiple turns and not just two or three


Danny M. November 27, 2008

okay, so i took all that into consideration, and i used it the whole time while playing it online.

i picked my favorite number in the world… 4
the 1 million dollar case was number 5… i picked all the cases and never got an offer from the banker near the mean; i ended up with my case being 750k… yeah i think i would do well on that show, i’d take up about 5 minutes of their time, and walk out with 500k or more.


Fred December 22, 2008

It’s been quite a while since I’ve seen the show. Tonight I watched a little of it, just enough to see how it’s turned to crap. The contestant got down to the last 4 cases, top prize being 250,000. Statistics would indicate that her chances of winning are 1 out of 4, equating to a fair offer of about 60k or so. Instead, she was offered 14k. It seems that no one is going to be offered anything that might induce them to quit early; instead they are being coerced by extremely low offers to continue the “game”.

The problem here is that the risk factors for the banker and the contestant are not equal. The contestant is in a subordinate position, wanting to win a substantial amount of money, actually having to take risks to achieve that goal. The banker, on the other hand, really has no risk. He doesn’t have to worry about losing the million, the show makes plenty of money and the odds of a contestant going all the way to a million are pretty slim. Presumably, the thinking is that by effectively forcing the contestant to continue the game by making such low offers the level of excitement goes up and more people will watch. It’s working just the opposite for me, it’s a stacked deck with fake odds.


rp January 2, 2009

I’m wondering how the calculations of the mean compare to the actual offers on the show. I don’t think the offers are strictly a calculation of the mean. I believe there’s another factor in the calculation. What I’m wondering is if anyone has noticed a difference in that factor for different contestants? Does the factor change based on the contestant’s personality, gender or race? It would be interesting to compare the mean calculation with the offers actually presented on the show and compare the results for different contestants. Has anyone done this?


erroneous geek January 3, 2009

The offer is not determined by simply dividing the highest remaining amount by the number of remaining cases. It is calculated using a mean value, tempered by the percent of humidity in Palo Alto, times the log of the number of gray pigeons stuck frozen to the Statue of Liberty, divided by the reflectivity index of Howie’s left earring.
Ya get what I mean?


BigBooper January 7, 2009

NBC has now realized that the show was overexposed so has cut back on the number of evening episodes. OTOH the daytime version has lost my interest.

The offers on the daytime version are extremely low, compared to the evening version, even allowing for the lower top prize.

The offers on the evening version in the early rounds are higher than they used to be. The later rounds appear to be similar. I use a quick calc of the total of the larger cases remaining ($10,000 and over) divided by the total cases remaining, and then compare that number to the offer. The % of the actual offer to the arithmetic mean should keep rising in each round, but will jump up and down depending on what the contestant has just done and whether or not there is a solid big bloc or a cliff.

I always watch on a DVD recorded copy allowing me to freeze and do the quick mental math.


Slartibartfast January 12, 2009

I assume the game in the US works the same way as it does in the UK. If so it is based on Conditional Probability NOT the Mean Value. Google “Marilyn and the Goats” and you’ll see a similar sort of problem.


George February 3, 2009

Well, I read all this, but I don’t watch the show — never have, never will (as you said, “Howie who?”).
So I’m thinking a bright guy like you, who can figure the odds yet is watching this show, must be one of those who can’t turn away from the ongoing train wreck… Stop watching idiotic TV, there are much better things on (besides, you’re only dumbing yourself down along with the other half of this country).
Thanks for the rant though. G:o)


Eugene February 3, 2009

394 posts and only one person comments on the beauty of the models. What is wrong with you guys? Some of them are HOT! My strategy would be based on eliminating the models I didn’t like. I think Kent has been the smartest guy of all the responders so far.


PR February 4, 2009

Re David’s post of 7/23/08: Not sure what you did with induction, but it pays to switch no matter what the values of c and g are, i.e., even if c>g. Here’s the proof:
Suppose there are c cars and g goats to start.
Case 1: Do not switch. Then the probability of picking a car is c/(c+g).
Case 2: Switch. To get the probability of ending up with a car, we have to consider two subcases. We might have picked a car initially and then switched doors to another car, or we might have picked a goat initially and then switched doors to a car. The probability we picked a car initially is c/(c+g). Then, having picked a car, Monty reveals a goat, so when you switch, the chance you get one of the other cars is (c-1)/(c+g-2), because there are c+g-2 doors left to pick from (we have to eliminate the door we initially picked and the door Monty opened to reveal a goat) and of these remaining doors, c-1 are the remaining cars (other than the one we initially picked). So the probability of initially picking a car and then switching to another car after Monty reveals a goat is [c/(c+g)]*[(c-1)/(c+g-2)]. A similar analysis follows for the other subcase, giving us the probability of initially picking a goat and then switching to a car after Monty reveals a goat. This probability is [g/(c+g)]*[c/(c+g-2)]. Therefore, the probability of ending up with a car is
[c/(c+g)]*[(c-1)/(c+g-2)] + [g/(c+g)]*[c/(c+g-2)].
Therefore, it will pay to switch as long as the expression for Case 2 is greater than the expression for Case 1. Fooling around with the algebra, this reduces to c+g-1 > c+g-2, which is always true.
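PR’s two expressions are easy to sanity-check numerically (my own sketch; it assumes g ≥ 2 so Monty can always reveal a goat):

```python
def stay(c, g):
    # keep the first door: c cars among c+g doors
    return c / (c + g)

def switch(c, g):
    # PR's two subcases: car-then-car plus goat-then-car,
    # after Monty reveals one goat
    return (c / (c + g)) * ((c - 1) / (c + g - 2)) \
         + (g / (c + g)) * (c / (c + g - 2))

# the classic 1-car, 2-goat game: staying wins 1/3, switching 2/3
print(stay(1, 2), switch(1, 2))

# switching beats staying for every mix tried, including c > g
for c in range(1, 8):
    for g in range(2, 8):
        assert switch(c, g) > stay(c, g)
```

The assertion holds because the two expressions differ by a factor of (c+g-1)/(c+g-2), which is always greater than 1.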


El Harlo February 6, 2009

Yah, this is how I used to calculate the correct decision when I was watching this show, too. However, I’ve concluded that this isn’t right, for the following reason. The banker nearly always offers far less than the mean of the remaining values. In order to end the game, you have to either take the banker’s offer, or get to the last case. Taking into account both methods of winning, this means that at any stage in the game, the actual mean amount won by players in the same position as the current player is also significantly less than the mean of the remaining values. The conclusion then, is that some calculable percentage of the mean is the correct offer to accept. To find this out, you’d need to find the mean of the actual winnings over the course of the show.


Filmari Nunti March 4, 2009

Very nice post and very nice game indeed :)


D-Dickmann March 31, 2009

Made a $78,356 deal. There was $1,000,000 in my box though…


Ravi April 22, 2009

El Harlo, you are correct.
Not only that, I believe they are also misadvertising which is illegal.
Essentially the banker’s offer is always 2/3 to 1/3 of the mean (sometimes even lower).

However, once the player accepts the amount, the “bank would have offered you” figure jumps close to the mean (misadvertisement).

I just finished watching the half-million-dollar show. The contestant had the 500K still in play and 6 cases to go. The banker offered 27K. The player took it. Then she pointed to 2 cases she “would have opened had she said no to the offer”. That knocked about $600 off the board. The 500K case was still there. Now there were 4 cases left, and “the bank would have offered” came to 120K (very close to the average of 125K).

Hence the misadvertising. Somebody please sue and give me a cut.


Joe May 5, 2009

Hey, just in case anyone is interested, I made a little DOND “Expected Value” calculator that you can use to keep track of what the mean is while watching the show. There are two versions, one for the half hour version and one for the prime time version. Just un-check the numbers that are chosen as they are chosen and the expected value at the top should automatically update.
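The bookkeeping behind a tracker like Joe’s is simple enough to sketch (the class and method names are my own; the amounts are the standard 26-case US board, which is what produces the article’s $131,477.54 starting mean):

```python
# the standard 26-case US board
AMOUNTS = [0.01, 1, 5, 10, 25, 50, 75, 100, 200, 300, 400, 500, 750,
           1000, 5000, 10000, 25000, 50000, 75000, 100000, 200000,
           300000, 400000, 500000, 750000, 1000000]

class EVTracker:
    def __init__(self):
        self.remaining = list(AMOUNTS)

    def open_case(self, amount):
        """Un-check an amount as its case is opened on the show."""
        self.remaining.remove(amount)

    def expected_value(self):
        return sum(self.remaining) / len(self.remaining)

tracker = EVTracker()
print(round(tracker.expected_value(), 2))  # 131477.54 with all 26 in play
tracker.open_case(1_000_000)
print(round(tracker.expected_value(), 2))  # 96736.64 once the million is gone
```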




rich May 15, 2009

What people fail to realize is that it’s not a purely statistical decision you’re making. For example, let’s say there are 4 cases left (3 low + 400k) and the offer is 85k. According to pure statistics it’s not worth taking the deal. But if 85k means a lot to you in real life, it is worth more to you than risking the chance of losing that offer. If I were on the show I would take this deal every time. A 25% chance of something happening is NOT that low of an occurrence. For me, there would have to be at most a 10% chance of NOT getting a high amount for me to continue playing.


Eyal May 23, 2009

you are forgetting two things here (three actually)

first of all, it’s a GAME show, and that means people come to have fun (some come to win, but I think most come to have fun). taking the banker’s offer means stopping the fun.

Let me ask you this: say you were told you could take part as a contestant in the show, but all the winnings would go to charity. how much money would you pay in order to take part in such a show? if you are willing to pay even 1 dollar, that means you agree with me that winning money is not the only goal of a contestant on that show.

second, $100,000 is NOT twice as good as $50,000. it can be clearly seen in this simple game:

Given a single $50,000 chip and a double-or-nothing slot machine (say double-plus-one, or nothing), would you gamble away the 50K? if you win you get 100,001; if you lose, you get nothing (the mean value is a little above 50K, so statistically it’s a “safe” bet). Different people have different “value” curves on money.

third (and maybe most important): she wasn’t betting on holding the best suitcase, she was betting on opening a bad suitcase on her next move.

there was an 80% chance that the next suitcase she opened would hold less than the banker’s offer. that means there was an 80% chance that the next offer from the banker would be higher. That’s what she was gambling on.
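Eyal’s second point — that $100,000 is not twice as good as $50,000 — is the standard risk-aversion argument. Here is his slot-machine example under a square-root utility curve (the curve itself is my assumption, just one common choice):

```python
import math

def utility(dollars):
    # square-root utility: a common stand-in for diminishing returns
    return math.sqrt(dollars)

certain = utility(50_000)                            # keep the chip
gamble = 0.5 * utility(100_001) + 0.5 * utility(0)   # double-plus-one or nothing

print(round(certain, 1))  # 223.6
print(round(gamble, 1))   # 158.1
# the gamble's mean payout ($50,000.50) is higher, but its expected
# utility is lower, so a risk-averse player keeps the sure $50K
```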


fawad tanumand June 23, 2009

this is fawad tanumand. i would like to play this game because i want to win some money to bring my family to canada from Pakistan. i am originally from Afghanistan and now i am living in canada, in vancouver. i hope you will accept my request and give me a chance. thanks, i am waiting for a positive answer. my contact number is 604-725-7217


Matthew June 23, 2009

You can’t base your decision solely on the average dollar amount of the cases remaining in play. I tried doing just that in the online game and ended up with a measly $5,000 and change, when I could have settled for over $60,000 earlier in the game. A better strategy is to consider cases $100,000 and higher your “safety net.” If you only have one or two of those cases left on the board and five more cases to open, you’re best off accepting the banker’s current offer. The banker may be taking advantage of you by offering you a low-ball amount, but $60,000 beats $5,000 any day.


Carmen June 29, 2009

One thing I have never seen done … and I believe it would do a world of good for any contestant. Let’s face it, the stage pressure one gets coupled with the mathematical calculations one must make to see if they are in good shape to continue can be overwhelming, so the best thing to do is assume you hold the case with the LOWEST amount still on the board. This would easily make one see if the deal offered is better than most of the cases left to open … usually a good time to stop. Also, another strategy is to simply stop as soon as a 6-digit offer is made … few ever seem to get beyond that without going broke.


James July 10, 2009

Where the hell are all you guys playing this online? I went to the official DOND website and played it and the banker NEVER gave me an offer better than the mean in the 8 times I played. wtf?


Gaurav September 4, 2009

While I agree that most people who play the game play without reason and with emotion, your mathematical reasoning is flawed beyond belief. Please start trading for a wall-street firm and let me take the other side.

Firstly, the banker’s offers are in general pretty close to (and generally above) the expected value. According to your reasoning there should be no difference between playing the following 3 sets of possible outcomes: (0, 5, 295), (0, 100, 200) and (50, 100, 150). Arithmetic Mean/Expected Value only gives you one moment … it can’t capture all the characteristics of a distribution. There is a risk premium associated with volatility of outcomes … go read some more.

Secondly … there is a huge difference in strategies depending on whether you can play the game only once or countless times. Going by your logic we are all suckers for buying life insurance (since you always pay more than the expected value of the policy) … but the reason is that you only play once and the insurance company plays the game countless times … hence their strategy (selling insurance) is opposite to your strategy (buying insurance).

Thirdly, outcomes, while numerical (and linear) in nature, don’t have linear value to people … that function is person-dependent.


AgreesWithGaurav September 21, 2009

Gaurav is exactly right. The game is far more complicated than it appears on the surface.

While people love to bet small amounts of money for a mediocre chance to win a large amount, in reality the first dollar should be worth more than the millionth (who cares about a dollar when you have a million?). Also, the millionth dollar will have more taken from it in taxes, further adding to the complication of figuring out a person’s real “values” associated with each dollar amount.

Also, the number of cases you have to take away in the future makes a difference (though in general, since people tend to take the deals towards the end, not a ton of work needs to be done here — it is clear the last offers are the most important).

The person in the original example should probably take 55K, less than the mean, but what she is doing by continuing is risking her ~20K-80K dollars in order to win the 80-300K. Though the 60K is less than the 220K, the taxes increase substantially, and getting 80K is wayyyy better than getting 20K. (Also, the 20K was a GIFT.)


luke October 6, 2009

ya, i thought the monty hall problem applied to this game; bent my mind around it for about a half hour and realized it doesn’t apply due to the randomness of your choices. the key for it to be a monty hall scenario is nonrandom elimination of junk prizes.


Adrian October 18, 2009

Josh & Chris I read both your arguments above.

I would have to say that if we were excluding morality (which is never the case), Josh would be right: play on your odds, and determine the likelihood of the offer going up or down. But on the other hand, considering the fact that the majority of people who play the game are middle-class, you would have to strongly consider Chris’s more realistic approach and work toward “beating the mean”.


FredS October 20, 2009

“Also, another strategy is to simply stop as soon as a 6-digit offer is made … few ever seem to get beyond that without going broke.”

This has been my strategy, and it has worked virtually without fail every time the contestant gets a 6 figure offer. Of course, I don’t see every show so I have no idea of the average amount that has been won over the course of the program. Does anyone have that info??


Chris November 24, 2009

Watch it carefully. They almost NEVER offer more than E(x). But once the contestants take a deal and start picking cases, they’re blatantly lying to them, and all of a sudden the offers go way up in relation to E(x). The most obvious example I saw: someone was down to two (hypothetical) cases, say $500,000 and $750,000, and they told them they would have been offered $695,000. No they would not have. That’s an outright lie.


Jesus December 15, 2009

Hmmmmmm well you make excellent points. I am Jesus.


FredS December 15, 2009

If you are really Jesus, then you don’t need the money anyway.


andrew b December 16, 2009

“Jesus” is an actual name, oddly enough. It’s usually Hispanic in origin, and is pronounced “Hay-soos”…to avoid blasphemy.


Brandon January 18, 2010

I liked the article. We learned how the game is played, and everything your article discussed, in our class. Upon doing multiple mean calculations, it was observed that the banker will generally offer 1 good deal a game once <4 cases remain.


Jake January 18, 2010

This game is ridiculous. You have more of a chance of losing than winning. The only reason to play is that either way you get some cash.


Jake January 18, 2010

The article does have a good point though. Once you get an offer greater than the mean, you should take it, because there is nothing telling you what the next offer could be. You could use the probability of drawing a good or bad suitcase to estimate your next offer, but it is still not certain.


Brittany January 18, 2010

Like Brandon, this is homework for our AP Stat course. After learning this technique in Stat I knew where the article was going right from the start. Before it was mentioned in the article, I calculated the expected value and realized that this lady really was crazy for not accepting the banker’s offer of $80,000. Generally the banker offers less than the expected value, but in this case he offered significantly more. Also, the fact that she had an 80% chance that the money value in her case was less than the banker’s offer should have been a good indication to take the deal. I suppose under pressure people don’t think logically about decisions, and evidently this lady never learned about expected value. Maybe she thought her luck would beat the odds. Either way, the game is exciting; it’s a matter of logic versus risk.


Jesse January 19, 2010

“If no cases have been opened, then this value computes to approximately $131,477.54.” I feel as if you’re a winner just by being picked to be on the show. The odds are with you to win money of some sort: 100%. Just by playing the game you are guaranteed to walk out with more money than you walked in with. I’ll take that as a win.

Jake said, “This game is ridiculous. You have more of a chance of losing than winning.” This is only true if your idea of “winning” is winning more than the mean.

Either way, a contestant should take the deal once the banker’s offer is higher than the expected value. The math is only going to be used toward the end of the game, because people want to win the “big money”.


Tracey January 20, 2010

I really enjoyed the personal commentary throughout the article. My feeling is that it all depends on the goal of the player. Many people feel that any amount is more than they came with, so they will just play through on luck and ignore the statistics in the hope of walking away with the larger amount. Others will follow the statistics and stick to the rule to make sure they walk away with a larger amount, because they may have something in mind they wish to purchase. The fact of the matter is, most people aren’t up there with a calculator figuring out the expected value! Most people are running on pumping blood, nerves, luck, and pressure from the crowd!


Jake T January 21, 2010

I liked this article because it shows a smarter way to look at the game Deal or No Deal. The lady probably didn’t have any statistical support, but who knows — maybe she didn’t need the money that badly and wanted to have some fun and go for the largest amount possible. Then again, maybe she should not have been up there, because she failed to look deeper into any strategies of the game, which resulted in her getting a far lower amount than if she had taken the $80,000. Thanks to this article and stat class, I know what to do if I am ever in that situation.


Living in vol March 17, 2010

Taking any deal just because it is greater than the mean is definitely sound advice. But there is something much more important to consider, which is your personal level of wealth. Let’s face it: Deal or No Deal is pure, unadulterated gambling. Every offer from the banker is basically asking the contestant whether he wants to gamble with X amount of money. I’ve written up another approach to tackling the game, adding some economic theory into the mix…


AJ May 17, 2010

For those interested, in the original scenario the expected value of taking the deal is $80,000 (obviously, it’s guaranteed), whilst the expected value of no deal, after choosing one more case, is $61,731. Deal was definitely the best way to go.

And it’s important to remember that you can’t just play these shows by odds alone. The difference between $200K and $0 is a helluva lot more than the difference between $200K and $400K to the average person.


anton May 24, 2010

Is the banker’s rule in this game to take the mean at the start of the game ($131,477.54), then, if at any stage the player is above the arithmetic mean, offer a much lower number to deal, and if at any stage the player is below it, offer the equivalent of the arithmetic mean or lower?
What this essentially does is give the player a gut feeling that, at every stage, they are being offered an unfair deal, because they can see on the board the potential upside of carrying on.
When the average person is on the spot and under pressure, they can’t realistically calculate the arithmetic mean until they are down to the last few boxes. But what they can do is work out with gut instinct what they think is fair and what is not.
If they feel like they are being had, then they continue. The point is, the banker, like any casino, keeps you at the table. The longer you are there, the more chance you have of succumbing to the pressure and making the wrong decision or opening the wrong box. His job is to keep the contestant playing as long as possible and offer incentives to keep playing.


Molly Rose June 10, 2010

Correct me if I’m wrong, but you did not take into account the statistics of the immediate situation after turning down the banker. The stupid lady had five cases left, the mean of which was roughly $70,000, so yes, the banker certainly was being generous. However, if she picked anything OTHER than the $300,000, and she had an 80% chance of doing so, her mean would have gone up. If she only needs to pick one case before the next offer, she has an 80% chance of raising the mean above $75,000, and a 60% chance of raising it above $87,000. The banker would then, most likely, give her a higher offer if he’s continuing in his generous pattern; his offer certainly wouldn’t drop much. This is why even smart people often lose this game: it is easy to rationalize each step of the game.

Although the numbers are never precise, the game often comes down to the concept of choosing between 50% of the money and a 50/50 shot at all or nothing.

(I apologize if this has been covered in the comments already; I tried to skim them, but MAN there are a lot!)
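[Ed: Molly’s 80%/60% figures check out. A quick script (amounts from the article’s scenario) recomputes the mean after each possible elimination.]

```python
remaining = [100, 400, 1000, 50_000, 300_000]
total = sum(remaining)

# New mean of the board after each possible single-case elimination
means_after = {case: (total - case) / (len(remaining) - 1) for case in remaining}
for case, m in means_after.items():
    print(f"open ${case:>7,} -> new mean ${m:>10,.2f}")

# Four of five eliminations (80%) push the mean above $75,000,
# and three of five (60%) push it above $87,000.
```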


Michael Fox August 20, 2010

I’m surprised to hear about that offer. I’ve watched a lot of the show, and I’ve never seen the banker come in with an offer higher than the expected value of the remaining cases. Not one time.


X August 21, 2010

I don’t see why you needed to write a whole article to say “Find the mean of the numbers. If your offer is higher than that, take it.”


Christian November 17, 2010

This is stupid, in the UK the banker never offers more than the mean. It’s supposed to be his job!


Ens November 28, 2010

The arithmetic mean is only sufficient if it’s a foregone conclusion that the banker never, ever offers more than the mean. But that would make this whole derivation pointless!

When the banker can offer more than the mean, then the expected value has to take that into account or else you’re underselling your chances. If the banker sometimes offers more than the mean, and players play with your strategy (using the arithmetic mean as the expected value), then the actual expected value is greater than the arithmetic mean, which means players are using a suboptimal strategy because they are underestimating their expected value. Therefore, you should actually turn down offers that are slightly higher than the mean (but we cannot quantify how much higher they should be).

Now, if the banker were guaranteed to offer less than the mean, then considering the odds of the next offer being better (an 80% chance of knocking out something lower) is irrelevant, because it’s already accounted for in the mean. But if the banker can offer more, then it’s very relevant, because there are rational termination points earlier in the game. If the banker offers exactly the mean, then it’s only relevant inasmuch as it sets the terms of a perfectly fair gamble.

The problem is we can’t come to a conclusion because we don’t know how the banker calculates the odds.

That said, money has a diminishing return; there are various tax implications to consider; etc., all of which serve to depress the amount of money it should be worth before accepting. Thus a rational person with perfect information should in fact accept at a little less than the mean.
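[Ed: Ens’s point about diminishing returns can be made concrete with a toy certainty-equivalent calculation. Log utility is one common, but arbitrary, model of risk aversion; the cutoff it produces here is illustrative only.]

```python
import math

cases = [100, 400, 1000, 50_000, 300_000]
board_mean = sum(cases) / len(cases)            # $70,300

# Log-utility certainty equivalent: exp(E[ln X]), i.e. the geometric mean.
ce = math.exp(sum(math.log(v) for v in cases) / len(cases))
print(f"mean ${board_mean:,.0f}, certainty equivalent ${ce:,.0f}")
```

A log-utility player would take any sure offer above the certainty equivalent (here a few thousand dollars), which is dramatically below the mean; milder risk aversion gives a cutoff somewhere in between.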

Aside: that flash game? I got down to $25K and $50K, and the banker offered me $24.75K. That was a real wtf moment. You absolutely should never accept an offer lower than your minimum winning. I ended up with 50K bullshit-dollars because the digital banker was a moron. The next time, it offered me over $300K after the first round, well above the mean.


Dave T December 18, 2010

From a previous post: “You absolutely should never accept an offer lower than your minimum winning,” where I assume “minimum winning” means the arithmetic mean of the remaining dollar values.

Would you like to bet on that?

Let’s restate the process. We start with 26 boxes. Each box contains a unique dollar value from $0.01 to $1,000,000.00. The play starts with the contestant selecting one box. Then the player selects one of the remaining boxes…. blah blah blah…..

Assume now that 24 of the remaining 25 boxes have been opened, so there are only two unopened boxes. By the tote board, the two remaining dollar values are $0.01 and $1,000,000.00.

My question is: What are the odds that the box you originally selected has the $1,000,000.00 and what are the odds that the remaining box has $1,000,000.00?

Easy… right? The odds for each box are 50-50.


The odds for the remaining unopened box having the million are 25 to 1.

The odds of your box having the million will be 1 in 25.

I’d accept an offer from the banker of quite a bit less than $500K, wouldn’t you?


andrew b December 18, 2010

Ahem. Please read the above posts carefully, sir.


Dave T December 18, 2010

andrew b…. I’ve read the above posts carefully, and you know what?
You are absolutely correct. If there are two remaining boxes (the one the contestant picked first and the last of the 25 others), and you can tell by the tote board that the million is still unclaimed, the odds of either of the two remaining boxes having the $1,000,000 are 50/50.

But… Ahem…. IF and ONLY IF Howie takes the two remaining unmarked boxes, walks into a room containing a person who has not been watching the game, and (assuming the boxes are identical) tells that person he is free to pick EITHER of the two boxes, then that person would have a 50/50 chance of winning a million.

On the other hand, if that person, who knows the rules of the show, is escorted to the stage and given the choice of picking either the box the contestant picked first or the remaining box with a good-looking woman standing beside it, he should ALWAYS pick the box by the woman. The odds of her box containing the $1,000,000 are 25/26. The odds for the box that the contestant picked out of the original 26 are 1/26.



Shawn December 18, 2010

Dave T: No, not correct. Whether the person knows there were originally 25 boxes or only knows that there are two makes no difference. Once you’re down to two boxes, one of which contains the million and one does not, we have no information about where the million is more likely to be.

If you’d like to test this, you can. Get some playing cards. Take, say, five of them (you could use 26, but it would be tedious). Pick one of your five to be the “million”. Shuffle them. Set one aside (the contestant’s card). Then start picking cards at random and turning them over. If you hit the “million-dollar” card, reshuffle and start over. This will happen about 3/5 of the time. About 2/5 of the time you’ll get down to only one card (plus the contestant’s card), and then you have the scenario discussed. See which of the two is the million-dollar card and record it. Repeat this process a number of times, say 10. Do you find that the contestant’s card is the million-dollar card only one time in five? Or is it about half and half?

It’s easier to simulate this on a computer, but informative to do it by hand. Make sure you shuffle well.
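[Ed: Shawn’s experiment is easy to script. A sketch in Python, run with the full 26-case board and conditioning on the million surviving to the final two cases:]

```python
import random

def trial(n_cases=26, rng=random):
    """One play-through: the contestant keeps case 0 and opens all but one of
    the rest at random. Returns None if the million is revealed along the way,
    otherwise True/False for whether the contestant's own case holds it."""
    million = rng.randrange(n_cases)          # where the million actually sits
    others = list(range(1, n_cases))
    rng.shuffle(others)
    opened, last = others[:-1], others[-1]
    if million in opened:                     # million knocked out: discard trial
        return None
    return million == 0                       # contestant held case 0

random.seed(1)
kept = [r for r in (trial() for _ in range(200_000)) if r is not None]
print(f"trials reaching the final two: {len(kept)}")
print(f"contestant held the million:   {sum(kept) / len(kept):.3f}")   # ~0.5
```

Unlike Monty Hall, the cases are opened blindly, so conditioning on the million surviving leaves the two finalists equally likely, which is exactly what the simulation shows.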


Dave T December 18, 2010

Sigh…. you’re correct. Thank you.


Lance Emerson March 5, 2011

Here is a selection algorithm I came up with….
Just a theory with some frequency to be determined…
Good Luck = Knowledge!

Best Regards,
Lance Emerson


Derek Sharp July 24, 2011

Your statistical analysis of her situation is a little bit flawed. I would submit that with five cases left in the game, (i.e. four remaining that can be opened) her worst case scenario (i.e. the $300K not being in her case) would mean a 75% likelihood of opening a new case which would be lower than the $80,000 offer, and therefore raise the value of the next offer. I would take a 75% chance of increasing my money, too. Those odds are staggeringly in the player’s favour to keep playing in that scenario, and the next case being in the 25% loss category was a tough bit of bad luck.


Abigal March 19, 2012

Hi, I like the way you explained it, I finally understand how it works! Please may I see your coding for the game, thanks a lot.


Cooper April 28, 2012

I am not sure whether someone commented similarly earlier on (I didn’t read all of the comments), but there is actually a different, riskier way of looking at how you play this game. The “banker” offers you an amount that is similar to the mean, with some fluctuation. The chances of the woman on the show picking a box below the offer of $80,000 were 4 out of 5, or 80%. This means the probability of her next offer being higher was very high. There is no reason to call her an idiot, because by your way of thinking, a player will never be able to maximize their profits.


Rikki Morley January 20, 2013

OK, I don’t know if anyone else has written this here yet, but I’m from England, where the show originated, so I’ve had many, many years of watching it: the banker quite simply does not play a game of mean averages. He has never once gone above the mean. For instance, one person was left with the jackpot of 250 grand and a mere 5 grand. The mean there was about 130 grand; the banker offered her 40 grand. He is far more “notorious” than the US one and is very ungenerous and unforgiving. His simple reasoning is that people in general do not wish to risk 40 grand for the hope of 250 grand, because 40 is so big. As it happens, on that occasion she risked it and became our second jackpot winner. But I don’t play it by the mean; I play it by “quit when the money is enough to do what you wish, or what makes you happy,” because then, no matter what, you cannot have any regrets :)


Kyle Mart February 14, 2013

The fact that the banker’s offer is likely to go up after rejecting the deal distorts the probabilities you get from just looking at the mean. By saying no deal in this situation, she rejects $80,000, which is $9,700 more than $70,300. But given that she succeeds in not picking the $300,000, her next offer might have been even higher. For example, with $100, $400, $1,000, $50,000, and $300,000 left, say she said no deal and happened to pick the $1,000. What would be left is $100, $400, $50,000, and $300,000, and the average is now $87,625. The banker might then decide to offer $122,675. The new offer is higher than the previous offer and 40% greater than the new mean. By saying no deal she has opened herself up to higher EV. This of course assumes the banker’s next offer will exceed the new mean by a larger margin.

To make this make more sense say the banker would offer $120,000 next no matter what case is revealed unless she picks $300,000 then the banker offers $21,000. In EV terms she has the following given her two choices:

No Deal now and accept next immediate offer: (4/5)*120,000 + (1/5)*21,000 = $100,200

Accept Deal: $80,000

As you can see, $100,200 > $80,000, therefore there is reason to say no deal.

Even after she says no deal there is still the option of saying no deal again on further rounds that might include more +EV situations.

P.S. Also note that I am not saying that picking “No Deal” is the correct decision; I am just saying there is a mathematical rationale for rejecting the offer.
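[Ed: Kyle’s arithmetic can be written out directly. The $120,000 and $21,000 offers below are his hypotheticals, not show data.]

```python
p_safe = 4 / 5                         # chance the next opened case isn't the $300K
offer_if_safe = 120_000                # hypothetical next offer (Kyle's assumption)
offer_if_burned = 21_000               # hypothetical offer after losing the $300K

# Expected value of saying "no deal" and then accepting the next offer
ev_no_deal = p_safe * offer_if_safe + (1 - p_safe) * offer_if_burned
print(ev_no_deal)                      # 100200.0
print(ev_no_deal > 80_000)             # True: under these assumptions, no deal
```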


Deal or No Deal Lover June 14, 2013

I have a winning strategy.
1) Calculate the odds of getting a better offer by counting the number of briefcases that, if eliminated, would increase the average value of the briefcases.
2) Calculate the odds that the briefcase you hold does not contain more money than the banker’s offering.

If either of these odds is too low, don’t accept the banker’s offer.
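[Ed: the two quantities in this strategy can be estimated with a few lines. This is my interpretation of the comment, treating the held case as equally likely to hold any remaining value.]

```python
from statistics import mean

def strategy_odds(remaining, offer):
    """Sketch of the two odds: 'remaining' includes the contestant's own case."""
    m = mean(remaining)
    # 1) eliminating any briefcase worth less than the mean raises the mean,
    #    so this is the chance a random reveal improves the board average
    p_better_board = sum(v < m for v in remaining) / len(remaining)
    # 2) the held briefcase is equally likely to be any remaining value
    p_case_below_offer = sum(v < offer for v in remaining) / len(remaining)
    return p_better_board, p_case_below_offer

print(strategy_odds([100, 400, 1000, 50_000, 300_000], 80_000))   # (0.8, 0.8)
```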


Joseph B May 27, 2014

This is an amazing discussion and I stayed up way too late reading through it. Even though it’s long dead, I wanted to add a few thinking points for the next guy or girl who might come along thinking the Monty Hall Problem applies.

Point 1: Why does the million-dollar case get special treatment? In any proposed scenario where the million-dollar case seems more likely to be in a particular place than any other non-eliminated case, its prominence in that scenario has clearly been artificially inflated. To test this, pretend you’re hoping for a different case and run the same scenario. The new “target” case will now be especially prominent. How can that be? Just by hoping for it, it now appears? Very suspect…

Point 2 (which might explain point 1): The odds that the contestant picked the “target” case are a solid 1/26 because in that scenario, the “target” case is GUARANTEED to make it to the final 2 (assuming, as we are, that no deal is to be made). The odds that the “target” case is on the other side, though much higher to start with, aren’t nearly so solid. They start at 25/26 but in many scenarios, the “target” case is ELIMINATED before reaching the final 2. These instances cannot be discarded or ignored, they must be factored in to the odds. I believe all such instances would appear as 0/26 on the other side’s scorecard. After crunching the numbers, these unfavorable “elimination” scenarios drag the other side’s odds down to… 1/26.

After writing that out, it doesn’t actually seem to offer anything new. Perhaps it’s at least a slightly different perspective, though. Getting this to “click” seems to take different thinking for different folks.


Reading through, it seems like some possibly great and certainly interesting minds have come before me. I feel a strange sort of honored to be a part of it, however belated and irrelevant my part is.
*deep thoughts*

Anyway, 6AM greetings from Australia! (where the top prize is $200,000, FYI)


David December 11, 2014

Great commentary! I am a math novice, but capable. What would be the best position to be in when competing in the final spin on the game show “The Price Is Right”: 1st, 2nd, or 3rd? I would assume 3rd, to take into consideration the predetermined amounts you have to beat from the first two contestants. Am I missing something?


Betty March 12, 2015

Impressive analytical skills. @David, from my assessment, the chances are more nearly equal than different. Go with your conviction: pick one and stick with it.


Vivian March 13, 2015

I won the million! I had the 1 dollar case and the 1,000,000 case, and I stayed with my case and won a million :)


DJ June 29, 2017

Mostly right but slightly flawed: it does not take into account the likelihood that the banker’s offer will increase after the next case opening, or the standard deviation. The Sharpe ratio is a better metric.


bill soriano January 21, 2018

I play this game at home with my granddaughter, and the only advice I give is this: when I am down to just two suitcases, one with a lot of money and one with a little, I never trade in my suitcase. I always keep my original suitcase, and I always come out on top.

