Urn A contains exactly 50 red balls and 50 black balls.

Urn B contains 100 red or black balls, but you don’t know the relative quantities. It might be 50-50, 0 red and 100 black, 100 red and 0 black, or anything in between.

You are offered two tickets. Ticket A pays $100 if a red is drawn from Urn A. Ticket B pays $100 if a red is drawn from Urn B. Which would you be willing to pay the most for?

This is one instantiation of the Ellsberg paradox. A moment’s thought should show that your probability of drawing red should be 50% (1:1 odds) in both cases. But there is an intuitive tendency to prefer the wager in which the “dynamics” of the problem are known (Ticket A).
1. You would feel stupid if you chose Ticket B (the unknown urn) and it turned out that there were only black balls in Urn B. You have a strong preference not to feel stupid, so you’re willing to pay more for Urn A, which is guaranteed to be a “fair” urn. [2]
2. You have been conditioned to think of a probability as a property of a situation, rather than a property of an epistemic state. (This is encouraged by our conventions of language, as in “the probability of rain tomorrow is 20%,” which uses the definite article “the” and thus implies a single uniquely correct value of probability, independent of what anybody knows about it. I prefer to phrase these things as “I give odds of 4:1 against rain tomorrow.”) For this reason, you feel that you do not know the probabilities for Urn B, and so you do not wish to bet on it.
3. You are wary of being tricked by whoever is holding the draw into betting on a lame horse. An urn with unknown quantities of red and black seems like a potential trick.
If I am right, then only one of these objections is defensible (number 3). But together they seem to me to do a half-decent job of explaining away this intuition.
“I don’t know. How long is a piece of string?”
“What about Laplace’s rule of succession? (s+1)/(n+2). Defining success as toothpaste and non-success as non-toothpaste, we get (2+1)/(3+2)=3/5 probability of toothpaste.”
“Yeah, I’m not sure that that is even applic - SQUIRREL!”
“Pay attention! How much would you be willing to pay for a ticket that paid out $10 if toothpaste was drawn?”
“I dunno, maybe I’d give $2.”
“Okay, so that implies your odds are 4:1 against toothpaste.”
“I think that reflects the triumph of curiosity over thrift, more than it does any real probability judgment. I would not pay $200 for a $1000 ticket. Or maybe I would.”
“Look, just answer this: how likely is toothpaste? You can see that 2 out of 3 things pulled out of the bag have been toothpaste. That is evidence that toothpaste is common in the bag.”
“If you say so. Do we even know that this is a random draw? Maybe the guy draws whatever he wants to, and that depends on how I bet. Why is he even performing this draw? He’s probably trying to trick me.”
“Trick you into betting for toothpaste, or against toothpaste? By the way, what’s wrong with the Laplace’s rule approach, again?”
“I don’t know. Please go boil your head.”
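For the record, here is the rule-of-succession arithmetic from the dialogue as a minimal Python sketch (the function name is my own):

```python
from fractions import Fraction

def rule_of_succession(successes: int, trials: int) -> Fraction:
    """Laplace's rule of succession: P(next trial succeeds) = (s + 1) / (n + 2)."""
    return Fraction(successes + 1, trials + 2)

# Two of the three objects drawn from the sack so far were toothpaste:
print(rule_of_succession(successes=2, trials=3))  # 3/5
```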
Since in the described scenario the money I bet would be dream money, I'd bet everything I had in my dream pocket. I know it's a dream because the jellyfish the guy conjured from the sack was alive... ;-)
Cheers
Chris
The best bet is not to bet at all!
Unless the bet is a Dutch book! (See Elga's paper.)
Unless the bet is absolute.
Probability has led science terribly astray.
They are so lost in their own self-made uncertainty that only the light of truth will show them the Way. But where is that truth, philosophy? Have you found it yet?
Mankind has been waiting since Socrates and the boys, over 2000 years.
Truth, anyone?
It's time to straighten out not only science but also the grey area of justice, philosophy, as well as the multitudes of the faithful.
Truth, or the absolute, is much simpler than thought.
Perhaps one day you will allow me to show you the Way.
I'll bet the Universe on it!
My point above is that, the way the problems in the post are phrased, IMO the solutions are obvious. The toothpaste/jellyfish situation is impossible (neither empirical nor logical uncertainty), whereas in the Ellsberg paradox case the reasonable real-world assumption is your #3 - i.e. you have nothing to gain if the test is fair (both are 50%) but you lose if it's a con, so the rational choice is Urn 1. (In a con, the sum of probabilities actually is <1.)
If you rephrase the bearded guy scenario to something that is actually possible, it becomes a variant of Bilbo's riddle.
>...in the Ellsberg paradox case the reasonable real-world assumption is your #3 - i.e. you have nothing to gain if the test is fair (both are 50%) but you lose if it's a con, so the rational choice is Urn 1.
The thing that bugs me about the "it's a con!" answer is that you don't know *in which direction* it's a con - black or red. So the situation is more or less the same as the 50%-50% draw.
Obviously the con is always in the direction that makes you lose (*)
You wrote:
"Suppose we keep everything the same, but Tickets A and B pay out if a black ball is drawn. Symmetrically, Ticket A would still be preferred to Ticket B, since Ticket A is less “vague.” But then your inferred probability for “red from Urn B” must be less than “red from Urn A,” while at the same time, “black from Urn B” is less than “black from Urn A.” Since the probabilities from Urn A must sum to 100%, this means your probabilities from Urn B are summing to less than 100%."
But this is an error: The scenario where red pays off lies in a different probability space than the one where black pays off, and probabilities from different probability spaces don't have to add up to 1.
(* In principle, it is also possible that the con is in your favor. For example, if you know that the people running the lottery have an interest in making you win, it would be rational for you to prefer Urn B. But barring such prior knowledge, the scenario where the con is against you dominates.)
An "empirical" approach to providing solutions to problems like this is to write a simulation program (in this case, of random bearded men with sacks), a.k.a. a Monte Carlo method. This is done, for example, for the Monty Hall paradox. Then see what data come out. To be cool, use a real random-number generator (e.g. www.fourmilab.ch/hotbits/).
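For a game whose rules are fully specified, this is straightforward. Here is a minimal Monte Carlo sketch of the Monty Hall paradox in Python (assuming the standard rules: the host always opens an unchosen door that does not hide the car):

```python
import random

def monty_hall_trial(switch: bool) -> bool:
    """Play one Monty Hall game; return True if the player wins the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # The host opens a door that was not picked and does not hide the car.
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

trials = 100_000
for switch in (False, True):
    wins = sum(monty_hall_trial(switch) for _ in range(trials))
    print(f"switch={switch}: win rate ~ {wins / trials:.3f}")
# Expected: ~0.333 when staying, ~0.667 when switching.
```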
You can't write a simulation unless you know the rules (logical certainty). In neither the sack nor the urn examples do we know the rules.
You might say that we know the rules in the urn example, but Ian didn't specify how the population of red or black balls is chosen - he just said there is some number of red and some number of black. We don't get to assume that any possible mix is equally likely.
Ian,
Very good article. I've run into the Ellsberg paradox before. The jellyfish was new to me and I think you summed it up nicely. I particularly liked the comment, "Why is he even performing this draw? He's probably trying to trick me." -- which never gets discussed much during proposition bets. Much of the Ellsberg paradox has to do with people preferring a "sure thing" vs uncertainty (e.g. preferring $5 for certain over a 50-50 chance at $10). I think this is related to trust. If you were to offer people a $20 bill in exchange for a $10 bill, I think most people would at least hesitate -- thinking that you must be up to something (possibly giving them a counterfeit bill, or just trying to get them to bring out their wallet so that you can grab it and run away). But the "trust" issue is never discussed much.
Also not discussed much is the utility of the dollar/ruination: would you be willing to gamble with Bill Gates at $100,000 per trial -- even if Gates gave you favorable odds?
Regarding your note #2 ("to whom is it unfair?"), wouldn't the answer depend upon the method used to load the names? For example, if the urn was loaded in alphabetical order of last name with the Zs on top, and the urn was insufficiently shaken, then it would be unfair to people whose last names begin with "Z".
But I get the point: if you don't know anything about the bias (in which direction the probability was tilted), then a randomly directed bias should be considered as "fair" as a non-biased draw.
>Much of the Ellsberg paradox has to do with people preferring a "sure thing" vs uncertainty (e.g. preferring $5 for certain over a 50-50 chance at $10). I think this is related to trust.
Good point. You might be right that it's related, in part, to trust - but if you read Kahneman you'll see that risk-aversion generalizes to essentially all human decision making, whether or not it involves interactions with others (and hence trust). For example, I am very risk averse in my fiction reading - I usually stick to just a few good authors because it annoys me so much when I spend time on a mediocre book.
However, risk aversion may be a matter of a heuristic that evolved for reasons to do with trust, and was then more universally applied... I'd have to think about it more.
>Also not discussed much is the utility of the dollar/ruination: would you be willing to gamble with Bill Gates at $100,000 per trial -- even if Gates gave you favorable odds?
Right, absolutely. As you say, accounting for the marginal utility of money solves that problem. On the other hand, I do think that humans are inherently just too risk-averse, even accounting for marginal utility of dollars. Kahneman gives the example of a bet on a coin flip of $100 at 2:1 odds, which most people pass up (unless it is repeated). He recommends the technique of reframing each bet as one item in a "portfolio" of investments, which works well for me.
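To illustrate why the portfolio framing helps, here is a small simulation sketch (the numbers are my reading of Kahneman's 2:1 coin flip: lose $100 on tails, win $200 on heads; the 20-flip portfolio size is my own assumption):

```python
import random

def portfolio(n_flips: int) -> int:
    """Total result of n independent flips: +$200 on heads, -$100 on tails."""
    return sum(200 if random.random() < 0.5 else -100 for _ in range(n_flips))

trials = 100_000
in_the_red = sum(portfolio(20) < 0 for _ in range(trials))
print(f"P(20-flip portfolio loses money) ~ {in_the_red / trials:.3f}")
# A single flip loses half the time; the portfolio loses only ~6% of the time.
```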
Ian ~ When you say "read Kahneman", are you talking about the book "Thinking Fast and Slow", or some paper?
Haven't had a chance yet to read Frank Ramsey -- but it's on my list!
BTW, thanks for mentioning "Laplace’s rule of succession". I had been trying (unsuccessfully) to remember that in connection with another thread regarding extinction/how long things last -- it is what this old feeble brain was trying to recall!
I think this is all in "Thinking Fast and Slow," although I'm sure there are other papers containing it as well.
Hi Ian,
Yes, this seems satisfying!
I enjoyed your article and agreed with pretty much everything you said. Before reading your alternatives for resolving the paradox, I had pretty much decided on your explanation [3], so I was disappointed to see that your inclusion of it took away any opportunity for me to add anything to the conversation.
If I had to nitpick, I'd say that explanation [3] is just a special case of explanation [2]. You don't know the probabilities for Urn B because you don't know the process by which the balls were chosen, and it is possible that this process was designed to cheat you.
Thanks, glad this approach works for you!
>If I had to nitpick, I'd say that explanation [3] is just a special case of explanation [2]. You don't know the probabilities for Urn B because you don't know the process by which the balls were chosen, and it is possible that this process was designed to cheat you.
Ah, but in the case of explanation [2] (without cheating), your ignorance about the process is *symmetrical* with respect to how many balls there are of each colour.
To see this, suppose I outright *told* you that Urn B was filled with either 100% black or 100% red. It still seems to me that Ticket A is exactly as good as Ticket B.
>I outright *told* you that Urn B was filled with either 100% black or 100% red. It still seems to me that Ticket A is exactly as good as Ticket B.
My point is that without knowing the underlying probabilities of either alternative, you are not justified in assuming that either is equally likely. It may be that black balls are more common than red ones or that the distribution is skewed for other reasons.
Suppose I outright told you that I was holding behind my back either a baby velociraptor or a stuffed toy, and offered to let you bet me that it was the velociraptor?
[3] is a special case of [2] because we don't have the option to choose whether we bet on red or black; instead we are offered the chance to bet on a red ball. Since by [2] we don't know the underlying rules or probabilities, the urn may well be much more likely to contain black balls, because that would benefit the party who set up the bet [3].
@ D. Me:
"If you call a tail a leg, how many legs has a dog? Five? No, calling a tail a leg don't make it a leg." -- Abraham Lincoln
I never liked that quote from L. because we are ASSUMING, for purposes of argument, that a tail can be defined as a leg.
Likewise, in probability problems we are told things. Can you trust them? In mathematical problems, you do -- you assume they are golden. In real life, it is another matter.
So, I think Ian is saying SUPPOSE we can assume with absolute confidence that, for purposes of argument, Urn B DOES contain 100% Red or Black.
Ian is not saying a tail IS a leg, he is asking what would result IF we knew it was a leg.
As you indicate, what someone tells you in real life may or may not be true. That's why I raised the "trust" issue. We need to distinguish between assumptions to be regarded as absolutely true for purposes of argument, and statements made that may or may not be true.
Tom:
I think you misunderstand my point.
Ian did not say 100% red or 100% black, with equal probability. He said 100% red or 100% black (probability undefined).
You can also assume I'm telling the truth in the velociraptor/stuffed toy analogy. I'm unlikely to be holding a velociraptor, but as long as I have a stuffed toy I didn't lie. In that analogy, we might assume that velociraptor is unlikely because we intuit that the prior probability is low. In contrast, the prior probability for red or black balls is assumed to be equal by Ian, but this may not be the case, especially if the urn-preparer wants to trick us.
D. Me: Point taken. I generally assume equal probability, but that was not explicitly stated.
Your assumption is pretty reasonable to be honest. I was nitpicking!
DeleteMost of the choices we (as individuals) make are single events - not choices that will be repeated over and over in identical circumstances. Yet probability-talk assumes that some given event, like rolling dice, will be repeated over and over. If so, then long-run frequencies are a good guide as to what the outcome will more often be. But if I'm only going to make a given choice ONCE, then why does what would happen IF I made similar choices often have any relevance at all?
>Yet probability-talk assumes that some given event, like rolling dice, will be repeated over and over.
That is true according to the frequentist camp in the philosophy of probability, but not according to, e.g., the Bayesian camp (to which I am sympathetic). Most of us are perfectly happy to talk about the probabilities of one-off events.
I'm not sure I agree about the Ellsberg "paradox" being a paradox. The typical behavior might turn out to be rational. In 1956, J.L. Kelly developed the Kelly criterion, which provided a way to calculate the percentage of your bankroll that you should bet to maximize return in the long run without going bankrupt (it assumes infinitely divisible currency, as a technicality). Say the tickets were selling for $49 each, which gives you a $1 edge (if the edge is 0, there is no point in playing, and if the edge is negative, you should take the other side if that choice is available -- buying put options, for example, if you translate the problem to the stock market). And say you can buy not just one, but as many tickets as you wish. The Kelly criterion would tell you how many tickets to buy, based on how much money you have, for the first urn. Playing that same percentage of your bankroll (adjusting the amount of the bet up or down as your bankroll rises and falls) will give you a maximum return over multiple plays. For the second urn, however, if the urn turns out to have more black than red balls, and if you play the game long enough, your bankroll will experience an average downward trend. So, though I haven't worked it out, I suspect the uncertainty would overpower the edge, and the Kelly calculation would advise staying out of the game involving the second urn.
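For concreteness, a rough sketch of that calculation for the first urn, using the standard Kelly formula (the $49 ticket price and $100 payout are from my example above):

```python
def kelly_fraction(p_win: float, net_odds: float) -> float:
    """Standard Kelly formula: optimal fraction of bankroll to stake.

    net_odds is the profit per dollar staked on a win (the usual b).
    """
    return p_win - (1.0 - p_win) / net_odds

# Urn A: a $49 ticket pays $100 on red, so a win profits $51 per $49 staked.
b = 51 / 49
print(f"Kelly fraction: {kelly_fraction(0.5, b):.4f}")  # ~0.0196, i.e. stake ~2% of bankroll
```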
Yeah, I use the Kelly criterion when I make bets, although it makes me nervous - there is no risk aversion built into it!
>So, though I haven't worked it out, I suspect the uncertainty would overpower the edge, and the Kelly calculation would advise staying out of the game involving the second urn.
I don't think so. Remember that as you are playing the game, you should be continuously updating your "edge" - aka your estimate of the black/red ratio in Urn B (the easy way to do this is Laplace's rule of succession, mentioned above). So you *start* with a uniform prior that gives P(ball_1_red)=0.5, but if your first 5 balls are black, then by LROS, P(ball_6_red)=(s+1)/(n+2)=(0+1)/(5+2)=1/7.
Translating that into the language of the Kelly criterion, your "edge" should be changing with every draw. Realistically, after drawing even 1 black ball, Urn A becomes the better bet.
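To make that concrete, here is a minimal sketch of the updating (the run of five black draws is just the example above):

```python
def p_next_red(reds_seen: int, draws_so_far: int) -> float:
    """Laplace's rule of succession: P(next ball red) = (s + 1) / (n + 2)."""
    return (reds_seen + 1) / (draws_so_far + 2)

# Uniform prior before any draws, then five black balls in a row:
for n in range(6):
    print(f"after {n} black draws: P(next red) = {p_next_red(0, n):.3f}")
# 0.500, 0.333, 0.250, 0.200, 0.167, 0.143 (= 1/7)
```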
First of all, I think that Kelly strategy does embed risk, in the sense that placing larger-than-Kelly bets is counterproductive to long-term returns. Even placing Kelly bets can result in very wild swings in bankroll, although expected returns are maximized. Fractional Kelly bets are better (depending on how you quantify "better"), so the Kelly criterion should be seen as an upper bound.
The key in my thinking is that the second urn is akin to a volatile investment like the stock market, and the first is akin to a stable investment like a certificate of deposit. An urn having less than 50% red balls is analogous to a bear market, and an urn having more than 50% red is analogous to a bull market. When you buy stocks, you aren't certain which type of market lies ahead, just as you don't know how many red balls are in the second urn.
The connection to Kelly strategy is that, if I recall correctly, a more volatile outlook encourages betting a smaller fraction of one's bankroll. The generalized form of Kelly strategy is to maximize the expectation of the logarithm of the outcome (instead of the expectation of the outcome), because successive return ratios are multiplied rather than added (e.g., gain 20%, then 20% on top of that, compounds to 44%: 1.2 * 1.2 = 1.44). But the closer a ratio is to zero, the more quickly the logarithm grows in the negative direction (gaining 20% then losing 20% is better than gaining 50% then losing 50%). It's asymmetric. In the Kelly strategy for the second urn, the terms representing fewer red balls (more expected losses) will have a more pronounced effect than the corresponding gain terms (more red balls).
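A quick numerical check of that asymmetry:

```python
import math

# Symmetric percentage swings always lose ground, and larger swings lose more:
print(1.2 * 0.8, math.log(1.2) + math.log(0.8))  # 0.96, log sum ~ -0.041
print(1.5 * 0.5, math.log(1.5) + math.log(0.5))  # 0.75, log sum ~ -0.288
```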
Consequently, for a bet on the second urn to look favorable, higher payoff odds must be offered than for the first urn.
I understand your point about updating the edge as you go. Strictly speaking, you wouldn't apply Kelly strategy to a single bet. However, I don't think that diminishes the fact that the volatility makes urn B a less attractive bet on the first play -- unless the payoff is higher than urn A's. And the problem as stated was about placing a single bet, not evaluating the long-term prospects over repeated bets.
On the other hand, given a single bet, there are only two possible outcomes: lose your bet, or win the payoff amount. A choice of urn A or urn B is equally likely to produce a given outcome on a single draw. On reconsideration, my argument only applies to this scenario: You are buying N tickets (N > 1) for either Urn A or Urn B, and the N tickets do not apply to a single draw from that urn, they apply to a consecutive series of draws (with replacement). You must buy all N tickets prior to the first draw.
In this scenario, Urn A is unlikely to produce N red balls in a row, or N black balls in a row. If N is large enough (say, N=20), the probability of either of those events is much less than 1%. But Urn B has at least a 1% chance of producing the all-red outcome, and at least a 1% chance of producing the all-black outcome. So Urn B is more volatile in this scenario.
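A rough simulation of this scenario (assuming, for illustration only, a uniform prior over Urn B's red fraction - which, as discussed above, the problem does not actually license):

```python
import random

def p_all_red(urn_b: bool, n_draws: int = 20, trials: int = 200_000) -> float:
    """Estimate P(n_draws consecutive reds), drawing with replacement."""
    hits = 0
    for _ in range(trials):
        # Urn A is fixed at 50% red; for Urn B we ASSUME a uniform prior
        # over the red fraction, which the problem never specifies.
        p_red = random.random() if urn_b else 0.5
        hits += all(random.random() < p_red for _ in range(n_draws))
    return hits / trials

print(f"Urn A, 20 straight reds: {p_all_red(urn_b=False):.6f}")  # ~0.5**20 ~ 1e-6
print(f"Urn B, 20 straight reds: {p_all_red(urn_b=True):.6f}")   # ~1/21 ~ 0.048
```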
Ian,
> The thing that bugs me about the "it's a con!" answer is that you don't know *in which direction* it's a con - black or red. So the situation is more or less the same as the 50%-50% draw.
The point about "it's a con" is that you don't have to know - sleight of hand will always ensure that it is not in your favor. That's the whole point of a con. In 3-Card Monte you always lose, no matter which card you pick, and that's why the sum of probabilities is less than 100%.
In other words, what you seem to think of as situation 3 is not a con, it's a flawed setup. (A different distribution than promised.) Why would a con artist cheat in your favor?
Once you deal with people, it is a rational choice to distrust their motives. A bet with undisclosed odds triggers the BS filter, and if you have ever seen an Apollo Robbins act, you know what sleight of hand can do. (If you haven't, google him, it's amazing.) No, Urn 1 is definitely the rational choice.
Cheers
Christian
>The point about "it's a con" is that you don't have to know - sleight of hand will always ensure that it is not in your favor. That's the whole point of a con. In 3-Card Monte you always lose, no matter which card you pick, and that's why the sum of probabilities is less than 100%.
Good point.
>No, Urn 1 is definitely the rational choice.
In real life, yeah, I think you're right. But if we stipulate away the possibility of cheating, so that in the constructed "logical environment," the ratio of balls (whatever it is) has nothing to do with which bet is being offered to you, then I'm still indifferent between the Urns.
I'll check out the Apollo Robbins acts I can find online... looks interesting!
Delete> In real life, yeah, I think you're right. But if we stipulate away the possibility of cheating, so that in the constructed "logical environment," the ratio of balls (whatever it is) has nothing to do with which bet is being offered to you, then I'm still indifferent between the Urns.
No argument in the logical environment; and if you got a sample of readers of LW or similar to do the experiment, you would probably get an indifferent result. My point was more that the vast majority of people won't think about it in logical terms, but rather as a real-life setting, which is why it's not a paradox. (As opposed to most of the experiments Kahneman refers to.)
I'd wager ;-) that even here on RS a majority would choose Urn A...
Chris
P.S. One of the things I sometimes do in the course of my job is design questionnaires. Asking questions about the Ellsberg situation is something I would try to avoid as far as possible - the answers you get will be extremely sensitive to wording.
P.S. Sorry, I overlooked one of your answers. Even if you "told" me - there is a non-zero chance that you are lying and will do a switcheroo after I choose, and the probability that the switch is not in my favor is higher than the probability that it is.
I agree with your lack of preference in Ellsberg's paradox, Ian. I disagree with the commenter who said that it's wrong to assume the odds are the same; that seems like a perfect example of an intelligent Bayesian prior.
For the stranger with the toothpaste, I don't think there are any sensible priors to use. Actually, I've got to weaken that statement. Statistically, the odds are greater that toothpaste was chosen twice because it was a more likely choice, more highly represented in the population. A blind prior (only considering results, not considering any real physical and psychological drivers of the situation) would favor toothpaste. A smarter prior, consisting of everything in the universe that could possibly (or even likely) be in the man's bag, would give us too big a field for a meaningful selection. I'm comfortable refusing to justify a choice.
>I don't think there are any sensible priors to use.
I agree, none that we know of. Conceivably, the bearded man's best friend might know more about why he's offering such weird bets, and have better priors.
In the Ellsberg case you can (arguably) apply a principle of indifference. Even if the urn is biased, you've no reason to think it's more likely to be biased in favour of black than red.
In the jellyfish case you're probably not indifferent between all the possible things the stranger may draw out of his bag. Given human nature and the nature of physical reality, some possibilities seem more likely than others. Perhaps more important, there isn't a well-defined set of possibilities over which we can have a probability distribution.
I think Elga is mistaken. Arguing against "the sequence proposal", he argues as if we can make three judgements of rationality in a two-bet scenario: about bet A, about bet B, and about the combined sequence. But Sally is making only two decisions, so there are only two judgements of rationality to be made. Furthermore, if Sally is told about both bets in advance (this is unclear) she is effectively making only one decision. She can decide about both bets at the same time, since nothing will change after the first bet, and we can simply judge the rationality of that one (combined) strategy.
A better version of SEQUENCE (call it SEQUENCE-2) would be this: It is rationally impermissible for Sally to reject both bets if she is told in advance that she will be offered both bets, even though it may be rationally permissible for her to reject each bet taken in isolation. There are three judgements of rationality to be made here, because the individual bets are being judged in isolation, not as part of a sequence. But Elga's argument is not applicable to SEQUENCE-2.
Yeah, I agree with your criticism of Elga here.
It's interesting how our intuitions change if we slightly alter the scenario. Suppose that the first thing that you pull out of the bearded man's sack is some illegal substance like a packet of methamphetamine. The second is a jellyfish and the third is a smaller packet of methamphetamine. Now, what is your level of credence that the next object pulled out of the bag is a packet of methamphetamine?
Pretty high. The guy is carrying meth. The jellyfish is just for the police to pet.
The only difference from the original scenario is that you're doing the pulling instead of the bearded man, so you know your own motives; and you have some idea of the situations in which you might find illegal substances and jellyfish in a bearded man's sack, while you have no idea in what situations you'd find toothpaste and jellyfish there. So our search space is narrowed down.
The toothpaste vs. jellyfish bet is fairly interesting.
What will people do if we rephrase the problem as:
"the guy pulls [...] how much will you pay to see the next object?"
In this case it is equivalent to a lost bet either way, but people will probably pay a few bucks for the sake of curiosity.
If you compare the answers from two samples, i) a bet sample and ii) a curiosity sample, the result will probably be amusing. My guess is that many more people in the second group will be happy to open their wallets, even if the economic reward is clearly less.
Posing the problem in the form of a bet or in the form of a price on curiosity will probably shift attention from mere economic reward to personal satisfaction, with the latter generally priced higher than an uncertain bet.