Friday, July 12, 2013

Betting: Using Psychology and Economics To Tax Bullsh*t

SUMMARY: People bet each other when they have different assessments of a belief. A bet is a tax on B.S. because it pins consequences onto a belief that was previously not that important, or that was protected from testing, so that being right suddenly matters much more; hence "putting your money where your mouth is". This is of obvious importance to rationalists. When you ask someone to enter into a bet - about anything - you find out what they really believe. The most interesting behavior arises when would-be bettors hold incoherent beliefs that are exposed as having no effect on how they act, or when their stated beliefs differ from their actions. There are interesting concrete examples of how the tax on B.S. has been levied in real-world situations. People can avoid the B.S. tax by refusing to bet, but they can't avoid the loss of status that comes from refusing to make predictions about something they claim to believe passionately.

Bets are a tax on B.S. To see why, it helps to understand why they can change our behavior or at least our stated beliefs. I was inspired to write this by a great series of online discussions by economists and political scientists at George Mason and Colorado (links below).

We humans "lower our standards" for having true beliefs when there's little consequence of having false beliefs. Another way of saying that is that making sure a belief is correct actually has a cost - in terms of gathering more information and critical thinking - and if the belief doesn't matter that much, it's not worth gathering much more information or thinking much more about it.

Concrete examples are useful here. Are you 100% sure that the way you drive from home to work is the fastest way, at the times you commute? Eh, probably, but even if the way you normally go isn't the fastest, the routes are unlikely to differ by more than a minute or so; probably not worth the extra hassle. Another example: you're playing a friendly game of bar trivia and you're asked to name all species of rattlesnakes that occur naturally in the San Francisco Bay area. You're reasonably sure you once read that there's only one, the northern Pacific rattlesnake, Crotalus oreganus. Now suppose instead that you're on a hiking trail and have just been bitten by a rattlesnake, and suddenly you're wondering whether you really are certain about the species distribution - or whether you should kill the snake and take it to the emergency department with you so they can identify it and give you the right antivenin. If you think that's an obscure example, you should know that exactly this happened to me, and at the time I realized what a great example it was of this principle: the marginal value of certainty in a belief changes based on the consequences of that belief's being true or false.
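To put rough numbers on that principle (a toy sketch in Python; every figure below is invented for illustration and not from anything in this post): the value of double-checking a belief is roughly the probability you're wrong times the cost of acting on the wrong belief, and that is what you weigh against the cost of checking.

# Toy value-of-information calculation. All numbers are hypothetical.
# Checking a belief pays off when p(wrong) * cost_if_wrong exceeds the cost of checking.

def value_of_checking(p_wrong, cost_if_wrong):
    """Expected loss avoided by verifying the belief before acting on it."""
    return p_wrong * cost_if_wrong

# Same uncertainty, very different stakes (arbitrary cost units; the ratio is the point).
commute = value_of_checking(p_wrong=0.2, cost_if_wrong=1)         # wrong route costs about a minute
snakebite = value_of_checking(p_wrong=0.2, cost_if_wrong=10_000)  # a wrong species call could be catastrophic
print(commute, snakebite)  # 0.2 vs 2000.0: the snake is worth ~10,000x more verification effort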

And to be clear: choosing whether to trade your limited resources for more certainty in a belief, based on that belief's importance, would actually be rational and honest - if not giving a proposition much thought led us to declare "I don't know, I haven't given it much thought." (Unfortunately humans aren't good at recognizing and admitting that we don't know something.) And in fact this is (correctly) how we allocate resources for science, in both the private and public sectors. Science is a process where we change certainty in beliefs by contriving situations whose outcomes differ as dramatically as possible depending on whether some specific belief is true. Contriving those situations usually takes time and money, both of which are limited. Because time and money are limited but the beliefs we could test are unlimited, we focus on the beliefs deemed most likely to have some consequence in terms of money, well-being, or our general ability to determine which beliefs are worth testing. (To this end: every day it seems someone asks me about the cause of some comically minor possible medical complaint, to which I reply that to my knowledge there's no research into it; and when they demand "Why not?" I say "Because people are still dying of heart disease, cancer and Alzheimer's, that's why.")

But that's not the only bad influence on the accuracy of our beliefs. For one thing, we often hold beliefs that are actually incoherent in terms of translating into action. Even when pressed, we can't give an example of how the belief would ever cause us to act differently from someone who holds the opposing belief. Such a belief-holder might nevertheless insist that the belief is very important, out of all proportion to its ever seeming to make a difference. These "symbolic" beliefs may well serve as a sort of tribal loyalty signal. In this light, it's very interesting that Christians and atheists in the U.S. give almost identical answers to moral questions. This highlights a second service provided by bets, besides the B.S. tax itself: bets force us to define and agree on a real-world outcome in concrete terms.

We also damage the accuracy of our beliefs by protecting them - by deliberately avoiding situations that dramatically highlight gaps between prediction and reality, both in ourselves and in those around us. This spares us having to update cherished beliefs or offend others. It is also the exact opposite of science! Sam Harris observes the astounding difference in religiosity between scientists and the general population and concludes simply, "There is something profoundly hostile to religion in scientific thinking." (Add to that: Francis Crick said the number one enemy of effective collaboration in science is politeness.)

The last of the unfortunately numerous bad influences on the truth of our beliefs that I'll include here is that what we say we believe and how we actually act in the situations those beliefs describe are often very different. This is the distinction between epistemic rationality (whether your stated beliefs are true) and instrumental rationality (whether the way you actually behave achieves your goals). And most of us are loath to admit that what we say and what we do are different. For instance, it's pretty easy to find young-Earth creationists receiving medical care from evolution-steeped medical doctors, or socking away retirement money in stocks of oil companies that look for deposits based on the idea that the world is billions of years old, or paying insurance premiums to companies that do not include prayer in their actuarial tables. It's not easy to find young-Earth creationists who like to talk about this gap between their beliefs and actions.

Which brings us back to betting. People bet each other when they have different assessments about a belief: the outcome of a game, the contents of someone else's hand of cards, or any other future event. The bet is a tax on B.S. because consequences are now pinned onto that previously not-that-important belief, and being right is suddenly much more important. The bet also requires a concrete prediction agreed upon by all parties, so if the belief in question is incoherent, this becomes obvious. Finally, if what the bettor says they believe and how they actually act are in opposition, this also becomes obvious; the bettor must either endure a loss for the sake of preserving the appearance of their stated beliefs, or admit that their beliefs and actions are not aligned, or just refuse the bet. Because it's unpleasant to knowingly take a losing bet (and have your belief-action gap glaringly highlighted), and also unpleasant to admit such a gap, people in these situations usually just avoid the bet. A toy sketch of the underlying arithmetic follows, and then I'll close with some concrete situations, some of which have played out very publicly.
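Here is the mechanism in miniature (the stakes and probabilities are made up, not drawn from any example in this post): accepting a bet at given odds only has positive expected value if your true subjective probability clears a break-even threshold, so the odds someone will and won't accept bracket what they actually believe.

# Toy illustration of how offered odds reveal subjective probability. Numbers are invented.
def expected_value(p_true, stake, payout_if_right):
    """EV of accepting a bet: win payout_if_right with probability p_true, lose stake otherwise."""
    return p_true * payout_if_right - (1 - p_true) * stake

# Someone claims "my team is 90% sure to win" but declines to risk $100 to win $50,
# a bet that's positive EV for anyone whose real belief is above 2/3.
print(round(expected_value(p_true=0.90, stake=100, payout_if_right=50), 2))  # +35.0: a sincere 90% should take it
print(round(expected_value(p_true=0.50, stake=100, payout_if_right=50), 2))  # -25.0: declining suggests p is below 2/3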

Low-consequence bet: someone buys into their office March Madness pool for $5. They don't know anything about basketball but they're not motivated to improve their picks because it's only $5. There's not much B.S. here, and the tax is low.

An "honest", consistent false belief: you probably know a rabidly loyal sports fan who has made an over-optimistic bet about the performance of their team. They're "honest" in the sense that their stated beliefs and actions match. This person has a false belief ("my team is better") and they act on it. The B.S. tax may be large, as may be the humiliation of having your belief publicly dismantled. A historical example: Alfred Russell Wallace entered into a bet to (re-)prove the curvature of the Earth to a Flat Earther named John Hampden, and acquitted himself famously in this endeavor. This proved too much for Hampden, who spent the rest of his life in litigation for threats and libel he made against Wallace, so unbalanced was he by the experience: "Mrs. Wallace, Madam, if your infernal thief of a husband is brought home some day on a hurdle, with every bone in his head smashed to pulp, you will know the reason. Do tell him from me he is a lying infernal thief, and as sure as his name is Wallace he never dies in his bed." Cranks never change!

An incoherent belief: in contrast to people with "honest" false beliefs, people generally avoid betting on these, because the difficulty of making concrete predictions exposes the belief's incoherence, and an openly nonsensical belief can't do its job as a tribal loyalty signal. Gay marriage opponents speak in vague terms about the degradation of marriage, but are unable to offer concrete predictions (or offer bizarre non-sequiturs). In 2009 Steve Chapman of the Chicago Tribune challenged social conservative public figures to give him concrete predictions (if not a bet) of what would happen as a result of gay marriage. All but one refused - because they didn't want to expose the incoherence of their claimed belief. The one who didn't refuse was the head of the National Organization for Marriage, and the list she gave was essentially circular, i.e. "if gay marriage is legalized, then gay marriage will be legal". The B.S. tax may be large, but it is more likely avoided entirely - at the cost of a large tax on the claimant's credibility.

A gap between stated beliefs and actions: again, people generally avoid betting on these, because to do so they either have to take the loss and be reminded of their inconsistency (unpleasant), or admit there's a gap and update (also unpleasant). Running away is the least painful option. As an example, in the build-up to both the Harold Camping "Campocalypse" and the 2012 date, I looked online (hard!) for people who really believed what they said they did - that the world was going to end. My "bet" was this: I would give them US$1000, and they had to pay me back with 400% interest on the day after their end-date. If they were right, that's free money for them! Needless to say, no one ever took me up on it.
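For the arithmetic behind that offer (a sketch; I'm assuming the 400% interest is on top of the US$1000 principal, i.e. US$5000 due the day after the predicted end, which the terms above don't spell out): anyone who genuinely put better than about 80% odds on doomsday should have jumped at it.

# Toy expected value of the doomsday loan, from the believer's point of view.
# Assumed terms: receive $1000 now; owe $1000 principal + 400% interest = $5000 the day after doomsday.
def ev_for_believer(p_world_ends, loan=1000, repayment=5000):
    """If the world ends, the believer keeps the loan; otherwise they must repay."""
    return p_world_ends * loan + (1 - p_world_ends) * (loan - repayment)

print(round(ev_for_believer(1.0), 2))   # +1000.0: free money for a sincere believer
print(round(ev_for_believer(0.8), 2))   # 0.0: the break-even level of belief
print(round(ev_for_believer(0.0), 2))   # -4000.0: what universal refusal suggests they actually expected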


This post was inspired by an online debate among economists about the relationship of betting to beliefs, and other related points about betting emerged there. First, to be useful, people's beliefs must make a difference even without being artificially pinned to money (there's already a penalty for having false beliefs, just not an obvious, immediate one); second, there is an effect of others knowing the beliefs you profess, sometimes far in excess of any financial gains or losses resulting from bets[1]. It's unpleasant to be shown that your belief was false, and you lose status in front of other people. For this reason, people often don't like bets of this sort, because betting is confrontational; it "outs" bullsh*tters. (Seriously, read the article at that link, it's excellent.) Once someone is challenged to a bet, or even just to make a concrete prediction, there's no way out without someone losing status. The challengee either has to accept (and risk losing the bet, or losing status from making a bad prediction), or has to refuse the bet and lose status for refusing to make predictions about something they claim to believe passionately. This may be why bullsh*tters often avoid bets by suddenly remembering a strong moral aversion to wagering, rather than just saying they don't want to bet. In those situations, you can charitably say "You know, I can't tell for sure that you're dishonestly looking for a way to avoid putting your money where your mouth is, but I can tell for sure that you're behaving exactly the same as someone who is."

Yes, I guess that's confrontational, but I'm okay with confronting bullsh*tters. As a final remark, on issues of passionately-held and protected beliefs where word and deed conflict, or incoherent tribal-loyalty beliefs, you're probably not going to change someone's mind by confronting them this way or beating them in a bet. But you'll give fence-sitters something to think about.


Other great posts:
A Bet Is a Tax on Bullsh*t; Bets and Beliefs (Alex Tabarrok, Economist, George Mason)
Bets Argue (Robin Hanson, Economist, George Mason)
Suspecting Truth Hiders (Robin Hanson, Economist, George Mason)
Why Religious Beliefs Are Irrational, and Why Economists Should Care (Bryan Caplan, Economist, George Mason)
Why People Are Irrational About Politics (Michael Huemer, Philosopher, U Colorado)

Below is a TED talk by Huemer, where he gives this useful rule for detecting bullsh*t in yourself: "If you think that the community of experts on a subject is wrong - and especially if you think that while being unable to state their arguments - then you're almost certainly the one who is wrong."



[1]A striking recent example from the now-shut-down prediction market Intrade, during the 2012 U.S. presidential election, illustrates that the value of bets goes beyond their monetary value: a bettor chose to continually take losses in order to send a signal that was presumably worth more to them than the money. (A prediction market is basically a stock exchange of many people making bets, which amalgamates their predictions, usually into one amazingly accurate forecast.) In the days before the election, when everybody but FOX pundits had called it for Obama, this market still only had Obama in the 70% range - in fact well into election night, when Ohio and Virginia had already been called and Obama was almost a lock. As it turned out, in those final days there was one buyer continuously making bets on Romney, out of all proportion to others' predicted outcomes, and this kept Intrade's overall prediction tilted more in Romney's favor than it should have been. If this bettor believed Romney would win, then this was just an honest false belief and they paid a B.S. tax for it. My guess is that this bettor (using campaign funds?) was trying to manipulate the market to keep Romney's apparent chances high, and that the signal from Intrade's prediction was worth more to them than the money they lost. It has been speculated that Intrade, which had been operating for years, was suddenly pursued by U.S. regulators after the 2012 election because of people who didn't like the way it called bullsh*t on their beliefs.
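Intrade matched individual buyers and sellers rather than using an automated market maker, but a toy market maker (Hanson's logarithmic market scoring rule, fitting given the links above) gives the flavor of both halves of this footnote: how bets aggregate into a single probability, and what it costs one stubborn buyer to hold the price away from everyone else's beliefs. Every number below is invented for illustration.

import math

# Toy prediction market using Hanson's logarithmic market scoring rule (LMSR).
# Intrade used an order book, not an LMSR, so this illustrates the idea rather than modeling Intrade.
B = 100.0                            # liquidity parameter: higher makes the price harder to move
q = {"ROMNEY": 0.0, "OBAMA": 0.0}    # shares of each outcome sold so far

def cost(q):
    """LMSR cost function; the price of a trade is the change in this value."""
    return B * math.log(sum(math.exp(v / B) for v in q.values()))

def price(q, outcome):
    """Market-implied probability of an outcome."""
    total = sum(math.exp(v / B) for v in q.values())
    return math.exp(q[outcome] / B) / total

def buy(q, outcome, shares):
    """Buy shares of an outcome (each pays $1 if it happens); returns the dollar cost."""
    before = cost(q)
    q[outcome] += shares
    return cost(q) - before

# The crowd buys Obama, pushing his implied probability toward ~90%.
crowd_cost = buy(q, "OBAMA", 220)
print(round(price(q, "OBAMA"), 2))       # ~0.9

# One persistent buyer keeps buying Romney, dragging Obama back down to the 70% range.
manipulator_cost = buy(q, "ROMNEY", 135)
print(round(price(q, "OBAMA"), 2))       # ~0.7
print(round(price(q, "ROMNEY"), 2))      # ~0.3
print(round(manipulator_cost, 2))        # the up-front cost of holding the price there

# If Romney loses, those shares pay nothing and the whole cost is forfeit: the B.S. tax.

And the toy version understates the cost: every time the rest of the market pushes the price back, the manipulator has to buy again, so the tax keeps accruing right up until the shares expire worthless.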
