About Rationally Speaking
Rationally Speaking is a blog maintained by Prof. Massimo Pigliucci, a philosopher at the City University of New York. The blog reflects the Enlightenment figure Marquis de Condorcet's idea of what a public intellectual (yes, we know, that's such a bad word) ought to be: someone who devotes himself to "the tracking down of prejudices in the hiding places where priests, the schools, the government, and all long-established institutions had gathered and protected them." You're welcome. Please note that the contents of this blog can be reprinted under the standard Creative Commons license.
Wednesday, January 27, 2010
Is Google making us less rational?
The term "confirmation bias" was first used in a classic 1960 paper by P.C. Wason called "On the failure to eliminate hypotheses in a conceptual task." Subjects were given a sequence of numbers, "2,4,6," and told that the numbers were picked according to a rule; the goal was for the subjects to guess that rule. They could test their guesses by suggesting other sequences, and the experimenter would answer "Yes, that fits the rule" or "No, that doesn't fit the rule."
Most subjects started out by hypothesizing rules like "Even numbers increasing by two." They then tested their hypothesized rule by asking the experimenter about sequences that would conform to it, like "8, 10, 12?" The experimenter would truthfully respond, "Yes, that fits the rule," and the subjects would become more and more confident that their guess had been correct.
In fact, the correct rule was simply "Increasing numbers." That is, any increasing sequence of three numbers would have worked. To discover that their original hypothesis was wrong, subjects would have had to test it with sequences that, according to their hypothesis, should have violated the rule -- for example, "2,4,5." The experimenter would have responded "Yes, that fits the rule," and the subjects would then have known their hypothesis couldn't be right. Instead, they kept testing their hypothesis with sequences that did fit it, and kept getting affirmative replies, until they felt confident enough to announce that they'd figured out the rule -- only to discover that they'd been barking up the wrong tree all along.
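The logic of the experiment is simple enough to sketch in a few lines of code. The two rules below are the ones described above (the true rule and the typical subject's guess); the specific probe sequences are illustrative.

```python
def true_rule(seq):
    """The experimenter's actual rule: any strictly increasing triple."""
    return seq[0] < seq[1] < seq[2]

def hypothesis(seq):
    """The subject's guess: even numbers increasing by two."""
    return (all(n % 2 == 0 for n in seq)
            and seq[1] - seq[0] == 2
            and seq[2] - seq[1] == 2)

# Confirmatory probes: sequences the hypothesis predicts will fit.
# Every one draws a "Yes," so confidence grows -- but nothing is learned.
for seq in [(8, 10, 12), (20, 22, 24), (100, 102, 104)]:
    assert hypothesis(seq) and true_rule(seq)

# A disconfirmatory probe: a sequence the hypothesis predicts will NOT fit.
probe = (2, 4, 5)
assert not hypothesis(probe)  # the subject expects a "No"...
assert true_rule(probe)       # ...but the experimenter says "Yes,"
                              # so the hypothesis must be wrong.
```

The point the code makes concrete: confirmatory probes can succeed forever without distinguishing the guess from the true rule; only a probe the hypothesis forbids can expose the error.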
Reading about this phenomenon, it struck me that online search engines like Google are, by their very nature, rich with potential for inadvertent real-world replications of Wason's experiment. Here's a personal case-in-point: I was having problems with my Dell netbook recently, and I was curious if it was a systemic problem with the brand. So I Googled Dell netbook cursor freeze, and found thousands of pages of people complaining of the same problem. Bingo! From then on, when someone asked me if I was happy with my computer, I would warn them not to get a Dell netbook because they have problems with their cursors.
But what if that wasn't the right hypothesis? Today, I tried Googling Acer netbook cursor freeze -- and got roughly the same number of hits. (Well, actually, I got half as many hits, but if you normalize by dividing by the baseline number of hits for just "Dell" and "Acer," respectively, that more than makes up for the discrepancy.) Point is: I had a hypothesis (i.e., that my cursor froze because of a problem with Dell netbooks), I looked for evidence confirming that hypothesis, and promptly stopped searching after I found it, without proceeding to see if I could find any disconfirmatory evidence. The true hypothesis might be that my cursor froze because of a problem with netbooks in general, or maybe with touchpads, but I didn't find out because I stopped looking after I found evidence to support my original theory.
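The normalization step is worth spelling out. The post doesn't report the actual hit counts, so the numbers below are invented purely to show the arithmetic: Acer draws half the raw complaint hits, but is a much smaller brand overall, so its complaint *rate* comes out higher.

```python
# Hypothetical hit counts -- illustrative only, not the real figures.
dell_problem_hits = 40_000    # hits for "Dell netbook cursor freeze"
acer_problem_hits = 20_000    # half as many raw hits
dell_baseline = 400_000_000   # hits for just "Dell"
acer_baseline = 100_000_000   # Acer's overall web footprint is smaller

# Normalize: complaints per unit of overall web presence.
dell_rate = dell_problem_hits / dell_baseline
acer_rate = acer_problem_hits / acer_baseline

# With these numbers, Acer's normalized complaint rate is higher,
# so the raw counts alone don't implicate Dell specifically.
assert acer_rate > dell_rate
```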
Even the way you phrase a search query can make a huge difference in the type of results you get, making you more likely to find the evidence you already expected to find. Another real-world case-in-point: A few weeks ago, I met a guy who insisted that feminism was invented by the CIA in a plot to control the world. Hard to imagine how someone could possibly believe that, right? But let's assume this fellow heard or read this theory about feminism somewhere, and he wanted to check its veracity. He might very well go to Google and search feminism cia. And if you do that, every single result on the first page of search results touts the feminism-CIA link, complete with elaboration and supporting "facts."
Of course, other search queries would give you very different results. A neutral query like feminism origin yields not a single mention of the CIA on the first page of results. And a disconfirmatory query like feminism cia debunk produces plenty of opposing viewpoints. But we just don't think to try those neutral and disconfirmatory queries -- we seem to be wired to search for confirmatory evidence, then close the case file.
And I predict that the problem is about to get more severe. On December 4th, Google announced that their page rank algorithm -- the formula that determines which search results you get in response to a query, and the order in which they appear -- will from now on be partly dependent on your personal past search history. Their blog explained, "Now when you search using Google, we will be able to better provide you with the most relevant results possible. For example, since I always search for [recipes] and often click on results from epicurious.com, Google might rank epicurious.com higher on the results page the next time I look for recipes." As SEED magazine's Evan Lerner perceptively noted last month, this can't help but amplify the confirmation bias effect. Like everyone else, I love Google's increasingly eerie, near-telepathic ability to know just what I'm looking for. But what I'm looking for isn't necessarily what I should always find.
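To see why history-based ranking feeds the bias, consider a toy re-ranker. This is not Google's actual algorithm -- the scoring scheme, the boost factor, and the example data are all invented for illustration -- but it captures the mechanism the announcement describes: results from domains you clicked before get nudged upward.

```python
from collections import Counter

def rerank(results, click_history, boost=0.5):
    """Toy personalization: lift results from previously clicked domains.

    results: list of (domain, base_relevance) pairs.
    click_history: iterable of domains the user clicked in the past.
    """
    clicks = Counter(click_history)
    return sorted(results,
                  key=lambda r: r[1] + boost * clicks[r[0]],
                  reverse=True)

# epicurious.com starts slightly *below* the competition on base relevance...
results = [("allrecipes.com", 1.0), ("epicurious.com", 0.9)]
history = ["epicurious.com", "epicurious.com"]

# ...but two past clicks push it to the top of the personalized ranking.
print(rerank(results, history)[0][0])  # epicurious.com
```

The feedback loop is the worry: whatever you clicked when testing your last hypothesis is exactly what ranks highest when you test the next one, so the confirmatory sources keep getting easier to find.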