About Rationally Speaking

Rationally Speaking is a blog maintained by Prof. Massimo Pigliucci, a philosopher at the City University of New York. The blog reflects the Enlightenment figure Marquis de Condorcet's idea of what a public intellectual (yes, we know, that's such a bad word) ought to be: someone who devotes himself to "the tracking down of prejudices in the hiding places where priests, the schools, the government, and all long-established institutions had gathered and protected them." You're welcome. Please notice that the contents of this blog can be reprinted under the standard Creative Commons license.

Wednesday, January 27, 2010

Is Google making us less rational?

Google may be making us all more knowledgeable, but could it also be making us less rational? I've got a suspicion that online search engines are making us especially susceptible to at least one particular blunder: confirmation bias, the phenomenon by which you're more likely to seek out, notice, and remember evidence that supports what you already believe.

The term "confirmation bias" was first used in a classic 1960 paper by P. C. Wason, "On the failure to eliminate hypotheses in a conceptual task." Subjects were given a sequence of numbers, "2, 4, 6," and told that the numbers had been picked according to a rule; the goal was for the subjects to guess that rule. They could test their guesses by suggesting other sequences, and the experimenter would answer "Yes, that fits the rule" or "No, that doesn't fit the rule."

Most subjects started out by hypothesizing rules like "Even numbers increasing by two." They then tested their hypothesized rule by asking the experimenter about sequences that would conform to it, like "8, 10, 12?" The experimenter would truthfully respond, "Yes, that fits the rule," and the subjects would become more and more confident that their guess had been correct.

In fact, the correct rule was simply "Increasing numbers." That is, any increasing sequence of three numbers would have worked. In order to discover that their original hypothesis was wrong, subjects would have had to test it by asking about sequences that their hypothesis would have predicted would violate the rule, for example, "2,4,5." The experimenter would have responded "Yes, that fits the rule," and the subjects would then have known their hypothesis couldn't be right. Instead, they kept testing their hypothesis with sequences that did fit it, and kept getting affirmative replies, until they felt confident enough to announce that they'd figured out the rule -- only to discover that they'd been barking up the wrong tree all along.
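The logic of the task can be sketched in a few lines of code. This is only an illustration of the setup described above; the function names and the particular probe sequences are mine, not Wason's.

```python
# A minimal sketch of Wason's 2-4-6 task. The hidden rule and the
# subject's first hypothesis are the ones described in the text.

def hidden_rule(seq):
    """The experimenter's actual rule: any strictly increasing triple."""
    return seq[0] < seq[1] < seq[2]

def subjects_hypothesis(seq):
    """A typical first guess: even numbers increasing by two."""
    return (all(n % 2 == 0 for n in seq)
            and seq[1] - seq[0] == 2
            and seq[2] - seq[1] == 2)

# Confirmatory probes: sequences the hypothesis predicts will pass.
# Each one gets a "Yes, that fits the rule" -- which tells you nothing.
for probe in [(8, 10, 12), (20, 22, 24)]:
    assert subjects_hypothesis(probe) and hidden_rule(probe)

# A disconfirmatory probe: the hypothesis predicts this will FAIL...
probe = (2, 4, 5)
assert not subjects_hypothesis(probe)
# ...but the experimenter says "Yes, that fits" -- hypothesis refuted.
assert hidden_rule(probe)
```

Only the disconfirmatory probe carries information here: a "yes" to (2, 4, 5) is the single answer that forces the subject to abandon the even-numbers hypothesis.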

Reading about this phenomenon, it struck me that online search engines like Google are, by their very nature, rich with potential for inadvertent real-world replications of Wason's experiment. Here's a personal case in point: I was having problems with my Dell netbook recently, and I was curious whether it was a systemic problem with the brand. So I Googled "Dell netbook cursor freeze" and found thousands of pages of people complaining of the same problem. Bingo! From then on, when someone asked me if I was happy with my computer, I would warn them not to get a Dell netbook because they have problems with their cursors.

But what if that wasn't the right hypothesis? Today, I tried Googling "Acer netbook cursor freeze" -- and got roughly the same number of hits. (Well, actually, I got half as many hits, but if you normalize by dividing by the baseline number of hits for just "Dell" and "Acer," respectively, that more than makes up for the discrepancy.) The point is: I had a hypothesis (that my cursor froze because of a problem with Dell netbooks), I looked for evidence confirming that hypothesis, and I promptly stopped searching after I found it, without proceeding to see if I could find any disconfirmatory evidence. The true hypothesis might be that my cursor froze because of a problem with netbooks in general, or maybe with touchpads, but I never found out, because I stopped looking once I found evidence to support my original theory.
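The normalization step is worth spelling out, since raw hit counts are exactly the kind of confirmatory evidence that misleads. The numbers below are hypothetical stand-ins (the post doesn't report exact figures); only the arithmetic is the point.

```python
# Hypothetical hit counts -- illustrative only, not the real figures.
dell_problem_hits = 100_000    # hits for "Dell netbook cursor freeze"
acer_problem_hits = 50_000     # hits for "Acer netbook cursor freeze"
dell_baseline = 200_000_000    # hits for just "Dell"
acer_baseline = 40_000_000     # hits for just "Acer"

# Complaints per unit of overall web presence for each brand:
dell_rate = dell_problem_hits / dell_baseline   # 0.0005
acer_rate = acer_problem_hits / acer_baseline   # 0.00125

# Acer has half the raw complaint hits, but once you normalize by
# each brand's baseline visibility, its complaint rate is higher.
assert acer_problem_hits < dell_problem_hits
assert acer_rate > dell_rate
```

The raw counts alone would "confirm" the Dell hypothesis; the normalized rates point the other way.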

Even the way you phrase a search query can make a huge difference in the type of results you get, making you more likely to find the evidence you already expected to find. Another real-world case in point: a few weeks ago, I met a guy who insisted that feminism was invented by the CIA in a plot to control the world. Hard to imagine how someone could possibly believe that, right? But let's assume this fellow heard or read this theory about feminism somewhere and wanted to check its veracity. He might very well go to Google and search for "feminism cia." And if you do that, every single result on the first page touts the feminism-CIA link, complete with elaboration and supporting "facts."

Of course, other search queries would give you very different results. A neutral query like "feminism origin" yields not a single mention of the CIA on the first page of results. And a disconfirmatory query like "feminism cia debunk" produces plenty of opposing viewpoints. But we just don't think to try those neutral and disconfirmatory queries -- we seem to be wired to search for confirmatory evidence, then close the case file.

And I predict that the problem is about to get more severe. On December 4th, Google announced that its ranking algorithm -- the formula that determines which search results you get in response to a query, and the order in which they appear -- will from now on depend partly on your personal search history. Their blog explained, "Now when you search using Google, we will be able to better provide you with the most relevant results possible. For example, since I always search for [recipes] and often click on results from epicurious.com, Google might rank epicurious.com higher on the results page the next time I look for recipes." As Seed magazine's Evan Lerner perceptively noted last month, this can't help but amplify the confirmation bias effect. Like everyone else, I love Google's increasingly eerie, near-telepathic ability to know just what I was looking for. But what I'm looking for isn't necessarily what I should always find.
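To see why history-based personalization feeds the bias, consider a toy re-ranking function. This is emphatically not Google's actual algorithm (which is unpublished); it's a minimal sketch of the feedback loop their own epicurious.com example describes, with made-up scores and a made-up boost parameter.

```python
# A toy sketch of history-based personalization: domains you clicked
# before get a score bonus the next time they appear in results.
from collections import Counter

def rerank(results, history, boost=0.1):
    """Re-order (domain, base_score) pairs: base relevance score plus
    a bonus proportional to the user's past clicks on that domain."""
    return sorted(results,
                  key=lambda r: r[1] + boost * history[r[0]],
                  reverse=True)

results = [("epicurious.com", 0.80), ("allrecipes.com", 0.82)]

# A fresh user sees the results in plain relevance order.
fresh = rerank(results, Counter())          # allrecipes.com first

# After five past clicks on epicurious.com, it jumps to the top,
# even though its base relevance score is lower.
history = Counter({"epicurious.com": 5})
personalized = rerank(results, history)     # epicurious.com first
```

The loop is self-reinforcing: whatever you clicked before ranks higher, so you click it again, so it ranks higher still -- a mechanical analogue of seeking only confirmatory evidence.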


  1. This comment has been removed by the author.

  2. I just finished reading Massimo's Philosophy Now column explaining how scientists should rather ask questions instead of testing hypotheses.

    It seems to me that the same may apply to Google; after all, it is a (re)search engine. Thus one should try to use Google to gather data (asking questions) instead of testing hypotheses.

    It seems that the example of feminism and the CIA illustrates it. When you asked the question (what is the origin of feminism?) you got the answer you were looking for, when you tested the hypothesis (feminism was invented by the CIA) you got the nonsense.

  3. Is Google making us less rational? No. As you point out, it can be used just as easily for disconfirmatory queries as confirmatory ones. The responsibility still lies with the individual to not "close the case file" prematurely.

    . . .

    As for the personalized search feature: It's rather dangerous to speculate on the actual behavior of an algorithm whose official announcement repeatedly uses the word "might". It means that there's going to be factor-weighting, checks-&-balances, accommodations for special cases, and who-knows-what going on in the code that could make the actual outcome quite different from whatever strawman one assembles from reading a blurb. That said, I'm going to do it anyway:

    In SEED, Lerner writes that the gist of the feature is "that results users actually click on after a particular search will be ranked higher in subsequent searches with similar terms." This is basically for convenience. Often, searching again is the fastest way to get back to something I read before and want to review. Google doesn't increase my confirmation bias with this feature any more than my brain's ability to retrieve memories does.

    . . .

    One important thing to keep in mind about Google is that many people use it instead of bookmarks or URL's. For example, one of the most common Google searches is simply "facebook". (You can use Google Trends to compare the relative frequency of search terms.) In this case, "But what I'm looking for isn't necessarily what I should always find" simply isn't true. Google's primary responsibility is to provide a good search experience that meets its users' expectations, not maximize the number of challenges to its users' existing beliefs.

  4. Good point, Julia. I know that google can be a quick fix or shortcut to understand some things. But it can mislead if you don't already have general understanding of the issues that you are seeking to gain insight into.

    My husband tries to get people not to approach the Bible that same way. An older man he knows has a particular belief about how America ought to fit into the Bible. He wrote a book about it and asked my husband to read it last week. My husband recognizes that same error in this man's writing.

    Taking one's ideas to the Bible or Google and making 'em confirm it - neither is good.

  5. I am guilty of the "debunk X" search on a regular basis.

  6. Massimo,
    your hypothesis when searching Google for a solution to your netbook problem was, I think, that others were having that problem and there was a way to solve it, not that Dell netbooks are inferior. So, hypothesis (there's a solution), evidence (search results), testing of evidence (did any of the solutions you found in the search results work consistently?), resolution of the problem (hopefully). All very rational.

    In regard to the CIA/feminism link, that use of Google is comparable to poking one's head into a room that contains a large portion of the world and yelling, "Hey, I think I just talked to a crazy person. Anyone else in here crazy?"

    The more neutral query you mentioned is indicative of critical thinking skills, and something that needs to be taught in schools, starting at the elementary level. Google can only tell you what other people are saying, it can't teach you how to think or tell you what the right questions are, or who you should be asking. Perhaps Google might mention this in one of their blog posts? Of course, not everyone who uses the search engine will read the blog. A banner on the main page in large, red letters, "Use of this software may cause confirmation bias, selective thinking, bigotry, or a feeling of smug certainty in complete idiots," might be a bit too cynical.

    I agree that Google's "Personalized Search" has the potential to promote confirmation bias among those who don't possess critical thinking skills, or are not particularly tech savvy. I don't enable web history when I'm signed into Google, and I have my browser set to delete all cookies upon closing, so Personalized Search won't have access to previous sessions' searches.

    Unfortunately, I can't blame any of my own selective thinking or biases on a bad batch of cookies.

  7. I agree with you about the tendency toward confirmation bias and Google, Julia. But I think you miss that it could be confirmation bias plus anything that facilitates it.

    Technology has simply served to highlight this weakness...there's nothing new here. At least until other search engines get in on the game and start colluding with the user in group think.

  8. " On the failure to eliminate hypotheses in a conceptual task."

    Well, that kind of leads me to say that, "that Popper guy had the right idea."

    But, if I accept an "observe and model" view, I am compelled to say that, "Dell has a real problem that needs to be fixed." Regardless of other manufacturers having the same problem or not.

    They both seem rational to me. Is there a philosophical compromise that can simultaneously accommodate both of these positions?

  9. In fact, the correct rule was simply "Increasing numbers."

    I'm not sure this is quite a fair test. When people are told there's a rule for something like this, they generally assume you mean a deterministic rule. "Increasing numbers" certainly isn't deterministic.

  10. As with all these discussions around the idea that the public is getting less educated, more irrational, more prone to media manipulation or whatever, I ask myself whether it was really so much better in ye olden times. At least today we do have access to a lot of information that would never have come our way thirty, let alone three hundred years ago.

    I have not read Massimo's column, but the way I see it, you always first ask a question, and then you formulate a hypothesis plus null hypothesis to put the same question in a way that makes it easier to see how your hunch for the correct answer to the question can be tested. Or to put it another way, formulating hypotheses is a crucial step to avoid confirmation bias - with the null hypothesis, you ask yourself: how would I know if my preconceived answer is wrong? (Which is precisely the question the religious believers never ask themselves.)

    So in my eyes, we scientists do ask questions all the time already, and getting rid of the hypothesis concept would serve us ill.

  11. This might interest some. It's a conversation between Michael Specter (author of, Denialism - http://www.michaelspecter.com/) and Chris Mooney. They touch a little on the internet and confirmation bias. Chris brings out the idea he's covered before on the internet, the "cluster effect".


    However, this blogpost seems more centered around individuals. The "google" spoken about, in my mind can almost be used as metaphor, as others have done in a more positive light.

    Michael Shermer recently (the edge question 2010) mentions how the internet provides an equal intellectual playing field.

    I think all these things are real to certain extents. Plus, the ability of self education is usually good, even when wrong at times.

    I also think the real problems (which relate to Chris' "cluster effect" and of course confirmation bias - which we engage in, and which is "automatically" done for us now if we wish) are well underway with much of the discourse taking place today. Not to say positives aren't important, like Michael's note. However, a combination of factors leads to something more grotesquely influential than "cluster" and bias.

    So, my hypothesis of sorts goes: things like a "cluster effect," confirmation bias, and people's belief that they're on intellectually equal ground lend themselves to greater extremes of self-delusion and hostility. It's not different from other "similar" environments; what changes is the reliance on information, both right and wrong, to challenge other ideas, which come at lightning speed. Mixed with levels of desired anonymity and conviction to causes, our shared wall-less environment enables greater access to group thinking, hostility to others, and a false sense of security.

    or something like that....

  12. Is it confirmation bias if you're consciously aware that you're seeking sources that support your hypothesis?

  13. Chris T.

    In a wonderful book I highly recommend by Carol Tavris and Elliot Aronson, 'Mistakes Were Made (but not by me)', in which they discuss "dissonance" theory, they break down "confirmation bias" like this (though this is obviously not complete - but useful):

    ~ "So powerful is the need for consonance that when people are forced to look at disconfirming evidence, they will find a way to criticize, distort, or dismiss it so that they can maintain or even strengthen their existing belief. This mental contortion is called "confirmation bias" ~


    In my post I describe a bit of a paradox that probably needs more explaining if anyone actually cares to discuss it.

  14. Doh! I addressed my response above to Massimo rather than Julia. I'm so accustomed to considering this "Massimo's Blog" that confirmation bias caused me to disregard the evidence of Julia's previous posts, and the appearance of her name in the contributors list as well as at the bottom of the post.

    I blame my subconscious, the interwebs, and the CIA.

  15. I prefer looking up (rational) dissenting opinions when I hold a position rather than supporting ones. Maybe I suffer from anti-confirmation bias causing me to swing wildly from one position to another?

  16. Hurray falsification! Popper was right - we should attempt falsification and not just confirmations.

    However, I wonder what the experiment's statistics were like. When I read the sequence, my initial thoughts were "A C E ???" - alternating points along a sequence.

  17. Uhg.

    Just spent a good deal of time on a comment post and it disappeared and google blogger advised me to give this code to host.


    BTW, it disappeared after simply trying to submit. So, I'd suggest others SAVE comments before trying to submit.

  18. I certainly see how using search engines renders one susceptible to confirmation bias; but why would Google be different from any other mode of inquiry that requires active investigation? Is using Google this way any worse than looking something up in a library card catalog, finding what you expected, and ending your search too soon?

    One problem, I suppose, is that Google results are too good; that is, they home in on what you want too quickly, meaning that you aren't forced to sift through as much material. Therefore you're less likely to accidentally happen upon something that goes against your bias.

    I will say this: I don't like the fact that Google is using personal search history to modify results; I didn't like it back when they first offered the functionality, and from your post, it sounds as though you can't turn it off now. If that's true, I'll be annoyed.

  19. I take umbrage to any snarky comments about the CIA starting feminism when they are so clearly linked through the color green, marsupials, and pottery.

    Such chutzpah, and on a blog called, "Rationally Speaking" no less.

  20. 2euclide said: "I just finished reading Massimo's Philosophy Now column explaining how scientists should rather ask questions instead of testing hypotheses. It seems to me that the same may apply to Google; after all, it is a (re)search engine. Thus one should try to use Google to gather data (asking questions) instead of testing hypotheses."

    That's a great connection! I think the problem is that it can be difficult to phrase a query in a way that's both focused enough to get at the particular question you're interested in, and also neutral enough not to bias your results. Also, I have to admit that in practice I'm not sure how one would go about rigorously "answering a question" instead of "testing a hypothesis." At least in the latter case you have a standard (as Mintman pointed out) for determining how confident you can be in the hypothesis' correctness based on the available data. In the former case, I'm not sure exactly what the research process would look like.

    lah wrote: "Google's primary responsibility is to provide a good search experience that meets its users' expectations, not maximize the number of challenges to its users' existing beliefs."

    Of course. The whole point of my article was to point out a side effect of Google pursuing their primary responsibility. I didn't mean to imply that Google should change its business model to avoid enabling confirmation bias -- just that we should be aware of the potential for inadvertent misuse.

    Steelman wrote, "your hypothesis when searching Google for a solution to your netbook problem was, I think, that others were having that problem and there was a way to solve it, not that Dell netbooks are inferior."

    Actually, no, it was the latter. My implicit hypothesis was that the problem was a result of the netbook being a Dell, as opposed to it being a netbook (or having a touchpad).
    Also, I cracked up at this: "In regard to the CIA/feminism link, that use of Google is comparable to poking one's head into a room that contains a large portion of the world and yelling, 'Hey, I think I just talked to a crazy person. Anyone else in here crazy?'" :-) Well said.

    Mintman wrote: "As with all these discussions around the idea that the public is getting less educated, more irrational, more prone to media manipulation or whatever, I ask myself whether it was really so much better in ye olden times."

    Sure, I have the same reaction to those kinds of alarmist articles. All I meant to claim in my article, though, is that search engines have the potential for enabling confirmation bias if we're not careful... It's true that I can't make any confident claims about whether the effect on our bias is worse than whatever else we would have been using to answer our questions if we didn't have Google (which is the relevant question, I suppose, if I want to claim that Google is having a negative effect on our objectivity). Perhaps my title overstates the case; blame it on my journalism background! :-)

  21. This comment has been removed by the author.

  22. TO: Massimo
    RE: A chance to defend your honor
    Where: Comment on Jerry Coyne's blog

    "Massimo PIGliucci is much worse in many respects" than Massimo Piattelli-Palmarini. Yes, not an error, he called you PIGliucci. Hopefully painful childhood memories are therapized by now. I defended your honor and hoped you laid some smackdown down. Go get 'em!

  23. After I had a comment sucked into the Google void I was going to forget about this thread unless someone replied to me - ah, but you're all too smart, realizing the favor done (evidence of god you may have even thought :)

    However, thanks to Roy's comment I thought what the hell (so, thank Roy - sorry Roy :)

    After this blogpost and my comment with my hypothesis, I went back to read more of the latest Edge.org's Question: "How Has The Internet Changed The Way You Think?"

    Alan Alda finishes out his comment in an interesting way that I think not only touches on this blogpost (Julia, give Alan a call - I see an episode of Scientific American Frontiers in your future, if it's still on) and my comment.

    Alda ~ ~"In addition, the Internet has connected so many millions of us into anonymous online mobs that the impression that something is true can be created simply by the sheer number of people who repeat it. (In the absence of other information, a crowded restaurant will often get more diners than an empty one, not always because of the quality of the food.)

    Speed plus mobs. A scary combination. Together, will they seriously reduce the accuracy of information and our thoughtfulness in using it?

    Somehow, we need what taking our time used to give us: thinking before we talk and questioning before we believe.

    I wonder: is there an algorithm perking somewhere in someone's head right now that can act as a check against this growing hastiness and mobbiness? I hope so. If not, I may have to start answering the phone again."~

    Most comments I've read on this Edge Question appear neutral to positive, with very few exceptions, and Alan's goes a step further than anyone, really. I think some of the optimistic ones veer a bit into being irrational. Reading (and sometimes between the lines of) the answers by Scott Atran and Sam Harris' "The Upload Has Begun", one has to wonder if they truly thought their answers through.

    As for the truth and potential truths within the most optimistic ones, like Atran's, Harris' and Shermer's - I could have made some of the same arguments for the printing press. Take Harris' for example and substitute the written word, the printing press, or language for the internet. Huge positives have been obtained; even if we go as far as Atran, we can find agreement and point to written peace treaties hammered out over written correspondence.

    However, there have been downsides to all of the above, and my hypothesis obviously goes further: few of them mention - and then only very subtly, so as not to dampen the positives - the negative types of networks at play, ones that demonstrate great in-group-to-out-group abuse, that form and divide much more quickly and stealthily, that are group-think oriented, and that advocate and act out direct public hostility to others with more impact than ever before (and I predict it getting worse). All of which can be fomented, even manipulated, by what Alda, this blogpost, and I have alluded to.

  24. Roy,

    once again, I have no idea what you are referring to. My take on hypotheses etc. has nothing whatsoever to do with evolution, let alone with intentionality. If you are genuinely curious, you'll find the full version in the last chapter of my 2006 book with Jonathan Kaplan, Making Sense of Evolution. (The chapter is about hypothesis testing in general, not just in evolutionary biology.)

    Norwegian Shooter,

    thanks for the defense, but that guy really needs a life, not worth my time. Still, I've been mulling another blog post on Coyne... The irony is that these guys don't know that I just wrote a scathing review for Nature of Fodor and Palmarini's book!

  25. This comment has been removed by the author.

  26. This comment has been removed by the author.

  27. The relationship between new media and polarization is still relatively under-researched. And confirmation bias is arguably a fragment of this potential polarization.

    The thing about searching for feminism and cia is that you'll find links that support it AND debunk it, since these links mention both feminism AND cia in them.

    The ranking based on past search history is a potential problem though. But then again, it depends on the users as well. Can't be too technologically deterministic. =)

  28. This comment has been removed by the author.

  29. I think it's brilliant to issue abstruse pronouncements about others' ideas, and then demand that those people go do some research in order to understand what the hell they were just accused of, so that they can then address the pronouncements directly. I'll bet it's highly effective; probably works every time. Everybody I know has endless time to do this, and vast pools of motivation to do so.

  30. This comment has been removed by the author.

  31. people, calm down, this is a forum for civil and productive discussions, ok?

  32. Apologies, Massimo. Guess I had a cantanker sore that day.

