Wednesday, September 05, 2012

Confusion over the two kinds of rationality


by Julia Galef

Since I became the president of the Center for Applied Rationality, I’ve spent a lot of time talking to people about what rationality is. But I think I spend even more time talking to people about what rationality is not.

I don’t blame them for being confused, though. The word “rationality” is used in different ways by different people. Economists often use “irrational” to mean “acting against your own self-interest,” because their intentionally simplified models leave out altruism. Psychoanalysts seem to use “irrational” to mean “emotional.” Hollywood’s vision of rationality sets it up as a foil to emotion and love. And in colloquial conversation, “that’s irrational” often simply means, “I disagree with that.”

Even the meaning of rationality in philosophy and cognitive science is a little confusing. It doesn’t help that it’s a two-part definition: the word refers both to “epistemic” rationality, which entails having accurate views of the world, and to “instrumental” rationality, which entails pursuing your goals as effectively as possible. But people often don’t know about that distinction, or simply forget to make it, so many conversations about rationality end up with people talking past each other.

It’s especially hard to make the distinction if some actions qualify as epistemically rational but instrumentally irrational, or epistemically irrational but instrumentally rational. And this is one of the discussions I most frequently find myself in — unsurprisingly, since I am in the position of touting the benefits that epistemic rationality can have for your life. So, is it true? Is it sometimes helpful to you — that is, instrumentally rational — to be epistemically irrational?

Here’s the answer I’ve come to: Yes, occasionally. For example, you might feel happier if you irrationally believe that the present state of the world, or the future state of the world, is better than it really is. The downside to that kind of irrationality, of course, is that it prevents you from taking action to improve things. Believing “Everyone likes me!” is certainly more pleasant than believing “Actually, I get on a lot of people’s nerves.” But if the latter is closer to the truth, then you’re cutting yourself off from plenty of valuable friendships, relationships, and professional opportunities if you let yourself believe the former rather than learning to improve your style of interaction.

You can also reap social benefits from being irrationally overconfident in your own traits — like your intelligence, or attractiveness, or skills — because your resultant confidence can persuade other people to view you more positively. People like confidence! In their leaders, especially, but also in their friends and their romantic partners. And it’s not just a matter of being attractive and charismatic. Other people have incomplete information about you, and one of the things they use to estimate your worth is your own apparent estimation of your worth (because they know that you have access to plenty of information that they don’t). Being overconfident, therefore, can make people trust you more, like you more, and be more attracted to you.
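One way to make that inference concrete is as a Bayesian update, where a confident display is evidence about hidden skill. The sketch below is purely illustrative; both likelihoods are invented numbers standing in for how often each type acts confident:

    # Toy Bayesian sketch; every number here is invented for illustration.
    # The observer can't see skill directly, but believes confident displays
    # are more common among the genuinely skilled.
    prior_high = 0.5              # prior: high and low skill equally likely
    p_confident_given_high = 0.8  # assumed rate of confident displays
    p_confident_given_low = 0.4

    # Bayes' rule after observing one confident display:
    evidence = (p_confident_given_high * prior_high
                + p_confident_given_low * (1 - prior_high))
    posterior_high = p_confident_given_high * prior_high / evidence
    print(f"P(high skill | confident display) = {posterior_high:.2f}")  # 0.67

Note that the update only works while the likelihoods differ: if everyone inflated their confidence equally, the two likelihoods would converge and the display would stop carrying any information.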

Of course, you can also suffer consequences from that overconfidence. In several studies, subjects with especially high self-esteem chose to take on more difficult tasks, failed more often, and ended up with less money than others. [1] In another study, traders who believed that randomly generated successes were actually due to their own skill performed worse in their actual trades. [2] (See Robert Kurzban’s Why Everyone (Else) Is a Hypocrite for a discussion of these and related studies. [3])

Whence came the notion that overconfidence is helpful in decision making? It’s certainly widespread. I frequently get into debates with very smart people who suggest that being overconfident in, say, your chances of success with a new startup is essential to getting yourself to take the plunge. And they might be right, but if they are, that’s because the comparison they’re (unintentionally) making isn’t overconfidence versus rationality. It’s overconfidence versus other kinds of irrationality which are keeping your hypothetical self from starting a business when he should, for his own sake. If your perfectly-calibrated self would still fail to start a business because of, say, irrational fear of failure, then yes, maybe overconfidence is a useful corrective for you. But you’d still be better off being perfectly-calibrated in your probability estimates and able to act rationally on that estimation.
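Here’s a toy sketch of that comparison in code (every number, name, and threshold below is invented for illustration; this is a sketch, not a claim about real founders). Fear of failure is modeled as an inflated action threshold, and overconfidence as a biased probability estimate:

    # Toy model; all numbers are invented for illustration, not data.
    PAYOFF_SUCCESS = 10.0  # reward if the venture succeeds
    COST_OF_TRYING = 2.0   # sunk cost of launching, win or lose

    # Each founder is (bias added to the true probability, action threshold).
    # Fear of failure shows up as an inflated threshold; overconfidence as bias.
    founders = {
        "calibrated, fear of failure":    (0.0, 0.5),
        "overconfident, fear of failure": (0.4, 0.5),
        "calibrated, acts on estimates":  (0.0, COST_OF_TRYING / PAYOFF_SUCCESS),
    }

    for true_p in (0.30, 0.15):  # one good bet (+EV), one bad bet (-EV)
        print(f"venture with true P(success) = {true_p}:")
        for name, (bias, threshold) in founders.items():
            launches = (true_p + bias) >= threshold
            # The payoff is judged by reality, not by the founder's belief.
            ev = true_p * PAYOFF_SUCCESS - COST_OF_TRYING if launches else 0.0
            print(f"  {name:32} launches={launches!s:5} EV={ev:+.1f}")

Run as written, the fearful calibrated founder never launches (EV 0), the overconfident founder launches both ventures (EV +1.0 and -0.5), and the calibrated founder who acts on her estimates takes only the good bet (EV +1.0). That’s the point: overconfidence beats paralysis, but calibration plus action beats both.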

Overall, the ideal strategy, if you could pull it off, would be to show the world as high a level of confidence as you can (plausibly) claim, while retaining as accurate a picture of the world as possible for the benefit of your own decision making. In other words, as Kurzban says, “holding aside the social effects of decision making, if you have an overly high opinion of your skills, then as long as the judge is cold, hard reality, you’re worse off than if you simply had the correct opinion of your skills, even if they are modest.” [4]

_____

[1] Baumeister, R.F., Heatherton, T.F., & Tice, D.M. (1993). When ego threats lead to self-regulation failure: Negative consequences of high self-esteem. Journal of Personality and Social Psychology, 64, 141-156.

[2] Fenton-O’Creevy, M., Nicholson, N., Soane, E., & Willman, P. (2003). Trading on illusions: Unrealistic perceptions of control and trading performance. Journal of Occupational and Organizational Psychology, 76, 53-68.

[3] Kurzban, R. (2011). Why everyone (else) is a hypocrite: Evolution and the modular mind. Princeton, NJ: Princeton University Press.

[4] Kurzban, R. (2011), p. 114.

14 comments:

  1. Nice article. I'm guessing, however, that in most cases the two forms of rationality are actually aligned. So while you may have to have this discussion a lot, it's important to point out that the exception is not the rule.

    Also, I'm sure you are familiar with Dunning-Kruger, but it's worth bringing up.

    http://en.wikipedia.org/wiki/Dunning–Kruger_effect

    Lastly, I would like to suggest another way, besides the reverse Dunning-Kruger effect, in which competence (epistemic rationality) can lead to apparent (but not actual) incompetence. Ask an incompetent person to do something they aren't competent to do, and they will confidently agree. Ask a competent person, and they will think about how hard it will be to succeed. This is the first step towards actual success! But it can appear to be hesitance and uncertainty. And it actually IS uncertainty.

    I am a TV writer/producer, and in that field the success rate (on the air and popular) is around 2%. If you are really good, you could be at 4 or 5 percent success. Can you imagine happily telling an investor that there is a 95% chance of failure? That is why most people working in Hollywood have careers built around dealing with failure, not success. Those of us who actually produce, however -- writers, actors, directors -- don't have that option. It leads to a lot of conflict of interest.

  2. Thanks for this, especially the discussion on overconfidence. I'm not sure your epistemic / instrumental distinction is equivalent to the traditional distinction between theoretical and practical rationality (i.e. rationality of opinions vs. rationality of actions). I'm not sure because you call the second one 'instrumental', although not all types of practical rationality are instrumental (in the usual sense of finding the most effective way to achieve a given end).
    You can discuss what is the rational thing to do without a prior commitment to this instrumental assumption. In many real life contexts there is no apparent set of preferable options; a selection has to be made between alternatives which are, in varying degrees, unsatisfying. You may then say that you choose the least unsatisfying, but in practice this proves quite difficult, as you have to consider different (and often divergent) evaluation criteria.
    So what does the 'most effective way to reach a given end' reduce to, other than some rather unspecified connection of means to ends? Does it still make sense to call that 'effectiveness'?

  3. Rationality might be the capacity to match abstracts to reality in a worldview. A worldview would be about oneself in the world, as constructed by oneself in the world. We abstract what the world may be, and oneself likewise, and match the abstracts with reality to progress. The question is whether this process requires me to conform to the world or the world to conform to me.

    I can rationally pursue my goals, or I can rationally concede to goals set for me by the world. They might be considered equal alternatives if rationality is merely matching to progress (sanity). If this were like the previous thread on A + progressive or A + reactionary, some might call themselves R + oneself (Rand) or R + the world (Duty).

    One person can be vital if the world has been so badly managed as to allow it, but I favor the world. I would extend myself by factual knowledge from the world until I know both myself and world factually. I would test my subjectivity to the extent of turning 'a person in the world' into 'a person of the world', if possible.

  4. Y'all must excuse my laser-like brilliance dogging this Blog. It's just my Randian attempt at being 'a person of the world' (*smirk).

  5. Just a small correction, it's Robert and not Ray Kurzban.

  6. I always thought that this blog should be called rationally thinking. Now I have to think rationally about why it is not called rationally acting.

  7. I recommend Aspects of Rationality by Nickerson if you want to see the 101 different things that rationality may entail.

    books.google.com/books/about/Aspects_of_Rationality.html?id=tCBdaL6kO7AC

  8. That should be Robert Kurzban, not Ray Kurzban.

  9. It is neither possible nor desirable to live one's life based solely on rationality.

  10. Hi Julia,

    Thanks for an interesting post, and one that gives me an opportunity to discuss a view I've been forming.

    I would quite agree that false (or irrational) beliefs can sometimes be advantageous. But I think your attempt to describe this situation in terms of epistemic and instrumental rationality runs into some problems.

    You wrote: "It’s especially hard to make the distinction if some actions qualify as epistemically rational but instrumentally irrational..."

    Sorry to be picky, but epistemic rationality is generally taken to be a property of beliefs (while instrumental rationality is a property of actions) so an action cannot be epistemically rational/irrational.

    Perhaps you meant to refer to the situation where someone has a rational belief about how best to act but nevertheless fails to act in accordance with that belief. I understand that failing to act in accordance with your beliefs is sometimes called "akrasia". (I'm sure you're familiar with that term.)

    Lately I've become quite doubtful about the usefulness of the term "instrumental rationality". Let's assume we agree to call the rationality of beliefs about how best to act a matter of epistemic rationality. Then what's left for instrumental rationality? It seems that all that's left is the matter of whether we act on our beliefs, i.e. akrasia and its converse.

    It seems to me that forming rational beliefs and acting on them are two such different things that I find it misleading to call them two varieties of "rationality". I would dispense with the term "instrumental rationality" altogether. We can still talk about whether our actions are advantageous (or conducive to our goals/desires). And we can also talk about whether we have acted in accordance with our judgements on how best to act. Is there anything else that we need to talk about in this context?

    You wrote: "And they might be right, but if they are, that’s because the comparison they’re (unintentionally) making isn’t overconfidence versus rationality. It’s overconfidence versus other kinds of irrationality which are keeping your hypothetical self from starting a business when he should, for his own sake."

    I think this would be more clearly stated by saying that overconfidence (epistemic irrationality about your abilities) might be useful for overcoming akrasia. In calling akrasia (as I interpret you) just another "kind of irrationality" I think you are obscuring a significant distinction.

    You wrote: "But you’d still be better off being perfectly-calibrated in your probability estimates and able to act rationally on that estimation."

    Sure, it would be best to have the best of both worlds: true beliefs and no akrasia. But akrasia is not so easily avoided, and there might really be a worthwhile trade-off available to some people.

  11. P.S. On further reflection, I'm now thinking that my objection to the term "instrumental rationality" was misguided. I don't have time to say more right now, but I thought I'd mention my change of mind briefly, to save people wasting time addressing my objection.

  12. "For example, you might feel happier if you irrationally believe that the present state of the world — or the future state of the world — is better than it really is. The down side to that kind of irrationality, of course, is that it prevents you from taking action to improve things"

    "Better" is a value judgment and therefore not something any rational process can determine. I might mistakenly believe "everyone likes me" and therefore reach wrong conclusions but maybe I don't care if people don't like me. All such reasoning about personal ends will make assumptions about what values people have.

    Emotions are not thoughts.

  13. I think of values as abstract if they do not model reality as enacted by oneself in the world. If they apply reasonably in reality, they can be rationally assessed against other values and their realities. The rational assessment (applying abstracts to reality) would be to abstract new realities, as new values that might extend from existing values and their reconciliations (an ought from competing is's). Values are just stable paradigms for living, and need abstracts well matched to realities, or they are simply insane. Beyond sanity, however, values are constructed by each individual from unique personal experiences. Until we have settled upon secure objective parameters to that psychological construction to differentiate individuals' values (such as Freud sought in gender biology), we can only use narratives to explain the source of each person's values and how they are shared as societies.

