Since I became the president of the Center for Applied Rationality, I’ve spent a lot of time talking to people about what rationality is. But I think I spend even more time talking to people about what rationality is not.
I don’t blame them for being confused, though. The word “rationality” is used in different ways by different people. Economists often use “irrational” to mean “acting against your own self-interest,” because their intentionally simplified models leave out altruism. Psychoanalysts seem to use “irrational” to mean “emotional.” Hollywood’s vision of rationality sets it up as a foil to emotion and love. And in colloquial conversation, “that’s irrational” often simply means, “I disagree with that.”
Even the meaning of rationality in philosophy and cognitive science is a little confusing. It doesn’t help that it’s a two-part definition — the word rationality is used to refer to both “epistemic” rationality, which entails having accurate views of the world, and also “instrumental” rationality, which entails pursuing your goals as effectively as possible. But people don’t know, or simply forget, to make that distinction, so many conversations about rationality end up with people talking past each other.
It’s especially hard to make the distinction if some actions qualify as epistemically rational but instrumentally irrational, or epistemically irrational but instrumentally rational. And this is one of the discussions I most frequently find myself in — unsurprisingly, since I am in the position of touting the benefits that epistemic rationality can have for your life. So, is it true? Is it sometimes helpful to you — that is, instrumentally rational — to be epistemically irrational?
Here’s the answer I’ve come to: Yes, occasionally. For example, you might feel happier if you irrationally believe that the present state of the world — or the future state of the world — is better than it really is. The downside to that kind of irrationality, of course, is that it prevents you from taking action to improve things. Believing “Everyone likes me!” is certainly more pleasant than believing “Actually, I get on a lot of people’s nerves.” But if the latter is closer to the truth, then you’re cutting yourself off from plenty of valuable friendships, relationships, and professional opportunities if you let yourself believe the former rather than learning to improve your style of interaction.
You can also reap social benefits from being irrationally overconfident in your own traits — like your intelligence, or attractiveness, or skills — because your resultant confidence can persuade other people to view you more positively. People like confidence! In their leaders, especially, but also in their friends and their romantic partners. And it’s not just a matter of being attractive and charismatic. Other people have incomplete information about you, and one of the things they use to estimate your worth is your own apparent estimation of your worth (because they know that you have access to plenty of information that they don’t). Being overconfident, therefore, can make people trust you more, like you more, and be more attracted to you.
Of course, you can also suffer consequences from that overconfidence. In several studies, subjects with especially high self-esteem chose to take on more difficult tasks, failed more often, and ended up with less money than others. In another study, traders who believed that randomly generated successes were actually due to their own skill performed worse in their actual trades. (See Robert Kurzban’s Why Everyone (Else) is a Hypocrite for a discussion of these and related studies.)
Whence came the notion that overconfidence is helpful in decision making? It’s certainly widespread. I frequently get into debates with very smart people who suggest that being overconfident in, say, your chances of success with a new startup, is essential to getting yourself to take the plunge. And they might be right, but if they are, that’s because the comparison they’re (unintentionally) making isn’t overconfidence versus rationality. It’s overconfidence versus other kinds of irrationality that are keeping your hypothetical self from starting a business when, for his own sake, he should. If your perfectly calibrated self would still fail to start a business because of, say, irrational fear of failure, then yes — maybe overconfidence is a useful corrective for you. But you’d still be better off being perfectly calibrated in your probability estimates and able to act rationally on that estimation.
Overall, if it were possible, it seems like the ideal strategy would be to show the world as high a level of confidence as you can (plausibly) claim, and yet retain as accurate a picture of the world as possible for the benefit of your own decision making. In other words, as Kurzban says, “holding aside the social effects of decision making, if you have an overly high opinion of your skills, then as long as the judge is cold, hard reality, you’re worse off than if you simply had the correct opinion of your skills, even if they are modest.” 
 Baumeister, R.F., Heatherton, T.F., & Tice, D.M. (1993). When ego threats lead to self-regulation failure: Negative consequences of high self-esteem. Journal of Personality and Social Psychology, 64, 141-156.
 Fenton-O’Creevy, M., Nicholson, N., Soane, E., & Willman, P. (2003). Trading on illusions: Unrealistic perceptions of control and trading performance. Journal of Occupational and Organizational Psychology, 76, 53-68.
 Kurzban, R. (2011). Why everyone (else) is a hypocrite: Evolution and the modular mind. Princeton, NJ: Princeton University Press.
 Kurzban, R. (2011), p. 114.