About Rationally Speaking


Rationally Speaking is a blog maintained by Prof. Massimo Pigliucci, a philosopher at the City University of New York. The blog reflects the Enlightenment figure Marquis de Condorcet's idea of what a public intellectual (yes, we know, that's such a bad word) ought to be: someone who devotes himself to "the tracking down of prejudices in the hiding places where priests, the schools, the government, and all long-established institutions had gathered and protected them." You're welcome. Please note that the contents of this blog can be reprinted under the standard Creative Commons license.

Wednesday, August 01, 2012

The question of belief, part II


by Massimo Pigliucci

[The first part of this essay, on William James’ “The Will to Believe,” appeared here.]

Last time I examined in detail James’ famous “Will to Believe” essay, and found it disappointing, to say the least. In fact, it is so badly argued that it is hard to imagine why it is so widely cited (maybe most people who cite it haven’t actually read it?), or why it has not contributed to at least questioning James’ credibility as a first-rate philosopher (he did write plenty of other nonsense, you know).

At any rate, it’s now time to turn to an examination of the other classic about belief, this one on the side of skepticism: William K. Clifford’s “The Ethics of Belief,” published in 1877. I will not spoil anything if I tell you right now that I found it much, much more convincing than James’ effort. But let’s take a look at the details.

Clifford’s essay is divided into three parts: The duty of inquiry, The weight of authority, and The limits of inference. The “duty of inquiry” begins with a couple of hypothetical examples, one concerning a ship owner who has reasons to suspect that his ship needs substantial repairs but who manages to (genuinely, ex hypothesi) convince himself that she can take another voyage at sea. Turns out she couldn’t: the ship, cargo, and crew are lost, and the owner pockets the insurance with a clear conscience. For a modern equivalent, think of the Wall Streeters who caused the 2008 collapse of the world economy — sans the assumption that they did not know better.

The second example will remind the modern reader of Faux News: it concerns some “agitators” who publish unfounded pamphlets for political-religious purposes, with the aim of discrediting people belonging to another faction. Turns out, there was no factual basis for the agitators’ accusations (think Sarah Palin and the “death panels,” to pick on one of too many pertinent recent cases).

Clifford makes an important distinction between the truth / falsity of a given notion and whether one has sufficient grounds to believe said notion: “the question is not whether their belief was true or false, but whether they entertained it on wrong grounds. ... Every one of them, if he chose to examine himself in foro conscientiae, would know that he had acquired and nourished a belief, when he had no right to believe on such evidence as was before him; and therein he would know that he had done a wrong thing.”

I love the image of a “foro conscientiae,” and have amused myself thinking that maybe even Sean Hannity has such a locus in his conscience, where he does — however dimly — realize that he is no different from Clifford’s agitators. But I quickly concluded that Hannity probably does not have any such foro. Still, the point remains: Clifford frames the issue of belief as a moral one. As he famously put it, in the very same first section of the essay, “it is wrong always, everywhere, and for anyone, to believe anything upon insufficient evidence.”

But why, exactly? On what grounds does Clifford claim that believing something without (or, worse, in spite of) sufficient evidence is not just an epistemic error, but amounts to a moral one? Because “No real belief, however trifling and fragmentary it may seem, is ever truly insignificant; it prepares us to receive more of its like, confirms those which resembled it before, and weakens others; and so gradually it lays a stealthy train in our inmost thoughts, which may someday explode into overt action, and leave its stamp upon our character for ever.” That is, bad beliefs, and a general attitude of not looking seriously into the evidence, form a mental habit that leads to un-virtue. You begin by accepting small bullshit, you end up believing in big bullshit. You don’t question astrology and horoscopes, you end up believing an unscrupulous politician who tells you that we need to invade another country because it harbors invisible weapons of mass destruction...

Clifford also puts forth a positive case for the morality of examining one’s beliefs: “Our words, our phrases, our forms and processes and modes of thought, are common property, fashioned and perfected from age to age; an heirloom which every succeeding generation inherits as a precious deposit and a sacred trust to be handed on to the next one, not unchanged but enlarged and purified, with some clear marks of its proper handiwork. ... An awful privilege, and an awful responsibility, that we should help to create the world in which posterity will live.” That is, we shouldn’t just be concerned about leaving the world a better place for our posterity in terms of, say, less financial debt and more environmental cleanliness, but also one enriched by new truths and cleansed of some previously held baloney.

All right, a reasonable objection might go, but I’m a busy person, I don’t have time to investigate every belief I contemplate throughout my life. Clifford’s suggestion may be moral, but it surely sounds impractical. Which brings us to the second part of the essay, on “the weight of authority.”

Clifford begins by pointing out that there is little actual danger of people falling into a systemic skeptical paralysis as a result of his suggestion that beliefs should be examined against the evidence. To begin with, we are often epistemically on solid ground when we believe something probabilistically, as certainty is far too high a standard in most cases. More importantly, Clifford maintains that we may be entitled to believe things on the grounds of someone else’s testimony, as long as we have good second-order reasons to think that said testimony is accurate and credible.

A common problem — says Clifford — is that too many people mistake a person’s good character for an indicator of the truth of what that person says. His example is a contrast between none other than the prophet Mohammed and the Buddha! Even assuming that both were of good character, “The Prophet tells us that there is one God, and that we shall live for ever in joy or misery, according as we believe in the Prophet or not. The Buddha says that there is no God, and that we shall be annihilated by and by if we are good enough. Both cannot be infallibly inspired; one or other must have been the victim of a delusion, and thought he knew that which he really did not know. Who shall dare to say which? And how can we justify ourselves in believing that the other was not also deluded?” Wow. Remember, this was 1877, way before Richard Dawkins and the New Atheists!

Does this mean that all human testimony is therefore suspect? Not at all: “If a chemist tells me, who am no chemist, that a certain substance can be made by putting together other substances in certain proportions and subjecting them to a known process, I am quite justified in believing this upon his authority, unless I know anything against his character or his judgment.” That is, one can believe (to a degree, conditionally, probabilistically) the opinions of credentialed experts in a particular domain, especially if such opinions represent a consensus among those who have expertise in that domain.

The issue of expertise is a complex one, which I tackled a bit in chapter 12 of Nonsense on Stilts. Sometimes there is no consensus among experts (string theory), or the consensus opinion may turn out to be wrong after all (there is no aether), or the domain of expertise itself is actually empty (astrology, theology). Still, the point is that — other things being equal — your best bet in any given case is to tentatively accept expert opinion unless you are an expert yourself, just as you would bring your car to be fixed by a certified mechanic, unless you yourself know a lot about cars.

What about “the limits of inference,” which constitutes the last part of Clifford’s essay? The starting point here is that we all make inferences, because even the most basic human knowledge cannot be based only on direct experience. Here is a simple example: “A little reflection will show us that every belief, even the simplest and most fundamental, goes beyond experience when regarded as a guide to our actions. A burnt child dreads the fire, because it believes that the fire will burn it today just as it did yesterday; but this belief goes beyond experience, and assumes that the unknown fire of to-day is like the known fire of yesterday. Even the belief that the child was burnt yesterday goes beyond present experience, which contains only the memory of a burning, and not the burning itself; it assumes, therefore, that this memory is trustworthy, although we know that a memory may often be mistaken.”

These are profound observations, pointing out two general features of human inference that ought always be kept in mind: first, it depends on the trustworthiness of our own epistemic tools, in this case memory (but also perception, and of course reason itself). These tools are limited and faulty, which means that no human being can claim perfect knowledge of very much out there. Second, our inferential ability rests on a fundamental assumption about the continuity of the world and the natural laws that regulate it. This is the well known problem of induction raised by David Hume, of course.

Clifford, like Hume, takes a pragmatic approach to the problem: “Are we then bound to believe that nature is absolutely and universally uniform? Certainly not; we have no right to believe anything of this kind. The rule only tells us that in forming beliefs which go beyond our experience, we may make the assumption that nature is practically uniform so far as we are concerned. Within the range of human action and verification, we may form, by help of this assumption, actual beliefs; beyond it, only those hypotheses which serve for the more accurate asking of questions.” It is a position that is at the same time extremely powerful and epistemically humble. Yes, using faulty tools and making certain assumptions are unavoidable limitations on human cognition. But has anyone got a better idea? Certainly not the prophets, ideologues and charlatans that infect the air of modern discourse, just as they did in 1877.

22 comments:

  1. Quote: "These [epistemic] tools are limited and faulty, which means that no human being can claim perfect knowledge of very much out there."

    Oh yeah? Then how do you explain Richard Dawkins and Jerry Coyne -- who claim perfect knowledge about nearly everything?

    Replies
    1. When did Dawkins or Coyne ever claim to have "perfect knowledge" about anything? You are either being disingenuous, twisting their words, or both. As a scientist (geochemist), I would never claim to have perfect knowledge about my field. As a theist, you are observing the world through the lens of theism. Therefore, you seek "perfect knowledge" and you assume that everyone else does, as well. I seek knowledge, so that I can know more about the universe, but there is no such thing as "perfect knowledge". Those who claim that they have "perfect knowledge" about any topic are liars, fools, or both.

    2. From Dawkins’ River Out of Eden: A Darwinian View of Life (p. 155):
      “In a universe of blind forces and physical replication, some people are going to get hurt, others are going to get lucky, and you won’t find any rhyme or reason in it, nor any justice. The universe we observe has precisely the properties we should expect if there is, at bottom, no design, no purpose, no evil and no good, nothing but blind, pitiless indifference.”

      Dawkins seems to have precise knowledge; and although perfect is precise, precise might not for some be perfect.

  2. As admirable as this proposal is (i.e., that, for any p, one should believe p only if one has sufficient evidence for p), it has serious problems that need to be taken into consideration:

    1. When to stop? To believe p, I need sufficient evidence q. To believe q, I need sufficient evidence r, and so on, ad infinitum.

    2. What counts as sufficient? How much evidence is enough? Not an easy question to answer. I certainly do not want to defend James, but one could interpret him as arguing that, when the stakes are high, even insufficient evidence is good enough.

    Replies
    1. Moti,

      those are good considerations, but they are addressed by Clifford. That is why he talks about the conditions under which we can trust other people's authority, or when he says that sometimes we need to talk about probabilities of belief.

      None of that, however, is any comfort to James, who actually refers to faith, i.e. belief in the absence or even against the available evidence.

    2. This comment has been removed by the author.

    3. Moti,

      First, evidentialism is not committed to classical epistemic foundationalism -- one can advance evidentialism and espouse a flavor of coherentism, neo-foundationalism, some admixture of some versions of the two, or even something like Peircean pragmaticism / epistemic iteration.

      Second, to address your (2) in reverse order: whatever "sufficient" means, it seems it must at least mean "good enough". So I take your final bit to be incoherent.

      Now many / most / almost all formal epistemologists want to define "sufficient evidence" in probabilistic terms. Thus, S is justified in holding p if and only if S's evidence e for p at some time t is sufficient, i.e., e makes the probability of p at t greater than .50.

      Of course the evidentialist might want to unpack this in detail, but it will do as a first-pass, plausible account of "sufficient".

      Evidentialism

    4. I guess I should add that S is justified in holding p if and only if S's evidence makes p more likely to be true than false at t and S holds p on the basis of the evidence at t.

    5. I’m not sure that your definition really avoids the regress problem.

      You write: “S is justified in believing that p iff S’s evidence e for p makes the probability of p greater than .50.”

      Is S justified in believing e? If so, then S is justified in believing that e iff S’s evidence e2 for e makes the probability of e greater than .50. Is S justified in believing e2? If so, then S is justified in believing that e2 iff S’s evidence e3 for e2 makes the probability of e2 greater than .50. And so on, ad infinitum.

      Also, would you say that S is justified in believing that p when S’s evidence e for p is such that the probability of p is .51?

    6. Moti,

      Nothing I wrote avoids the regress problem because nothing I said was aimed at avoiding the regress problem.

      Re: the regress problem.

      First, it's a problem for traditional epistemology, which is in my view (and many other formal epistemologists') an idle intellectual activity unrelated to real issues concerning knowledge, inference, and rational decision making.

      Second, even if I were to want to enter into traditional epistemological problems, I would not concern myself with the regress problem because I am not a foundationalist. I find Hasok Chang's idea of epistemic iteration (which is essentially Peircean pragmaticism) much more congenial.

      To answer your question briefly: yes. S is justified in believing p if and only if S's evidence for p makes the probability of p greater than .50 (and S believes p on the basis of the evidence).

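      A minimal sketch of the threshold account discussed in this exchange: one could read "sufficient evidence" in Bayesian terms, so that S is justified in believing p iff the probability of p given S's evidence exceeds .50. The toy calculation below is purely illustrative; the prior, the likelihoods, and the scenario (loosely echoing Clifford's ship owner) are assumptions made for the sake of the example, not anything Clifford or the commenters commit to.

      # Illustrative only: a toy Bayesian reading of the ">.50" threshold
      # criterion. All numbers below are made-up assumptions.

      def posterior(prior, p_e_given_h, p_e_given_not_h):
          """Bayes' theorem: P(h | e) for a single piece of evidence e."""
          numerator = p_e_given_h * prior
          return numerator / (numerator + p_e_given_not_h * (1.0 - prior))

      def justified(prob_p_given_e, threshold=0.5):
          """Threshold account: belief in p is justified iff P(p | e) > threshold."""
          return prob_p_given_e > threshold

      # Hypothetical ship-owner case: prior confidence that the ship is
      # seaworthy, updated on an inspector's report that she needs repairs.
      p = posterior(prior=0.7, p_e_given_h=0.1, p_e_given_not_h=0.8)
      print(round(p, 2), justified(p))  # 0.23 False -- belief not justified

      On this reading, the owner in Clifford's first example would not be justified in believing his ship seaworthy, however sincerely he managed to convince himself.
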
  3. Massimo:

    Would you agree that there is a case to be made for belief in the face of the absence of *apparent* evidence?

    I'm thinking here of the "split brain" experiments, where the right brain is shown an item hidden to the left brain. When questioned, the person is unable to name the item (the right brain cannot articulate). Yet the person can select the item shown with the hand controlled by the right brain.

    In short, you can "know" something without being able to articulate it, or give evidence for why you know it. I suspect that intuition may be made up of this type of knowing.

    Replies
    1. Tom,

      good point. To begin with, however, please note that split brains are dysfunctional brains, by definition. Normally, the whole brain would have perfectly good reasons to know what one hemisphere claims to know.

      Moreover, no, I don't think that's intuition, at least not in the sense in which cognitive scientists think about it. Intuition has more to do with the subconscious massive, parallel processing of information. And it certainly has nothing to do with faith...

    2. Massimo:

      While "split brains" are indeed unusual and dysfunctional, I was only offering the experiment as a particular example of a more general principle: it is at least *possible* to have worthwhile knowledge without being able to articulate it or give evidence
      for it. I went on to suggest that intuition *may* be something operating in a similar manner for the average
      person.

      Here's a story from my youth:

      One time my older brother walked into the room and I was suddenly struck with the strong impression that he had been to the barber shop. This made no sense, as my brother and I had both been to the barber shop only a couple of days earlier -- so why should he go again?

      But he *did* go again: to buy a bottle of the hair oil that the barber typically put on us after cutting our hair. My older brother was starting to notice girls, and went back to get a bottle of hair oil in order to try and impress the ladies. This hair oil had a distinctive scent -- I had smelled the scent, made the association with the barber shop and came to my conclusion that he had been there. All very logical and Sherlock-Holmesian. Except that I wasn't conscious of the chain of logic -- I merely had a strong intuition.

      *Something* (unknown to me) simply screamed "barber shop" to me. If you had asked me why I believed what I did I would have not been able to explain or give a reason any more than a split brain subject could explain why he selected "car keys".

      From this experience and the "split brain" experiments I believe that it may be possible to have worthwhile and accurate knowledge for which one cannot give evidence. Do you see *no* connection to faith?

      True, such intuition may lead us astray. But a wise man once said that "[epistemological] tools are [also] limited and faulty, which means that no human being can claim perfect knowledge of very much out there."

    3. Tom,

      we have all sorts of "knowledge" that we cannot articulate, for instance the calculations necessary for a baseball player to hit a fast-approaching ball. (The reason I put knowledge in scare quotes is that in philosophy knowledge typically means justified true belief, and the baseball player can't justify his belief, so we should probably use a different word, which may be the root of our disagreement here.)

      Still, no, I don't see any connection between intuition and faith, unless you are using the word faith in a much broader sense than James does, which I would not advise, since the term has a standard meaning, and sticking to it helps avoid confusion.

    4. Well, definitions are tricky (the ghost of Wittgenstein smiles).

      I am using the following:

      FAITH: belief in the absence or even against the available evidence.

      INTUITION: knowing without the use of *demonstrable* rational processes.

      KNOWLEDGE: something perceived directly with the senses or the mind.

      * * * *
      Under these definitions, couldn't faith be considered a type of "knowing" (with or without scare quotes) that cannot be articulated but is nonetheless valid? That is, it is possibly a rational process, but not one that is *demonstrably* rational, and thus should not be dismissed as trash any more than the split brain subject's correct selection of "car keys" should be dismissed as trash simply because he cannot explain it? Faith may be a message sent to us by our right brain, which knows something that it is unable to communicate in words.

      Massimo, you suggested that the left hemisphere (L.H.) has access to whatever the right hemisphere (R.H.) claims to know. But, is that really true? Has cognitive science conclusively shown that 100% of what the R.H. knows is transmitted via the corpus callosum?

      As a thought experiment, suppose that John (R.H.) is mute, but communicates via sign language to Mary (L.H.) who can speak to us. Mary tells us, "John is hungry and would like a chicken sandwich". But, does Mary know 100% of what John is thinking? Suppose John wants mayonnaise on his sandwich, but has no sign (that Mary understands) for "mayonnaise"? Suppose John wants "free range" chicken but John is too lazy to do the tedious symbols needed for "free range"?

      Since the L.H. is verbal and the R.H. is non-verbal, it's a good bet that a lot gets lost in the corpus callosum translation. The R.H. may know things that get communicated in other ways: by sending the L.H. a strong impression (faith?).

      To ask the R.H. to explain itself in L.H. terms is similar to Winston (an English speaking man) asking Kim (a Chinese speaking woman) the question "how much is 2+2"? If Kim fails to answer (in English!), then Winston concludes that "Kim is an idiot -- she doesn't even know how much 2+2 is"! But Kim may be a Ph.D. in Mathematics and she may know a great deal that she cannot communicate to Winston. Winston is simply applying standards that are arrogant, parochial, and pig-headed (Ah! Why do my thoughts go back to my first post about Dawkins and Coyne?).

      Dawkins and Coyne -- never you Massimo! You are always reasonable and open-minded.

    5. Tom,

      not sure whether your last comment about Dawkins and Coyne is meant as sarcasm or as a compliment, but I choose to interpret it as the latter!

      I never said I know for certain that there is 100% communication between the two hemispheres, all I said is that one needs to be careful about extrapolating from a clearly damaged brain to a normally functional one.

      I agree with your definition of faith, disagree slightly with your definition of intuition (as I said, in cognitive science it is usually thought of as the result of massive subconscious parallel processing by the brain, which means that its results can be subjected to rational scrutiny, after the fact), and I definitely disagree with your definition of knowledge (again, for most philosophers the term applies to justified [likely] true belief, meaning that one has to be able to explain why one believes X, or believing in X does not count as knowledge).

    6. Hi Tom, we don't know where our thoughts come from, neurologically. They just pop into our awareness from unknown processing in generally sensible sequences. One pop might sequentially lead to another, or at times we might have something pop into awareness that is not strictly sequential (eureka, as opposed to mathematical steps).

      You might experience a pop of a vague suspicion as a thought, which has a basis in subtle sensations, and it might be stronger than an idea arising for no reason at all (let's assume it is, as it has some real basis). Anything can pop. The program would be to evaluate the thought, even if it's a vague suspicion whose reasons are difficult to pinpoint, and find its basis so that we can be satisfied with it (or otherwise) and move on from it. We can move vaguely through life with greater or lesser strength of bases for our vague ideas, or we can attack them and understand how one as an individual tends to call up their passing ideas.

      Knowledge is actually just a level of satisfaction in reasoning about 'beliefs' that have any bases (nonsensical or otherwise), and when thoughts pop, our responsibility is to be satisfied with them if we can, or go with them if we trust ourselves from past self-analyses. So, I wouldn't worry about the greater or lesser bases for thoughts. And I don't worry about thoughts just popping from those bases so that we can be aware of them and work with them. The idea is to be satisfied with them, and that's at different levels for people who are more vague or less vague (an indicator). I try to be less vague but near the edges of knowledge, for example.

    7. Paul, thanks for your comments.

      And Massimo, I meant my last line about you sincerely -- I wasn't trying to be sarcastic, but I now realize it could be read that way! Sorry! :-0

  4. "You begin by accepting small bullshit, you end up believing in big bullshit. You don’t question astrology and horoscopes, you end up believing an unscrupulous politician who tells you that we need to invade another country because it harbors invisible weapons of mass destruction..."

    Somehow that seems like a slippery-slope argument, and I've noticed that it doesn't necessarily work out in practice. Sometimes it does, and you see things like crank magnetism. Other times, well, you get people like Ken Miller and Pamela Gay, who have less than rational beliefs regarding religion, but are sharp people otherwise.

    Replies
    1. J.J.,

      well, that's an empirical question, though there is evidence that suggests that certain types of beliefs cluster together. Yes, then there are people like Ken Miller, but remember that we are talking about a human social phenomenon, so we are bound to find all sorts of variations. At any rate, I have told Ken that he should apply the same intellectual rigor he uses to dissect Michael Behe to his own beliefs in Catholicism. He smiled.

  5. hey massimo

    search for "rationally speaking" on youtube

    some interesting results

    Replies
    1. You mean like the "Atheist Mad Man" entry by Nostradamus?

