About Rationally Speaking


Rationally Speaking is a blog maintained by Prof. Massimo Pigliucci, a philosopher at the City University of New York. The blog reflects the Enlightenment figure Marquis de Condorcet's idea of what a public intellectual (yes, we know, that's such a bad word) ought to be: someone who devotes himself to "the tracking down of prejudices in the hiding places where priests, the schools, the government, and all long-established institutions had gathered and protected them." You're welcome. Please note that the contents of this blog can be reprinted under the standard Creative Commons license.

Friday, May 25, 2007

Truth: a Guide

It's the title of a nice little book by philosopher Simon Blackburn (who has a penchant for concise titles: he also wrote “Lust” and “Think,” among others). It's more technical than I expected, but some of his points should be relevant to anyone interested in the demise of simplistic notions like the correspondence theory of truth, or the rise of postmodernism and “your opinion is as good as anyone else's” sort of nonsense.

Blackburn is as fair-minded (especially about postmodernism) as one can possibly be, which makes this book all the more relevant to the ongoing culture wars. On p. 113, the author provides a useful little summary of the major philosophical positions about truth, in the form of a table with four corners.

The upper left corner is occupied by “eliminativism,” the rather radical idea that one simply shouldn't think in terms of truth at all, because the concept is meaningless. Blackburn makes the parallel with astrology: we are wasting time if we debate about how to best divine the stars' influence on human life, because there is no such thing as a constellation (they are optical illusions). Similarly, an eliminativist about truth thinks the whole idea is misguided and cuts off the discussion before it gets started. And, as Blackburn notes, eliminativism isn't the same thing as skepticism (the idea that there is truth out there, but we simply can't reach it), because the eliminativist doesn't regret that we can't have truth, just like the astronomer doesn't feel bad about telling people that constellations are figments of their imagination. But somehow eliminativism feels more like a cop out than a serious philosophical position, as we give up much too much if we think of truth the same way as we think of constellations.

The upper right corner of Blackburn's classificatory scheme of theories of truth is occupied by realism, the position that yes, indeed, there is such a thing as truth, and yes, we can say something, in fact, a lot, about it. Typically, scientists tend to be realists, and realists are generally optimistic about science. The problem with at least some naïve versions of realism (so-called “real realism”) is that there is no coherent account of it. For example, a typical realist account of truth is the above-mentioned “correspondence theory,” whereby something is true if it in fact corresponds to the way things really are. But the objection is that we can't say in which way things “really” are because anything we say about the universe is bound to be affected (and distorted) by our own point of view, and the correspondence theory seems to require a “god's eye view” of things in which, clearly, we do not partake. (This is why Kant, for example, concluded that we have no access to “the thing in itself,” but only to the world of sensation, and even that is biased by our innate “categories” of space, time, causality, etc.)

The lower right corner of Blackburn's table (btw, I believe the positions of the various schools of thought within the table are arbitrary) is “quietism.” Here lies deconstructionism, whose fundamental tenet is that nobody can provide a theory of truth, because there is no neutral viewpoint one can adopt to stand outside personal or local truths (the above-mentioned lethal objection to real realism). This, however, quickly leads to what Blackburn labels “soggy pluralism,” or, as he puts it: “some [philosophical] problems are disquieting enough to prompt the thought that you can ignore them [as the quietist would want us to do] only by feigning general paralysis of the brain.” Funny, that's often how I feel about deconstructionists and postmodernists.

Finally, we get to the lower left corner, what Blackburn labels “constructivism.” The idea is in between realism and quietism: constructivists would disagree that “truth” means the objective representation of an independent reality, but also disagree with quietists because for a constructivist there are worthwhile theories of truth, and they do some kind of work, for example they might give us models that serve as useful fictions to navigate the world (pragmatists, for instance, are included by Blackburn under the umbrella of constructivism).

In the end, Blackburn finds something interesting everywhere he looks, but also a lot to be discarded in the various philosophical theories of truth. While he leans toward some sort of realism, he is not a hard core “real realist” because he appreciates the force of the basic deconstructionist critique, the fact that human beings simply cannot avoid adopting a non-neutral point of view, and that this impinges on their view of the world, making it inevitably partial. But he also rejects much nonsense that one hears these days about alternative truths: “There may be rhetoric about the socially constructed nature of Western science, but whenever it matters, there is no alternative. There are no specifically Hindu or Taoist designs for mobile phones, faxes or television. There are no satellites based on feminist alternatives to quantum theory. Even the great public sceptic about the value of science, Prince Charles, never flies a helicopter burning homeopathically diluted petrol, that is, water with only a memory of benzine molecules, maintained by a schedule derived from reading tea leaves, and navigated by a crystal ball.”

Wow, talk about not pulling epistemological punches. And it's an excellent argument for getting rid of the monarchy too.

Monday, May 21, 2007

Acceptance of science is made difficult by innate psychological biases

Critical thinking is hard stuff, as anyone who has taken a course in it will readily testify. Formal logic is counterintuitive, and it requires effort to master it. Most modern science is also counterintuitive and difficult, which is why 42% of Americans believe that evolution never happened.

An article by Paul Bloom and Deena Weisberg in Science (18 May 2007) is crucially relevant to anybody interested in skepticism or science education. There is now empirical evidence explaining why so many otherwise intelligent people can go on their whole lives rejecting fundamental notions of modern science, from evolution to Newtonian mechanics (yes, it turns out that, intuitively, we adopt Aristotelian physics, for example believing that a ball coming out of a curved tube will keep going along a curved trajectory – it doesn't, it goes straight).

According to Bloom and Weisberg, one main source of adults' resistance to novel scientific findings is that as children they were not blank slates ready to absorb scientific notions: our brain comes with a series of built-in “assumptions” about the world, presumably to help us navigate it without the need for formal education. For example, several experiments have demonstrated that even very young children know that objects will fall down if not supported (an intuitive theory of gravity), and they already have a concept of causality (as Kant first suggested, when he said that certain “categories,” such as space, time and causality, are naturally imposed by the human mind before any experience comes to shape us).

One of the consequences of this built-in view of the world is that children often extrapolate from it to derive wrong conclusions about the physical universe: for example, they derive a flat-earth “theory” from their Aristotelian physics, and it takes scientific education to disabuse them of that notion.

Even more interestingly, children are born with a tendency to see agency everywhere, including in inanimate objects: everything has a purpose, it is “for” something, a phenomenon psychologists call “promiscuous teleology.” It isn't difficult to see how this readily accounts for both the widespread tendency to believe in the supernatural (remember that the first religious beliefs were of a pantheistic type, where all of nature was infused with purpose and agency), as well as the widespread acceptance of a dualistic theory of mind, where somehow the mind (and, therefore, the “soul”) is independent of the physical brain and can survive the latter's demise.

A second reason for the difficulty in accepting counterintuitive scientific notions is that children (and, later, adults) believe in certain kinds of authorities and are sensitive to the cultural context within which ideas are presented. So, few people today doubt that the earth is round because there is no societal controversy about the fact (though probably few people would be able to point out exactly how we do know that the earth is not flat – short of direct observation from an artificial satellite). Evolution, on the other hand, is controversial not only because it is counterintuitive, but because authority figures that are important early on in our childhood (parents, preachers) are so often vehemently opposed to it. Moreover, the very idea of evolution is cast in terms of “belief”: think of the difference between people saying that they believe – or not – in evolution, while nobody “believes” in the round earth theory, because it's an incontrovertible, societally endorsed, fact.

The upshot of all this is that science educators and skeptics have an uphill battle to fight: they have to somehow overcome both innate psychological biases and cultural entrenchment. It's a tough and largely thankless job, but all the more important when major policy decisions affecting our welfare depend on it, from global warming to stem cell research. As a recently published cartoon suggested in jest, if you don't accept evolution perhaps you should be consistent and refrain from asking your doctor for the latest vaccine. After all, the vaccine is the product of our understanding that viruses evolve, just as you are able to fly across the planet because it is, in fact, not flat...

Friday, May 18, 2007

Freud and Russell against god and other insanities

After a long hiatus I'm finally about to finish Jennifer Michael Hecht's monumental “Doubt: a History,” a must read for anybody seriously interested in skepticism. Toward the end of the book I encountered again some interesting insights from two of the most influential thinkers of the 20th century: Sigmund Freud and Bertrand Russell. Let me remind you of them as well.

Freud's opinion of religion was that “there is no distinctively religious need – only psychological need,” which indeed goes a long way toward answering the perennial question of why it is that so many people fall for religious crap. Deep-seated psychological needs, such as overcoming the terror of annihilation that comes from being conscious of one's own mortality, can get the better of even the most educated and intelligent human being.

Even better, Freud's “The Future of an Illusion” suggests that religion isn't just an error, it is a willful error. As such, the oft-heard remark that since we can't know for sure “we might as well” believe in god is bogus. As Freud put it: “If ever there was a case of a lame excuse we have it here. Ignorance is ignorance; no right to believe anything can be derived from it.”

As for Russell – whose “Why I am Not a Christian” was exceedingly influential on my own path from Catholicism to agnosticism and eventually to atheism – he clearly saw that the problem isn't just religion, it's any dogmatic ideology. He knew from history what religion can do to humanity, but he saw during his own lifetime what Nazism, Fascism and Communism were capable of as well, and it wasn't a pretty picture.

Interestingly, in a quasi-psychoanalytical fashion, Russell also understood what the connection was: “I admit at once that the new systems of dogma, such as those of the Nazis and the Communists, are even worse than the old systems, but they could never have acquired a hold over men's minds if orthodox dogmatic habits had not been instilled in youth. ... [Stalin's language] is full of reminiscences of the theological seminary.”

Which is why he concluded that “I do not believe that a decay of dogmatic belief can do anything but good.” Amen.

Wednesday, May 16, 2007

Atheist cruise and philosophical conversations

All,

I have agreed to be the "philosopher in residence" during a 5-night cruise organized by New York Atheists for the end of August. The cruise will be aboard the Carnival ship Victory, and will depart New York harbor in the afternoon of Saturday, August 25.

The itinerary includes calls at St. John/Bay of Fundy and Halifax, Nova Scotia, plus a couple of "fun days at sea." During the latter, we will get together to have conversations about philosophy, humanism, politics and all the rest -- helped by the beautiful scenery and afternoon cocktails.

If anyone is interested, please contact Gloria Kingsley (Gloria@SolarTravelinc.com or 212-601-2775), possibly before the end of May or early June.

Chomsky the anarcho-libertarian

I don't know what you think of Noam Chomsky, but you probably do have an opinion, or should. He is by some reckoning (for example by the New York Times) the most influential intellectual alive. I have read one of Chomsky's works, Manufacturing Consent (co-authored by Edward Herman), and of course I was aware of his theory of an innate grammar that allows human beings to learn languages during the early stages of their development. One of the things that always endeared Chomsky to me was his thorough debunking of Skinnerian behaviorism, which opened the way to modern cognitive science.

But I had never seen Chomsky in action, a lacuna that was remedied at least partially during the last couple of days, when I watched the documentary “Manufacturing Consent: Noam Chomsky and the Media.” I'm not easily given to hero-worship, and in fact I'm pretty sure that Chomsky himself would be horrified at the prospect, but I must admit that I quickly adopted a new role model for my own modest forays into public intellectualism.

It isn't that I agree with everything Chomsky says. I find his political positions admirable, but it seems to me that he simply doesn't take sufficient account of the nature of being human (ironic, from someone whose chief academic contribution is an innate theory of language). I don't even necessarily agree with all his political positions (and although I would defend freedom of speech regardless of whose speech needs defending, his entanglement in the Faurisson affair and Holocaust denial was, I think, a bit naïve).

Nonetheless, Chomsky indubitably has the conviction of his ideas (he was arrested several times during the Vietnam era protests), does the hard work of researching what he says, is capable of brilliantly articulating his visions, and maintains an incredibly calm demeanor whenever challenged in public. If that's not the exact picture of what a public intellectual should be, I don't know what is.

Chomsky has almost single-handedly made the American public (or at least, the portion who cares to listen) aware of the genocide perpetrated in East Timor beginning in the mid-1970s, at the direct hands of the Indonesian government and with full support (including the shipment of weapons) of the US government. Moreover, Chomsky sharply exposed the hypocrisy of both the American government and media in decrying the genocide perpetrated by Pol Pot and the Khmer Rouge in Cambodia, while being respectively complicit in and utterly silent about the parallel events that were unfolding in East Timor. It isn't that Chomsky was condoning Pol Pot, it is that he was pointing out what every thinking person ought to know by now: the US government talks the talk of democracy at home and abroad, but walks a path perilously close to fascism and colonialism whenever it can get away with it.

Indeed, in the documentary, Chomsky makes the very apt parallel between the US and that ancient world paragon of democracy: Athens. Yes, Athenian society was by far the most open society in the world at the time, but it was plagued by both internal injustice (slavery, lack of women's rights) and external aggressiveness (culminating in the disastrous Peloponnesian war against Sparta). Similarly, the United States is, relatively speaking, a great place to live today (though several European nations actually do better by a variety of civil libertarian standards), but it is still plagued by injustice (poverty, limited gay rights) and perennially involved in warfare (the history of the United States is characterized by a remarkable sequence of external aggressions, which rarely – if ever – resulted in “spreading democracy” abroad).

In the documentary, Chomsky says that what needs to be done is to provide people with “a course in intellectual self-defense.” In our society people are isolated from each other, getting their news from a small number of media outlets increasingly controlled by an even smaller number of international corporations, an ideal ground for apathy and social stagnation. Which, of course, is exactly the way the powers that be want it (Chomsky has been accused of being a conspiracy theorist, but it seems to me that one doesn't need to imagine actual meetings in dark smoky rooms to realize what the concentration of economic and informational power is doing to our society). What is required is for people to get out, join organizations, read alternative media, and most importantly discuss things – all part of building intellectual self-defense tools. Enlightenment comes from confronting ideas with others, engaging in a continuous feedback (as David Hume put it, “truth springs from argument amongst friends”). As Chomsky says, “sure, the other stuff [meaning information not filtered by the major media] is there, but you have to find it,” and it is unreasonable to ask everyone to go home after a long day at work and suddenly turn into investigative reporters attempting to figure out how things really are.

As I said, I'm not sure I'm prepared to go all the way with Chomsky as far as his view of what a good society would be like. I guess I'm a bit too timid to embrace his idea of “social libertarianism,” or “anarcho-syndicalism,” but the documentary I saw was the first time the word “libertarian” didn't prompt me to reach for the gun that I don't own. And that's saying a lot.

I'll give Chomsky the last word: “In this possibly terminal phase of human existence, democracy and freedom are more than just ideals to be valued - they may be essential to survival.”

Friday, May 11, 2007

On free will, once again

Free will is one of those ever-popular topics in philosophy, where everyone seems to have opinions that are as strong as they tend to be unsubstantiated or confused. It's hard even to wrap one's mind around what exactly, or even approximately, one might possibly mean by “free” will. Free from what?

Well, if you haven't had enough of it yet, may I suggest a handy-dandy summary of the contributions that science is beginning to make to the free will debate? I published such an essay in the latest issue of Skeptical Inquirer, which is actually a commentary on a fascinating New York Times article by Dennis Overbye.

As you'll find out in the SI essay, I think Daniel Dennett is the philosopher who's got the most interesting things to say on the topic, especially in his Elbow Room. But science has begun to butt in, starting with the classic experiments by Libet in the early 1980s, demonstrating that our subconscious makes decisions significantly ahead of our conscious awareness of them, a pretty scary thought in and of itself (I mean, who, exactly, is in charge here?).

You will also find out why I think that any talk of quantum mechanics in relation to the source of free will is nonsense on stilts, and should be avoided at all costs. On the other hand, I suggest in the essay that there is a legitimate role for so-called emergent properties to play in consciousness and, therefore (?) in free will, once we agree on a non-mystical conceptualization of emergence.

Monday, May 07, 2007

Yet another nail in the Intelligent Design coffin

The transition from swimming to walking, which happened around 385 million years ago and was one of the pivotal moments in animal evolution, has been somewhat of a mystery for evolutionary biologists, thereby offering the usual (and trite) opening to creationists, seemingly oblivious to the fact that their positions are based on nothing more than an argument from ignorance (often their own).

But as is often the case in science, new research suddenly throws light on an old mystery. It is what happened recently with the publication of a paper by Auke Ijspeert and colleagues at the Swiss Federal Institute of Technology in Lausanne (Nature, 9 March 2007). The group has built a robotic model of a salamander to test a daring hypothesis about the neurological basis of the switch from swimming to walking that was a necessary part of the evolutionary transition from sea to land.

The researchers focused on a particular neural network, the central pattern generator, that causes rhythmic muscle movement along the body when activated. In lampreys and salamanders this triggers waves of body contractions, causing the animal to swim. The neat piece of the puzzle here is that lampreys (which don't walk on land) only have one such network, while amphibians (which can walk) have a second one in charge of limb movement.

In a previous study, Jean-Marie Cabelguen's group at the University of Bordeaux located the region of the midbrain of salamanders that triggers neuronal firing in both networks, and discovered that the intensity of the stimulation is a direct predictor of the animal's behavior: when the networks are firing at low intensity, the salamander walks; turn up the dial, and it walks faster; turn it even more and some of the nerve cells shut down, the walking stops, and the body begins undulating movements appropriate for swimming!

The group led by Ijspeert then developed a mathematical model of the process, and built a robot, aptly nicknamed Salamandra robotica (see photo), to test it. It worked beautifully, precisely mimicking the behavior of the real thing.
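The intensity-dependent gait switch described above lends itself to a toy simulation. The sketch below is purely illustrative: the thresholds and the simple if/else logic are my assumptions for the sake of the example, not the actual mathematical model from the Nature paper, which involves coupled oscillator networks.

```python
def salamander_gait(drive: float) -> str:
    """Toy model of the salamander's two central pattern generators (CPGs).

    `drive` stands for the stimulation intensity coming from the midbrain.
    The threshold values are illustrative assumptions, not data from
    Ijspeert et al.
    """
    WALK_THRESHOLD = 1.0   # below this, too little drive: no movement
    SWIM_THRESHOLD = 3.0   # above this, the limb CPG shuts down

    if drive < WALK_THRESHOLD:
        return "resting"
    elif drive < SWIM_THRESHOLD:
        # Both the body and the limb networks are active: the animal walks,
        # and walks faster as the drive increases
        speed = (drive - WALK_THRESHOLD) / (SWIM_THRESHOLD - WALK_THRESHOLD)
        return f"walking (relative speed {speed:.2f})"
    else:
        # The limb network has shut down; the body network alone produces
        # the traveling wave of contractions used for swimming
        return "swimming"

# Turning up the "dial" switches the gait, as in the salamander experiments
for drive in (0.5, 1.5, 2.5, 3.5):
    print(drive, "->", salamander_gait(drive))
```

The point of the toy version is only to show how a single continuous control signal can produce a discrete behavioral switch, which is what makes the evolutionary story so economical: no new control architecture is needed, just a second network and a threshold.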

Of course, creationists will entirely miss the point, arguing for example that Salamandra was “intelligently designed” by human beings – thereby displaying a complete misunderstanding of the workings of science (all scientific experiments are “intelligently designed,” but that doesn't mean they can't tell us anything about nature). Alternatively, they will complain that other changes must have happened as well during the time of the transition from sea to land, for example the ability to withstand longer and longer periods away from water. Again, this misses two crucial aspects of evolutionary theory: it does not require that all changes happen simultaneously, and it does predict the existence of variation in the characteristics and behaviors of living organisms, enough, for example, to ensure that some proto-amphibians were better than others at dealing with the stress imposed by the novel environment.

Then again, as we all know, creationism and so-called intelligent design aren't really about reason and evidence. There is no scientific controversy about evolution in the scientific community, there are only people who don't understand science or who put their faith ahead of any possible fact. The funny thing about that, however, is that the same people then turn around and wish to use reason to back up their faith. Could it be that, deep down, they are actually insecure about their simplistic worldview?

Wednesday, May 02, 2007

Between the Scylla of moral absolutism and the Charybdis of moral relativism

The other night I gave a presentation on science and religion to the Cafe Scientifique in New York City, using Richard Dawkins' book, The God Delusion, as a starting point for the discussion. After the event, I had dinner with the organizers and some of the attendees. Most of the dinner conversation with one of my table neighbors was about the opposite evils (I'm using the term loosely here) of moral absolutism and moral relativism.

Absolutism, in this context, is the idea that there is only one set of moral precepts, it is universal, and it applies everywhere, to everyone, and under any circumstance. It is the sort of idea that has bred 19th century colonialism and 20th century fascism and communism. Not a pretty sight to behold.

Moral relativism, as applied to different cultures, is much more recent, being mostly a late 20th century phenomenon. But it isn't much less pernicious than its antipodal predecessor: the basic idea is that “anything goes,” any cultural practice, no matter how repellent (think genital mutilation), has to be respected because, you know, who are we to think ourselves superior to other people?

Well, let us make no mistake about it: a culture that (at least as an ideal) respects people's freedom of speech, strives to give women and minorities equal rights, and minimizes physical harm and emotional pain for its members is indeed superior to most alternatives produced by humanity in millennia of history, in most places in the world.

How arrogant, you may say. Not at all. My claim simply derives from the Aristotelian observation that human beings – if given a chance – want to be able to pursue whatever it is that allows them to flourish, and that usually boils down to freedom of action and thought, and avoidance of pain and suffering. It's that simple, and anyone seriously doubting this is not well acquainted with the basics of human nature. So, yes, modern cultures that subjugate women and practice genital mutilation or infanticide are barbaric, and it is morally compelling for the rest of us to help them out of that sorry state of affairs.

That said, however, one needs to steer well clear of the opposite pole, attempting to impose a rigid and universal standard forged out of the idiosyncrasies of one's own (usually Western) cultural history. So, for example, it is sheer nonsense to talk about immorality when it comes to the varieties of sexual behavior among consenting adults, pre-marital, after-marital or instead-of-marital, as the case may be.

And it is this twin problem, this ethical version of the quintessential dilemma personified by the ancient mythical monsters of Scylla and Charybdis, that faces modern open societies. The very idea of an open society means that we ought to (as in morally should) be tolerant of different viewpoints and customs. But we cannot be tolerant of intolerance. We cannot work toward a society of equal rights while at the same time welcoming people who actively deny rights to women and minorities because “it is their culture.” It may be their culture, but it is wrong, and shame on us if we don't have the guts to call it as we see it (and as it really is).

(Note: the original post had the word "multiculturalism" instead of "moral relativism," but several readers have pointed out that the latter is really a more appropriate term for my target here.)