About Rationally Speaking

Rationally Speaking is a blog maintained by Prof. Massimo Pigliucci, a philosopher at the City University of New York. The blog reflects the Enlightenment figure Marquis de Condorcet's idea of what a public intellectual (yes, we know, that's such a bad word) ought to be: someone who devotes himself to "the tracking down of prejudices in the hiding places where priests, the schools, the government, and all long-established institutions had gathered and protected them." You're welcome. Please note that the contents of this blog may be reprinted under the standard Creative Commons license.

Thursday, July 07, 2011

The fallacy of difference, in science and art

by Julia Galef

It’s not often that you find something that’s a fallacy both logically and creatively — that is, a fallacy to which both researchers and artists are susceptible. Perhaps you’re tempted to tell me I’m committing a category mistake, that artistic fields like fiction and architecture aren’t the sort of thing to which the word “fallacy” could even meaningfully be applied. An understandable objection! But let me explain myself.
I first encountered the term “fallacy of difference” in David Hackett Fischer’s excellent book, Historians’ Fallacies, in which he defines it as “a tendency to conceptualize a group in terms of its special characteristics to the exclusion of its generic characteristics.” So for instance, India’s caste system is a special characteristic of its society, and therefore scholars have been tempted to explain aspects of Indian civilization in terms of its caste system rather than in terms of its other, more generic features. The Puritans provide another case in point: “Only a small part of Puritan theology was Puritan in a special sense,” Fischer comments. “Much of it was Anglican, and more was Protestant, and most was Christian. And yet Puritanism is often identified and understood in terms of what was specially or uniquely Puritan.”
Here’s a less scholarly example from my own experience. I’ve heard several non-monogamous people complain that when they confide to a friend that they’re having relationship troubles, or that they broke up with their partner, their friends instantly blame their non-monogamy. But while non-monogamy certainly does make a relationship unusual, it’s hardly the only characteristic relevant to understanding how a relationship works, or why it doesn’t. Non-monogamous relationships are subject to the same misunderstandings, personality clashes, insecurities, careless injuries, and other common tensions that tend to plague intimate relationships. But the non-monogamy stands out, so people tend to focus on that one special characteristic, and ignore the many generic characteristics that can cause any kind of relationship to founder.
So the fallacy of difference is a fallacy of science (broadly understood as the process of investigating the world empirically) but how is it also a fallacy of art? Because artists, like scientists, are concerned with understanding the world, though their respective goals are different. While scientists want to model the world accurately in order to answer empirical questions, artists want to make us believe in the story they’re telling us or the scene they’re showing us, or to highlight some feature of the world that they find particularly beautiful or interesting, or to successfully provoke a desired reaction. That tends to require a pretty sophisticated understanding of how the world looks and acts and feels.
When they fail, it’s often the fallacy of difference at work. In novels, TV shows and movies, a flat, “one-dimensional” character is a telltale sign of a clumsy writer who focused on his character’s one or two special traits at the expense of all the generic traits common to most human beings. It’s an easy trap to fall into, because it’s such a straightforward template for creating a character: you start with one or two unique traits — “She’s the rebel!” or “He’s the funny one!” — and then whenever your character has to react to some situation you can just ask yourself, “Okay, how would a rebel react here?” or “What would a funny guy say to that?” But no one is a rebel or a clown full-time. Most of the time, they’re just a person.
Same goes for building a fictional setting, which in many comic books or movies functions a lot like a character in its own right. It’s certainly true that real cities have distinct flavors to them, just like people have distinctive personalities, so if you’re walking in Brooklyn, it feels unmistakably different from walking in Manhattan, or Baltimore, or San Francisco. There are characteristic building styles and features that define a city’s aesthetic, like Baltimore’s row houses, or Brooklyn’s brownstones. So it’s tempting to design your fictional city around some special aesthetic theme, like “futurism” or “noir.” But even the most futuristic of cities wouldn’t really only consist of sleek skyscrapers and helipads, and even the most sordid and noirish city wouldn’t really be all dark alleyways and disreputable bars. Like all cities, their special features should be offset against all the generic ones: the nondescript office buildings, bus stops, grocery stores, laundromats, and so on. (Or whatever their equivalents are, in your fictional universe.)
If you’re building a real city instead of imagining one, the fallacy of difference comes into play in a different way. Architecture is really more design than art, in that each building is supposed to provide a solution to a particular problem — e.g., “We need a place to educate our children,” or “We want an office building that encourages interdepartmental interactions.” So the temptation for architects is to focus on the special characteristics their building should have to solve that problem, at the expense of the generic characteristics that all buildings need in order to be comfortable and pleasant. Bryan Lawson’s The Language of Space contains a thoughtful discussion of this trap, though he doesn’t explicitly call it the fallacy of difference: “When architects come to design specialized buildings, such as a psychiatric unit, they tend to focus on the special factors rather than the ordinary ones,” he says. “We design lecture theaters with no windows as perfectly ergonomic machines for teaching, and then forget how unpleasant such a place might be for the student who is there for many hours, day after day.”
The fact that this fallacy pops up in creative pursuits as well as empirical ones is interesting in its own right, I think, but it’s particularly worth noting as a reminder to fight our tendency to compartmentalize what we learn. You’ve seen this before, no doubt, on a smaller scale. For example, most people who ace the logic problems on the LSAT will, after leaving the classroom, blithely make the same kinds of arguments they easily identified as fallacious when they were in “spot the fallacy” mode. But concordances spanning endeavors as seemingly dissimilar as art and science suggest the existence of even broader compartments — and the benefit of noticing them, and breaking them down.


  1. I think there's a rather different analysis you could make here, drawing on the theory of communication proposed by Claude Shannon (1948), which quantifies information as the degree to which something is surprising. (Note also that Shannon's theory applies equally well to the sciences and to artistic spheres.) The fallacy of difference, in his terms, might be said to exclude everything but the most "informative" (surprising) parts of the signal. While this does indeed bias things, as an approximation it's pretty good, since you're picking out the most informative part of the signal. Still, you're just looking at a signal here, not the real thing, so maybe this just comes down to the model-vs.-reality confusion people are so susceptible to.

  2. Your essay mirrors findings in developmental psychology. I've been reading essays in The Cambridge Handbook of Expertise and Expert Performance and noticed an important current in the field: the relevance of abstraction. High performers tend to think more abstractly about their work than do weaker performers. First-rate software engineers, for example, tend to think first and foremost about the purpose of their software and about broad-scale design issues, while lesser engineers tend to dive into coding too quickly and take a more aimless approach, trusting that they'll reach their goal eventually.

    This also goes along with the arguments of Thomas & Turner in their writing manual Clear and Simple as the Truth. In the introduction, they say that "Neither conversation nor writing can be learned merely by acquiring verbal skills, and any attempt to teach writing by teaching writing skills detached from underlying conceptual issues is doomed." Note the phrase "conceptual issues": the authors stress that a writing style is defined by its stands on truth, writer, reader, thought, language, and their relationships--not by things like comma usage, dialect, or word choice. The value of their thinking is apparent in their presentation: it is the only guide to prose writing I have myself found to be well-written and enjoyable.

  3. If I understand 'fallacy of difference' correctly (I have not read the book), it seems to be what underlies regionalism, racism, partisanship, etc. It seems to describe what has happened in the U.S. in the 2000s, with apparent polarization in politics; reinforcement by the media can exacerbate the problem.

  4. I do not have much to say about the post's thesis, but MIT's Stata Center is a visual monstrocity and its artificial randomness and designed disorder give me nightmares.

  5. “Only a small part of Puritan theology was Puritan in a special sense,” Fischer comments. “Much of it was Anglican, and more was Protestant, and most was Christian. And yet Puritanism is often identified and understood in terms of what was specially or uniquely Puritan.”

    Weeeell... seeing as how they originated in a society that was predominantly Christian and specifically Anglican, it is a good bet that they defined themselves mostly on the basis of what made them different from the generic Christian, and most conflicts with Anglicans would grow out of those special characteristics. So in this case focusing on their special characteristics may not be a fallacy, but the most reasonable approach. Don't commit the fallacy fallacy!

    Because artists, like scientists, are concerned with understanding the world

    Really? Those who actually want to paint or sculpt naturalistically cannot help but try to understand nature, simply to emulate it better. But I am entirely unconvinced that modern art is not simply the art of bullshitting your audience into thinking you are oh so deep so that you earn some money off them. The same goes for fancy architecture, although admittedly they at least have to make sure that the building does not collapse.

    I think this idea works best the way you apply it to poor storytelling.

  6. A pet peeve of mine is repeated appearance of this fallacy in the analysis of politics. Many otherwise well-informed people insist on explaining the outcome of each presidential election in terms of its unique features - "Bush broke his no new taxes promise," "Dukakis was too unemotional when asked about the hypothetical murder and rape of his wife," "Johnson successfully scaremongered the nation about Goldwater." These explanations are not false, but they are superfluous. Election outcomes can be predicted quite accurately in terms of their common features by, for example, Douglas Hibbs' Bread and Peace Model. In particular, the elections of 1992, 1988, and 1964 I mentioned above all lie very close to the trend.

  7. This fallacy often leads to a fallacy of division as well. If "Puritanism" is X then every Puritan must be X...

  8. Massimo, it's very strange to me to think of flat characters as being defined by difference. To me the flatness of a character is roughly proportional to the degree to which that character resembles other characters of the same type. "The rebel" and "the funny one," in other words, are dominated by the generic aspects of their characters. Perhaps you're really thinking not of stock characters, but of caricatures. That would make a lot more sense to me. Caricatures are certainly dominated by the most distinctive features of their subjects -- suddenly I'm thinking of Cyrano de Bergerac's nose.

    But then I would have to argue that caricature is one of the cornerstones of art! Certainly if your aim is realism, then you might want to avoid obvious caricatures, but (first) not all good art aims for realism, and (second) even realist art needs to strike an appropriate balance between the generic and the characteristic.

    I think the second point is important, but I want to emphasize the first here. Consider Candide -- I don't think Voltaire's aim was realism. Candide is a wonderful work of fiction, and it's full of caricatures. The same could be said for Gulliver's Travels, Hard Times, Huckleberry Finn... and so on. Caricature is so closely related to the way we perceive the world that art cannot easily eschew it; consider, for example, the results that pop up for a Google search for caricature and face recognition.

    This is not to say that caricature cannot be abused or used awkwardly in works of art; but I think its use is not automatically an aesthetic fallacy. I would, perhaps, turn the argument around, and say that the problem arises when caricatures are misread and mistaken for veridical portraits.

  9. Julia, apologies for calling you Massimo. I misinterpreted the "posted by Massimo Pigliucci" line at the bottom of the post as a byline.

  10. I wonder how much of this fallacy can be attributed to a simple neglect of detail. That is, perhaps we can't be bothered to imagine the many more ordinary features of something with an extraordinary feature.

  11. I don't think that this is a fallacy at all, in the sense that fallacy = always false. If X = a & b & !c, and Y = !a & b & c, then we can understand X either in terms of its commonality with Y, or its difference from Y. If we think we already understand the commonality, then the most interesting feature is its difference.

    So the fallacy appears to be false, itself.

  12. @Stan - here I think the fallacy is how the obvious or glaring differences tend to mask the 'differences that make a difference' in a historical analysis or the point of a work of art.

    Using your example and the sociological one above, I read this post as a complaint that people from culture X (EverybodyKnowsA, b, !c) explain culture Y (!EverybodyKnowsA, b, c) by focusing on EverybodyKnowsA instead of weighing EverybodyKnowsA and 'c' equally, or perhaps properties 'd' and 'e' that are not measurable in culture X.

  13. Reminds me of this David Simon interview:


    Salient quote:

    "My standard for verisimilitude is simple and I came to it when I started to write prose narrative: fuck the average reader. I was always told to write for the average reader in my newspaper life. The average reader, as they meant it, was some suburban white subscriber with two-point-whatever kids and three-point-whatever cars and a dog and a cat and lawn furniture. He knows nothing and he needs everything explained to him right away, so that exposition becomes this incredible, story-killing burden. Fuck him. Fuck him to hell."

  14. Because artists, like scientists, are concerned with understanding the world,

    I disagree. I think artists do what they do in order to express something they feel needs to be expressed. That 'something' need not be anything leading to understanding and need not concern our world at all.

    This is aside from the rather thorny problem of trying to demarcate the boundary between what is art and what is not. Perhaps Massimo will have a go at that in his next book having gained a proficiency in cleaving after dividing science from nonsense.

  15. Another example is that when a non-religious person is having some sort of emotional difficulty, a religious person will often say it's because "you don't have God in your life".

  16. "in the sense that fallacy = always false"

    A fallacy means the reasoning is incorrect, not that the conclusion is false. It's essentially the same as the "post hoc" fallacy.

  17. "But I am entirely unconvinced that modern art is not simply the art of bullshitting your audience into thinking you are oh so deep so that you earn some money off them."

    I just saw the movie "Exit Through the Gift Shop," in which the ending makes this point nicely. Although I have to say that it's unfair to paint with such a wide brush... when art overlaps with fashion/trends/$$ it can get pretty bad.

    Just like music has its "top 40," and literature has its crappy novels, other forms of art have their lowbrow forms for the casual consumer.

  18. "monstrocity" is a good coinage.
