About Rationally Speaking


Rationally Speaking is a blog maintained by Prof. Massimo Pigliucci, a philosopher at the City University of New York. The blog reflects the Enlightenment figure Marquis de Condorcet's idea of what a public intellectual (yes, we know, that's such a bad word) ought to be: someone who devotes himself to "the tracking down of prejudices in the hiding places where priests, the schools, the government, and all long-established institutions had gathered and protected them." You're welcome. Please note that the contents of this blog can be reprinted under the standard Creative Commons license.

Monday, November 05, 2012

Risk and blame


[Image: Ruins after the L'Aquila earthquake (source: http://i.telegraph.co.uk)]
by Ian Pollock

When six Italian scientists and civil servants were put on trial and then convicted of multiple manslaughter on October 22, 2012 (with a sentence of six years in prison each), the reaction from the scientific community was swift and for the most part negative. The defendants, who had been part of a public advisory board on safety risks, had been charged with inducing a false sense of security among the residents of L’Aquila (which had recently experienced strong tremors), causing many residents to stay at home, where approximately 300 were killed by a terrible earthquake in April 2009.

The issue is not as simple as the American Association for the Advancement of Science seemed to believe when it wrote a letter to the Italian President in June 2010. In it, the AAAS claimed that the basis of the indictment was that the scientists failed to alert the population of L’Aquila to the impending disaster. This is not quite right: the charge was not of insufficient predictive ability, but of culpably poor risk communication. At a public meeting in March 2009, a large earthquake was called “unlikely” but not excluded as a possibility; afterward, however, official Bernardo De Bernardinis assured the public that there was “no danger,” and that the tremors were good inasmuch as they released pent-up tectonic energy, agreeing with a reporter’s offhand suggestion that residents should crack open a bottle of wine.

This is not as misleading as it sounds, for although around half of all large quakes are preceded by minor foreshocks [P(tremor|large quake)], a minor earthquake is followed by another at least 1.0 magnitude unit larger only 1-3% of the time [P(large quake|tremor)] — which is the relevant statistic for the situation. Nonetheless, tremors are positive (if very weak) Bayesian evidence in favor of an impending larger quake: this does look like poor science and poor science communication. We do not know how many people died specifically because of the scientists’ downplaying of the risks, but it’s at least plausible that it’s a substantial fraction of the 300. (Although for some perspective, consider that no one has been charged in relation to poor building standards in the town, despite their much more proximate causal link to the deaths.)
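To make the asymmetry between those two conditional probabilities concrete, here is a minimal Bayes' theorem sketch in Python. The prior and the tremor rates below are rough illustrative assumptions chosen to be consistent with the figures quoted above, not actual seismological estimates for L'Aquila.

```python
# A toy Bayes' theorem calculation: how much should a minor tremor
# raise our credence in an imminent large quake? All numbers are
# illustrative assumptions, not real seismological data.

p_large = 0.001                 # assumed prior: P(large quake soon)
p_tremor_given_large = 0.5      # ~half of large quakes are preceded by foreshocks
p_tremor_given_no_large = 0.05  # assumed rate of minor tremors absent a large quake

# Total probability of observing a tremor
p_tremor = (p_tremor_given_large * p_large
            + p_tremor_given_no_large * (1 - p_large))

# Posterior: P(large quake | tremor)
p_large_given_tremor = p_tremor_given_large * p_large / p_tremor

print(f"P(large quake)          = {p_large:.4f}")
print(f"P(large quake | tremor) = {p_large_given_tremor:.4f}")
# The tremor raises the probability roughly tenfold, yet the posterior
# still sits around 1%: weak positive evidence, not an alarm bell.
```

With these made-up numbers the posterior lands near the 1-3% range cited above, which is why "weak positive evidence" and "no danger" are very different claims.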

Even granting the charge of poor science communication, the judgment of six years for manslaughter is obviously extreme, not to mention encouraging of perverse incentives. When a Type I error leads to some annoyed residents, and a Type II error leads to half a dozen years in jail, count on future risk experts to always cover their asses by overplaying every danger, no matter how unlikely. This, of course, leads to the public discounting experts, and then ignoring serious counsel they should heed. As Daniel Kahneman notes in “Thinking, Fast and Slow,” increased accountability is a mixed blessing: it leads to the extinction of useful signals from experts, and their replacement with weasel words.
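A quick expected-cost sketch shows why this asymmetry pushes an expert toward always warning. The probabilities and costs below are hypothetical placeholders chosen only to illustrate the incentive structure, not estimates of any real case.

```python
# Toy model of the expert's personal incentive problem.
# Costs are to the expert, in arbitrary units; all numbers are hypothetical.

p_disaster = 0.02            # assumed probability that a disaster actually follows

cost_false_alarm = 1         # Type I error: annoyed residents, mild embarrassment
cost_missed_disaster = 1000  # Type II error: prosecution, prison, ruined career

# Expected personal cost of each blanket policy:
expected_cost_if_warn = (1 - p_disaster) * cost_false_alarm    # pay only when no disaster occurs
expected_cost_if_reassure = p_disaster * cost_missed_disaster  # pay only when the disaster hits

print(f"Always warn:     expected cost = {expected_cost_if_warn:.2f}")
print(f"Always reassure: expected cost = {expected_cost_if_reassure:.2f}")
# With this payoff structure the expert minimizes personal risk by warning
# every time, even when the probability of disaster is very low.
```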

There are lots of people in the world to feel much sorrier for, but I do often feel pretty sorry for the people in charge of communicating risk to the public and taking actions to mitigate it. In addition to the difficulty they have in predicting and preparing for the future (which obviously varies depending on what is being predicted), they face several huge cognitive problems with their audience:
  • Neglect of probability: “Don’t give me the odds! Is it safe or not?”
  • Denial of personal responsibility: “Nobody informed us of the risks of waterfall kayaking.”
  • Bad-guy bias: “My son died on the operating table, and someone needs to be held responsible!”
  • Hindsight/Outcome bias and moral luck: “How can you say it was the right call based on what you knew, when forty families are grieving!”
  • Moral grandstanding: “No risk to our children is acceptable!”
  • Crying-probable-wolf effect: “The last two evacuations were false alarms. I’m not going anywhere.”
Once you are made aware of these phenomena, you start noticing the pattern all over the place.

Kahneman relates the following anecdote:

“The worse the consequence, the greater the hindsight bias. In the case of a catastrophe, such as 9/11, we are especially ready to believe that the officials who failed to anticipate it were negligent or blind. On July 10, 2001, the Central Intelligence Agency obtained information that al-Qaeda might be planning a major attack against the United States. George Tenet, director of the CIA, brought the information not to President George W. Bush but to National Security Adviser Condoleezza Rice. When the facts later emerged, Ben Bradlee, the legendary executive editor of The Washington Post, declared, ‘It seems to me elementary that if you’ve got the story that’s going to dominate history you might as well go right to the president.’ But on July 10, no one knew — or could have known — that this tidbit of intelligence would turn out to dominate history.”

This would almost be funny if it weren’t so malicious, as here the usual blind spots are amplified by political point-scoring.

The biases affecting risk assessment and the assignment of blame are now well publicized, for example by Kahneman, as well as by Dan Gardner in his excellent book “Risk.” However, we have to take this knowledge out of the realm of books on cognitive psychology and apply it wisely in real-world cases. It is important to note that being made aware of the possibility of, for example, hindsight bias only reduces it very slightly. This makes me wonder whether even my fairly exculpatory response to the earthquake case is still too hawkish. Had no quake occurred, would I have judged De Bernardinis’ misleading statement as anything more than a minor slip of the tongue? (Other commentators fell more spectacularly for hindsight, pointing to the disastrous quake of 1703 as evidence that the group should obviously have known better.)

I think the lesson from this is that we ought to cut the people in charge of controlling public risks, whether they are employed in the public or private sector, a great deal more slack. Our prior for culpable negligence on their part is always going to be overinflated in the aftermath of a disaster that actually did happen. We should be especially slow to judge when we or someone we know has been affected, or when our political views already predispose us against the decision makers.

One of the more powerful results of decision theory is that pretty much all decisions can be modeled as gambles. Many of those gambles have stakes that are measured in human lives — including banal decisions such as whether to go to work while you have the flu. At the level of public policy, the stakes are necessarily much higher, the uncertainty is usually greater, and alas, by its nature the gamble cannot simply be avoided.

Not only that, but just like private citizens, decision makers dealing with risk have to consider factors besides risk to human life, such as “mere” convenience or “mere” money. Although private citizens are more than happy to grandstand about the infinite value of human life (and then go out driving in icy weather because they really feel like a cheap donair), policy makers are obliged to weigh, for example, the economic impact of an evacuation against its potential to save lives. These are agonizing but necessary choices.
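As an illustration of the gamble framing from the previous two paragraphs, here is a minimal expected-value comparison for an evacuation decision. Every figure (the probability, the death toll, the evacuation cost, and the dollar value placed on a statistical life) is a hypothetical placeholder; a real analysis would be far more involved.

```python
# Toy expected-value comparison for "evacuate vs. stay", treating the
# decision as a gamble. All figures are hypothetical placeholders.

p_disaster = 0.01                 # assumed probability the feared event occurs
deaths_if_no_evacuation = 200     # assumed fatalities if it hits an unevacuated town
value_per_statistical_life = 7e6  # assumed dollar value of a statistical life
evacuation_cost = 20e6            # assumed economic cost of evacuating

expected_loss_stay = p_disaster * deaths_if_no_evacuation * value_per_statistical_life
expected_loss_evacuate = evacuation_cost  # paid with certainty, assuming evacuation prevents the deaths

print(f"Expected loss if we stay:     ${expected_loss_stay:,.0f}")
print(f"Expected loss if we evacuate: ${expected_loss_evacuate:,.0f}")
# Under these assumptions, staying is the cheaper gamble in expectation,
# yet if the disaster does occur, hindsight will make the very same
# decision look like negligence. That is the outcome-bias trap.
```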

So let us think once, twice, and three more times before we demand somebody’s scalp because something really bad happened. Especially if it “could have been prevented.”

5 comments:

  1. I have to disagree. I think the current problem is too little accountability.

    This courtroom case is the exception, not the rule.

    "Nature" reports that:
    "Bernardo De Bernardinis, then deputy director of the Civil Protection Department, said, 'the scientific community tells me there is no danger because there is an ongoing discharge of energy'. Most seismologists, including several of the indicted, consider this statement to be scientifically incorrect."

    Now, saying "there is no danger" is a far cry from "cognitive problems with the audience" regarding probability. Saying "there is no danger" is not "misleading" -- it is dead wrong (as subsequent events graphically illustrated).
    It is also unscientific.

    Your point about Type I error is well taken. But, if false negatives represent the "Scylla" and false positives represent the "Charybdis", I think we can move away from "Scylla" without falling into "Charybdis". We can find a happy medium.

    I agree that the punishment in this case is too extreme, but generally speaking, I would like to see more accountability and more consequences for public officials who screw up.

  2. Experts will not merely be careful to cover their asses. They will decline to render an opinion. Who would do so at the risk of criminal (and presumably civil) liability? Why agree to be a "public advisor" in such circumstances? You will be punished if you're wrong and receive no reward if you're right. This approach is an effective way of ensuring that no sensible person will become involved in government.

    Replies
    1. @Ciceronianus:

      Ian presents this as the dilemma: hold people accountable, and no one will dare say anything.

      I don't think it's as bad as that. I think there is a happy medium between allowing people to get off scot-free for being irresponsible, and severely punishing some poor soul for being unable to predict the future. The "reasonable man rule" should be used to differentiate between an honest mistake that anyone could have made... and sheer incompetence.

      It's not as if we have to go knocking on everyone's door begging someone to take over a position of power and privilege in government. People flock to these jobs -- we can afford to be demanding. I think it is a GOOD thing for public officials to be a little worried about possible criminal and civil penalties.

      Recent history has shown that lack of accountability is the larger problem.

    2. I'm a lawyer, and I suppose I should delight in litigation's increase. But the errors of government and its representatives, when actionable and not covered by insurance, will be paid for from public funds. Holding public or private employees individually liable for negligence in the course of their employment (instead of their employers, as is now the case) would require too great a change in the law in any event, so this discussion is largely academic.

  3. Regarding the lack of accountability, a recent federal appeals court decision has found that:

    "two American citizens cannot sue former Defense Secretary Donald Rumsfeld over allegations that they were tortured by the U.S. military in Iraq...".

    This decision creates "...immunity for all members of the military, in violation of Supreme Court precedent".

    "This new absolute immunity applies not only to former Secretary Rumsfeld but to all members of the military, including those who were literally hands-on in torturing these plaintiffs...".

    The victims were American citizens, not in the military, who were tortured because they made allegations of corruption, and who now have no recourse against those who tortured them.

    This type of decision is much more common than Ian's earthquake court case.

    See http://www.reuters.com/article/2012/11/07/us-usa-court-rumsfeld-idUSBRE8A62DJ20121107
