I’m at a conference organized by the Science Communication Project at Iowa State University, entitled Between Scientists and Citizens: Assessing Expertise in Policy Controversies. I will be blogging from some of the sessions, to give you a flavor of what the conference is about (there are four parallel sessions, and unfortunately I haven’t developed the ability to be in multiple places simultaneously — working on it, though!). I will be giving one of two keynote talks at the conference (my title: Nonsense on stilts about science, field adventures of a scientist-philosopher — I will post it over at PlatoFootnote soon).
So let’s get started with Deserai Crow and Richard Stevens (University of Colorado-Boulder) on “Framing science: The influence of expertise and ambiguity in media coverage.” Recent studies suggest that Americans are increasingly interested in, but also increasingly uninformed about, science. They naturally get their science information largely from the media, and media organizations treat science coverage as a niche or beat subject. Interestingly, only about 3% of journalists covering science actually have a science or math background; most majored in communication fields.
Expertise is often thought of in terms of skills, but within the context of science communication it really refers to authority and credibility. Expertise is communicated at least in part through the use of jargon, with which of course most journalists are not familiar. Jargon provides an air of authority, but at the same time the concepts referred to become inaccessible to non-specialists. Interestingly, journalists prefer sources that limit the use of jargon, but they themselves deploy jargon to demonstrate scientific proficiency.
The authors suggest that there are two publics for science communication: one that is liberal, educated and with a number of resources at its disposal; the other with less predictable and less-formed opinions. The authors explored empirically (via a survey of 108 Colorado citizens) the responses of liberal and educated people to scientific jargon by exposing them to two “treatments”: jargon-laden vs lay terminology news articles. The results showed that scientists were considered the most credible sources in the specific area of environmental science (94.3% agreed), followed by activists (61.1%). The least credible were industry representatives, clergy and celebrities. (Remember, this is among liberal educated people.) Interestingly, the use of jargon per se did not increase acceptance of the news source or of the content of the story. So the presence of scientific expertise matters, but the presence of actual scientific details in the story does not.
It is highly unfortunate that Crow and Stevens didn’t administer the same survey to the second type of public they identified, so their results remain preliminary. Apparently that part of the study is being carried out now.
The next talk I attended was “Reason, values and evidence: Rational dissent from scientific expertise,” by Bruce Glymour and Scott Tanona (Kansas State University). Widespread public rejection of scientific consensus in the US is often declared to be “irrational” (for instance in books by Chris Mooney). But in fact sometimes rejection of scientific claims is not irrational. Science denial can be a rational response to information which, if accepted, would induce a conflict in core values. The idea is that values underwrite all notions of rationality, but there is no theory of rationality to decide fundamental values. Indeed, trust in logic, rational choice and science can themselves be understood as values.
Consider a decision of whether to carry an umbrella with you given a certain probability of rain. Different people will fill the corresponding decision matrix differently (depending, for instance, on how much they dislike carrying umbrellas around, or getting wet, and so on). It’s not at all the case that there is one rational way to construct the matrix.
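The point can be made concrete with a minimal sketch (all utility numbers here are hypothetical, invented for illustration): two agents use exactly the same expected-utility calculation, but because they assign different subjective values to the outcomes, they rationally reach opposite decisions.

```python
P_RAIN = 0.3  # assumed probability of rain (hypothetical)

def expected_utility(utilities, p_rain):
    """Expected utility of an action, given utilities for (rain, no rain)."""
    u_rain, u_no_rain = utilities
    return p_rain * u_rain + (1 - p_rain) * u_no_rain

def best_action(matrix, p_rain):
    """Pick the action with the highest expected utility."""
    return max(matrix, key=lambda action: expected_utility(matrix[action], p_rain))

# Agent A hates getting wet; carrying an umbrella costs little.
agent_a = {"carry": (10, -1), "leave": (-50, 5)}
# Agent B hates lugging an umbrella around; getting wet is a minor nuisance.
agent_b = {"carry": (2, -20), "leave": (-5, 10)}

print(best_action(agent_a, P_RAIN))  # -> carry
print(best_action(agent_b, P_RAIN))  # -> leave
```

The decision procedure is identical in both cases; only the values filling the matrix differ, which is exactly the speakers’ point that rationality here is instrumental, not value-determining.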
Or take logic itself. It is well known that there are situations in which a given type of logic does not fare well (propositional logic, for instance, doesn’t deal well with Sorites paradoxes). And of course there is a variety of types of logic, and it makes no sense to ask which one is best; it depends on what you wish to use it for.
Same goes for the scientific method. There is no complete account of the scientific method, and again one can choose certain methods rather than others, depending on what one is trying to accomplish (a choice that is itself informed by one’s values). And of course the Duhem-Quine thesis shows that there is no straightforward way to falsify scientific theories (contra Popper).
If there were supernatural causes that interact with (or override) the causes being studied by science, but are themselves undiscoverable, this would lead to false conclusions and bad predictions. Which means that the truth is discoverable empirically only if such supernatural causes are not active. Science cannot answer the question of whether such factors are present, which raises the question of whether we ought to proceed as if they were not (i.e., methodological naturalism).
Is methodological naturalism wishful thinking (since it is not empirically verifiable)? If one’s primary goal is to discover truths about the world that support reliable predictions, then methodological naturalism is rational. But it can be rational to believe without evidence, or even against the evidence, again depending on what your goals are.
The best theories of rationality are instrumental. No theory prescribes core values and goals, but theories can give prescriptions for reaching goals. Such theories include instrumental values. What often happens, both in science and in moral dilemmas, is that several of one’s values come into conflict. A typical response is to deny the facts, which satisfies yet another value: the desire not to prioritize between values.
The authors suggest that the best one can do is to engage in an exercise of reflective equilibrium, which, however, cannot itself tell you which values are more important than others.
The last talk of the first morning was about “Expertise, equivocation, and eugenics,” by John Jackson (University of Colorado-Boulder). The author began by pointing out that historians of science are frustrated by the kind of abstract and formalized models of science developed by philosophers; the latter, however, are frustrated by historians’ detailed contextualization of science that seems to miss the general picture. He asked whether rhetorical argumentation or informal logic can provide a way to bridge the two.
Consider the terms “fit” and “fitness” in evolutionary biology. T.H. Huxley famously gave a technical definition of fitness within the theory of natural selection, though the term was borrowed from previous informal usage in lay language, where it means being in good physical or mental shape. For eugenicists, the problem was the survival of the unfit, so to speak, which of course would be oxymoronic if one uses the term “fit” in the technical sense. According to eugenicist Arthur Balfour, “the feeble-minded” were getting better adapted to their (social) environment, and that had to be changed by government intervention.
The author suggests that from a philosophical standpoint the problem here is simply caused by a fallacy of equivocation, switching back and forth between the technical and the vernacular meanings of “fit.” Charles Reed, another eugenicist, was also equivocating, using the term “fit” in the scientific sense when explaining the problem (claiming the mantle of Darwin for the cause), but switching to the vernacular sense when proposing social policies (to generate certain political and social overtones).
But from a historian’s perspective, eugenics was a scientific research program, a social movement, and a legislative agenda, all rolled into one. For eugenicists the political order was a product of biological race, so that to speak of political institutions was to speak of heredity and vice versa. By the end of the talk, however, I felt like more development of the idea of reconciling the philosophical and historical accounts was needed.
And so we get to the afternoon session, beginning with “The ambiguous relationship between expertise and authority,” by Moira Kloster (University of the Fraser Valley). [Unfortunately, this talk was without slides, and since it was after lunch, I paid less attention to what the speaker was saying...] The author talked about a class she teaches where students enact different roles related to expertise and authority (e.g., a doctor who advises about a cataract operation, a friend who has actually gone through such an operation, etc.). The point of the exercise was to explore the idea that expert advice is insufficient to reach a decision unless one has also had occasion to reflect on what one values about the problem on which the expert is advising.
The author asks whether, for instance, a nutritionist — qua expert — has the ability to enforce a better diet in a number of particular situations. The answer is no: in a hospital context, things would also depend on, for instance, the costs associated with different recommendations; in a political context (e.g., about vending machines in schools) there will be issues of cost as well as public reaction and so forth. So the expert’s authority will need to be negotiated in a broader context than just his particular area of expertise.
The suggestion is to bring in a different kind of expert, similar to a business coach (who does not have expertise in business, but coaches CEOs about decision making and communication). This would be, then, an individual whose role would be to advise people on how to make decisions, including taking into consideration the advice of experts.
Final talk of the day (well, before my keynote): “The ethos of expertise: How social conservatives use scientific rhetoric,” by Jamie McAfee (Iowa State). The paper [no slides!] focused on James Dobson’s Focus on the Family organization. Dobson is apparently well known for the use of “therapeutic rhetoric” as a base from which to articulate a conservative worldview.
The author based his analysis of Dobson’s influence on cultural theorists Ernesto Laclau and Chantal Mouffe’s Hegemony and Socialist Strategy as well as on Harry Collins and Robert Evans’ Rethinking Expertise, which attempts to describe legitimate expertise and categorizes different kinds of expertise. [I must admit that I am deeply skeptical of Collins’ work, which I find at times bordering on incoherence, like much radical sociology of science. I’m not too keen on post-modern cultural theorizing either, but I have not read Laclau and Mouffe.]
All in all, McAfee claims that Dobson has turned his “expertise” (as a therapist) into political capital, and has given himself permission to explicitly import his ideology (fundamentalist Christianity) into his role as an expert. [Yes, though we may begin by questioning in what sense Dobson is an expert on anything at all, but that would require us to step outside the postmodern / radical sociology framework.]
Well, that’s all for the first day, folks. Part II and conclusion coming soon...