About Rationally Speaking


Rationally Speaking is a blog maintained by Prof. Massimo Pigliucci, a philosopher at the City University of New York. The blog reflects the Enlightenment figure Marquis de Condorcet's idea of what a public intellectual (yes, we know, that's such a bad word) ought to be: someone who devotes himself to "the tracking down of prejudices in the hiding places where priests, the schools, the government, and all long-established institutions had gathered and protected them." You're welcome. Please notice that the contents of this blog can be reprinted under the standard Creative Commons license.

Monday, September 26, 2011

The value of academic scholarship

by Massimo Pigliucci

How do academics justify to society what they are doing and why they should be paid for it? It’s a question that became seriously relevant only in the 20th century, with the rise of the professional academic in both the sciences and the humanities (and especially after WWII for the sciences, which have seen a huge increase in their share of university and national budgets). Before then, much scholarship was done outside universities, and even within them it really didn’t cost much and was often paid for by a prince or other patron for their own amusement and aggrandizement. David Hume, one of the most influential modern philosophers, never held an academic post, and neither did Darwin. (Again, there are exceptions — just think of Kant and Newton — but that’s what they were, exceptions when compared to the modern version of the academy.)

Indeed, the very idea of a “university” got started in Europe in the 11th century (the first one on record was in Bologna, Italy, quickly followed by Paris, Montpellier, Oxford and Cambridge), and the term initially referred to a guild of itinerant teachers, not a place. By the end of the Middle Ages, however, it started to occur to various municipalities that it was good for business to attract the best teachers on the market, and that a relatively cheap way of doing so was to offer them shelter — both in physical form as a place in which to teach and study and in the more crucial one of (some) protection from Church authorities and their persecution mania (believe it or not, Thomas Aquinas’ writings were considered too hot for public consumption for many years).

Particularly because the phenomenon is so very recent, the question of why we should finance the sciences and humanities with university posts and federal grants is a good one, and should not be brushed aside by academics. Yet it is brushed aside all too often, for instance whenever my colleagues in the sciences tell people that “basic research leads to important applications in unforeseeable ways,” or that whatever they happen to be doing is “intrinsically interesting.”

Let’s start with the first response. Yes, it is easy to come up with a historically more or less accurate anecdote that links basic research to some application relevant to the human condition, though of course only positive examples tend to be trotted out, while the interested parties willfully ignore the negative ones (basic research in atomic physics led directly to Hiroshima and Nagasaki, for instance). But the fact is that I have never actually seen a serious historical or sociological study of the serendipitous paths that lead from basic stuff to interesting applications (this article, focusing on math, does report on some more systematic attempts in that direction, but it still feels very much like cherry picking).

Yet, it seems that an evidence-based community such as the scientific one (the problem of applications doesn’t even arise in the humanities, obviously) would be interested in, and capable of, generating tons of relevant data. What gives? Could it be that the data is out there and it doesn’t actually back up the official party line? Possibly. More likely, the overwhelming majority of scientists simply doesn’t give a damn about the applications of their research (again, the issue isn’t really one that humanists are even confronted with; and besides, have you compared the budgets of a couple of typical philosophy and physics departments recently?). I certainly didn’t care about it when I was a practicing evolutionary biologist. I did what I did because I loved it, had fun doing it, and was lucky enough to be paid to do it (in part, the other parts being about teaching and service). Oh, yes, I too dutifully wrote my “impact statement” for my National Science Foundation grants, in which I allegedly explained why my research was so relevant to the public at large. But the truth is that everybody’s statement of that sort is pretty much the same: disingenuous, very short on details, and usually simply copied and pasted from one grant to another.

Which brings me to response number two: it’s intrinsically interesting. I never understood what one could possibly mean by that phrase other than “it is interesting to me,” which is rather circular as far as explanations go. Perhaps we could get scientists to agree that, say, research on the origin of life is “intrinsically” more interesting than the sequencing of yet another genome, or the study of yet another bizarre mating system in yet another obscure species of insect. But then one would expect much of the research (and funding!) to be focused on the origin of life question rather than on those other endeavours. And one would be stunned to discover that precisely the opposite happens. In fact, as John R. Platt, a biophysicist at the University of Chicago, famously wrote in an extremely cogent article on “strong inference” published in Science in 1964: “We speak piously of ... making small studies that will add another brick to the temple of science. Most such bricks just lie around the brickyard.”

There is a third way to show that what you do is worth the university paying for, one that is increasingly welcomed by bean-counting administrators of all stripes — from the NSF to your own Dean or Provost: impact factors. These days, in order to make a case for your tenure, promotion, or continued funding, you need to show that your papers are being cited by others. Again, the game largely concerns the sciences, since scientific journals, scientific papers, and their consumers vastly outnumber those of the humanities. (I can easily catch up with pretty much everything that gets published in philosophy of biology these days, but the same feat was simply impossible for any human being when my field was evolutionary biology — and the latter isn’t that large a field compared to other areas of biology or science more broadly!)

The problem, of course — as pointed out by Tim Harford in the article about mathematics mentioned above — is that this solves precisely nothing, for a number of reasons. First, impact factors, despite being expressed as numbers, still reflect the qualitative and subjective judgment of people. Yes, these are fellow experts in the relevant scholarly community, which is certainly pertinent; but scientific communities tend to be small and insular, as well as prone to the usual human foibles (such as jumping on the latest bandwagon, citing papers by your friends and avoiding those of your foes like the plague, indulging in a comically absurd number of self-citations, etc.). Second, impact factors only measure the very short-term popularity of particular papers, not the long-term actual impact of the corresponding pieces of research. Perhaps that’s the best that can be done, but it really doesn’t seem even close to what we’d like. Third, no impact factor actually measures anything whatsoever to do with “impact” in the broadest, societal, sense of the word. Which brings us back to the original question: why should society give money to such enterprises, and at such rates?

The answer is prosaically obvious: because society gets a pretty decent bargain out of allowing bright minds to flourish in a relatively undisturbed environment. Academic careers are hard: you need to get through college, five to seven years of PhD, one, two, more often than not three postdocs, and seven more years on the tenure track, all to land a stable job (undoubtedly a rare commodity, especially in the US!), a decent but certainly not handsome salary, and increasingly less appealing (but still good) benefits. Oh, and a certain amount of flexibility as to when and how much to work. (None of the above, of course, is guaranteed: the majority of PhD students do not find research positions in universities, period.) Trust me: nobody I know in the academy goes through the hell of the PhD, postdoc and tenure process just so that she can (maybe) land a permanent job with flex time. We all do it because we love it, because — like artists, writers, and musicians — we simply cannot conceive of doing anything else worthwhile with our lives. (Incidentally, the term “scientist” was coined by William Whewell, a philosopher, in the 19th century, in direct analogy to “artist” — an analogy that is more meaningful than most modern artists and scientists seem to realize.)

Passion is, after all, the same response one gets from non-scientific academics (who usually can’t fall back on the “what I do matters to society in practical ways” sort of defense). It’s also why civilized nations support (yes, even publicly!) the fine arts: scholarship and artistic creativity simply make our cities and countries much better places to live.

Of course, something tangible is (indeed, ought to be) required of academics in return. And this something is to be found in the other two areas (outside of scholarship) on which academics are judged by their peers and by university administrators (though it would be so much better if the latter simply confined themselves to, well, administration): teaching and service. And by service I don’t mean the largely (though not entirely) useless and mind-numbing kind of “service” one does for one’s own institution (committee memberships, committee meetings, committees on committees, and the like). I mean service to the community, which comes in various forms, from writing books, articles and blogs aimed at the public, to giving talks, interviews, and so forth. Service, in my view, means taking seriously the idea of a public intellectual, an idea that would only increase the quality of social and political discourse in any country in which it is taken seriously.

What about teaching? Well, we (almost) all do it — unless you are so good at scholarship that the university will exempt you from doing it, a situation that I think is quintessentially oxymoronic (shouldn’t our best current scholars excite the next generations?). But do we do it well? Murray Sperber, in his Beer and Circus: How Big-Time College Sports Is Crippling Undergraduate Education, talks about the myth of the good researcher = good teacher, a myth propagated (again, curiously, without the backing of hard data) by both faculty and administrators. At least on the basis of my anecdotal evidence I am convinced (until data show otherwise) that the two sets of skills are orthogonal: one can be an excellent researcher and a lousy teacher, and vice versa, one can be an excellent teacher while being a lousy scholar (though, obviously, one cannot be a good teacher without understanding the material well).

Sure, you will hardly find faculty members at any university who are both lousy teachers and lousy scholars: why would anyone hire that sort of person? But you will find examples of the other three logical categories (with different admixtures of types depending on what college we are talking about), and I honestly have no idea what percentage of us falls into each of them. (We all, of course, think that we are above-average teachers as well as above-average scholars, but that sounds a lot like the sort of wishful thinking that goes on in Lake Wobegon, where all the women are strong, all the men are good looking, and all the children are above average...)

The way I see the bargain being struck between society and scholars these days is this: the scholar gets a decent, stable job, which allows her to pursue interests in her discipline, no matter how (apparently) arcane. Society, in return, gets a public intellectual who does actual service to the community (not the stuff that university administrators like to call “service”), as well as someone who takes her duty to teach the next generation seriously, which means honestly trying to do a good job at it, instead of looking for schemes to avoid it. Sounds fair?

7 comments:

  1. I think the Ph.D. at a research university isn't so much presumed to be a good teacher as a good mentor and leader in the field, plus giving the uni some bragging rights about who is in their stable.

    Lower-level courses are taught by teaching assistants, who may or may not get any training in how to teach. At one of the universities where I did some graduate study (heh, five years), there was a huge controversy about beginning-level science courses being taught by Asian students who were unintelligible to the farmers' children sent to study there. Nobody asked the big question: why do Asian countries value a graduate education in science while the United States does not?

    The result of course is that now the Asian countries are kicking the arse of the U.S. economically because of the superior engineering and science students they have sent to us who returned to create industrial powerhouses.

    Now that we've turned our best asset over to other countries, perhaps we should see that the answer is found in the output of all those Ph.D.s who do not go on to replicate their mentors' careers but carve out their own as innovators. We should stop sending these students back to their home countries, and we should stop letting people tell farmers' children that evolution is "just a theory." Those children no longer have a future in farming thanks to the advances in farm technology accomplished by people who don't believe that evolution is "just a theory."

  2. I think that there's another approach that someone can take to defend academia. I think that it's reasonable to assume that at some point in time, everyone takes a larger interest in some specific topic, and what someone will be interested in is very unpredictable. It's an amazing fact that we all have the ability to really explore a topic that greatly interests us, and the only way to really get deeper into a topic is through academic papers and books.
    I think that the attack on academia partly comes from the fact that while every individual is interested and happy to have resources on his or her topic, people don't consider that other people are just as interested in other things. A society that values curiosity and intelligence shouldn't allow someone to have a question without any resources to explore it in more detail. People need to understand that living in a society where any citizen can explore questions ranging from "how does the cell determine when the right time to divide is" to "what does Heraclitus mean by this passage" is truly magnificent, and just plain awesome. It would be a horrible offense to punish the person who is interested in the topic that is no longer covered because of cutbacks.

  3. Great post.

    shouldn't our best current scholars excite the next generations?

    As you note later, and I concur, being a good researcher and being a good teacher often do not go well together. And yet, I only have anecdotes. This just reminded me of the "Harmony of the Worlds" episode of Cosmos, which I watched again a few weeks ago, where Sagan describes Kepler's life and achievements. Kepler was a horrible teacher, apparently, but an incredible genius in his research. To teach well, one has to not only know the subject, but also be a good communicator. The good teacher has to be able to make you go "wow, that's interesting, I want to know more". And that does not always come easily to everyone. Some training helps too, I found out, although we almost never get any. The near-complete lack of value given to teaching ("a waste of time that could be spent doing research or writing the next grant proposal") in most universities does not help the situation, I think. Same goes for mentoring: the great scientist is not necessarily a good mentor, and vice versa. I've seen both.

    There are the Richard Feynmans of the world, who can do both very well. And there are also the science popularizers who, while decent but by no means top scientists, are still very valuable for inspiring people to follow in their footsteps, or at least to understand the world better.

  4. It's off the main topic by far, but I must disagree with a moral "talking point" Massimo makes early on.

    Hiroshima and Nagasaki were NOT "negatives." They saved American lives, probably saved *net* Japanese lives compared to what a lengthy blockade would have done, and by ending the war earlier (ignoring the idea of the A-bomb being a "signal" to Stalin) kept the Cold War in Asia from starting out even messier than it did.

  5. Gadfly,

    > Hiroshima and Nagasaki were NOT "negatives." They saved American lives, probably saved *net* Japanese lives <

    I'm not getting into that discussion here. I simply reiterate my disagreement, and we might have to explore that issue on another occasion.

    @Gadfly, I must strongly disagree with you.

    Although Massimo won't get into this discussion here, I found your comments on the subject a little myopic. As someone who spent half a decade living in Hiroshima, I am quite aware of the long-term effects of the bombing and the toll it took on the Japanese, both physically and mentally.

    The bombing of Nagasaki and Hiroshima had a negative impact all around. That's just the gist of it.

    Families today still suffer the aftereffects of the bombings, either directly, because of birth defects, or because of the trauma of the massive starvation they endured as children.

    Of course, the bombing may have had a *net* benefit in preventing further deaths, but you are incorrect to assume it saved lives. It killed thousands of people and destroyed entire cities. And that doesn't include the numerous firebombing campaigns, which also crippled Japan. Just think about it: we weren't fighting Japan's army, we were attacking innocent schoolchildren and farmers, and crippling an entire economy with our bombing of the mainland. I suppose killing off the indigenous population would be one way to end a war... but I don't think you can claim that would be a good or ethical thing to do. But that's exactly what the atomic bomb was designed to do--and the U.S. was the only nation to use nuclear weapons against another nation.

    Consider this, though. What if Hitler had beat us to the punch, and obtained nuclear devices before the U.S.?

    If Hitler had bombed New York and Los Angeles with nuclear bombs... completely obliterating them from the face of the Earth... would this atrocity have been any different ethically than us having used atomic weapons on Japan?

    Hitler could have very well used the same excuse you have--he was preventing any further deaths in the long run by putting the war to a preemptive end.

    Somehow I don't think the *net* benefit of preventing potential deaths outweighs the harm the atomic bombing actually did.

    Massimo is right, the effects were negative.

  7. Thanks for the great post, Massimo.

    I'm a PhD philosophy student, and much of your public outreach has had a tremendous impact on my goals for the future. Thanks for doing that and setting a good example for fellow whatever-we-are's.
