Sunday, February 26, 2012

Redirecting the charitable impulse


by Ian Pollock

Being an extremely didactic dialogue, in which the rules of polite conversation are flouted for the sake of philosophy and truth. All characters are fictional; no interlocutors were traumatized in the making of this dialogue.

Mia: So, what did you get up to this weekend?

Gallant: I spent Saturday playing in a basketball tournament for Balls For The Cure.

Mia: Cool, I assume that’s a fundraiser? What for?

Gallant: Testicular cancer. My uncle died of it, so I figured it’s the least I could do.

Mia: Sorry to hear about your uncle. Why raise funds for testicular cancer, though?

Gallant: Because that’s what my uncle died of.

Mia: Right, I know. What I’m saying is... let me see if I can express this properly, without pissing you off. You cared for your uncle, obviously.

Gallant: Obviously.

Mia: And it’s terrible that he’s dead. Why do you care that it was testicular cancer that killed him, though?

Gallant: That is the disease that took him away from us! What do you mean, ‘why do I care?’ Isn’t it obvious?

Mia: No, not really. He would be just as dead if he had died of heart disease. I think the bad thing about your uncle dying is, well, that your uncle suffered and then died. Not that testicular cancer in particular killed him. You’re treating his death as if it were a murder, and we had to punish the culprit — testicular cancer.

Gallant: Well, we do! That disease has taken away too many people’s lives!

Mia: Indeed it has... so has flying debris. But there are other diseases that take away or ruin people’s lives, and some of them ruin many, many more people’s lives than testicular cancer. Wouldn’t it make more sense to focus on curing or preventing the commonest ones, the deadliest ones, and the most easily remedied ones? That way we could help more people with the same amount of time, effort and money.

Gallant: You don’t understand. This isn’t about numbers. That disease took away my uncle, personally.

Mia: Okay, fair enough. I can see you want to honour your uncle’s memory. One way of doing that is by supporting testicular cancer research. That is a good idea, ceteris paribus.

Gallant: I don’t speak Latin, but thanks. I’m glad I have your permission.

Mia: Or you could think about the fact that the bad thing is that he died, not that he died of any one specific disease. You would be just as sad if he had died of a heart attack, wouldn’t you?

Gallant: I guess.

Mia: What I’m saying is that the bad thing here that we should want to prevent is suffering and death for people like your uncle, not testicular cancer per se. If testicular cancer didn’t lead to suffering and death, there would be no need to worry about it very much. So any action that prevents suffering and death is a great way to honour your uncle. Also, you presumably want to honour your uncle because he was a good person. So any action that does good in the world is also a great way to honour your uncle, even if it has nothing to do with testicular cancer in particular — maybe it’s as unrelated as funding a school in a developing nation! Just do the most good stuff, or prevent the most bad stuff, that you possibly can.

Gallant: But none of that is related to my uncle!

Mia: On the contrary, if your uncle was a good person who didn’t deserve to suffer and die, nothing could be more related to him and to his memory than promoting the good, whereas testicular cancer qua testicular cancer is totally unrelated to everything that makes your uncle worth remembering.

Gallant: Okay, look, I’ve got to go, this is my stop. See you.

Mia: I thought you said you were going downtown...

Gallant: No, I forg... This is my stop.

Mia: Bye!



Mia: Sure, of course this seat’s not taken!

33 comments:

  1. Hey Ian,
    I don’t know where you get the idea that this guy is treating the disease like a murderer, or that raising funds for testicular cancer research is like trying to punish the disease. When someone dies of a particular cancer, it raises your awareness of that disease. It is natural to want to understand it. When Gallant goes to a big fundraising event, he meets others who are suffering a lot like his uncle did. He becomes part of a community with this common bond. Maybe Gallant helped with his uncle’s care and he can talk about it and give advice. You’re essentially telling us what Gallant ought to care about. I don’t feel like going through the dialogue and showing you where you’re deriving “ought” from “is”, but I think it’s in there somewhere. And as I already stated, it is natural for someone to want to understand the disease that killed a loved one and get involved in raising people’s awareness of it.

  2. Re: "some of them [diseases] ruin many, many more people’s lives than testicular cancer. Wouldn’t it make more sense to focus on curing or preventing the commonest ones, the deadliest ones, and the most easily remedied ones?"

    No, not at all. Gallant *ought* to apply the appropriate means to achieve his desired end. If his desired end is to see testicular cancer eradicated or to see measurable progress made in testicular cancer treatments, and increasing funding for testicular cancer treatment research would considerably help achieve that end, Gallant, ceteris paribus, *ought* to assist in fundraising activities for testicular cancer research.

    Now, if you intend to assert that Gallant *ought* to have a different end, I would like to see the argument for that. It is not enough merely to assert that what one "should want to prevent is suffering and death for people like your uncle".

    Gallant has an immediate response to that:

    "You see, Mia, it is not my preference to decrease pain and suffering amongst sentient beings en bloc. Rather, it is my preference to see those suffering from testicular cancer assisted with new, more efficient and humane treatments- perhaps even a cure. If you insist that my preference is arbitrary I shall of course agree, but I would then feel the need to point out that your preference to decrease pain and suffering amongst sentient beings per se is just as arbitrary. But of course that my preference is arbitrary is unproblematic for me; perhaps you can identify that this particular preference is inconsistent with various other preferences I hold, at which point I would then be rationally obligated to ameliorate the inconsistency. But you have not done that, thus you have yet to show that my preference is irrational."

    Replies
    1. This is partially a postscript to my own comment below, but it seems more fitting to insert it here as a reply to Eamon:

      Mia, no doubt, has preferences of her own, which in this case appear to include altering Gallant's preference, or (as Ian put it) to "redirect his charitable impulse" towards decreasing "pain and suffering amongst sentient beings en bloc" (as you put it). She may or may not succeed in that regard, but I think it's fair game for her to try (even if Gallant reacts negatively), so long as she refrains from an accusation of irrationality, which I agree would seem misplaced in this case.

    2. > Gallant *ought* to apply the appropriate means to achieve his desired end.

      That might include (intuitively) horrible things like killing, and in this case it does. Gallant is probably confused by a host of cognitive biases and social conditioning, and that's why he, in effect, prefers letting more people die rather than saving more.

      Here's an example of what I'm talking about. I think if you phrased Gallant's decision in terms of a dead children currency, he would likely have different desires--at least I would hope so.

      While his desire to donate to testicular cancer research exists, as you and Pollock framed the desire, it's irrational and incoherent: it likely leads to an outcome that he wouldn't desire if he weighed it directly against the other possibilities.

    3. Eid,

      Re: "While his desire to donate to testicular cancer exists, as you and Pollock framed the desire, it's irrational and incoherent: it likely leads to an outcome that if he directly weighed against other possibilities he wouldn't desire."

      In part, I agree. If Gallant's desire to donate to testicular cancer research is in conflict with other, equally held desires, Gallant is rationally obliged to order his desires so that they no longer conflict. However, he may order his desire to donate to testicular cancer research higher than his desire to see the overall amount of suffering amongst sentient beings decreased, or vice versa.

      Until and unless, however, one can show that Gallant's preference orderings are inconsistent, one cannot say his desires are irrational.

      Re: "Gallant is probably confused by a host of cognitive biases and social conditioning, and that's why he prefers killing more people to saving fewer people."

      This strikes me as bare assertion. For all we know, Gallant has rationally ordered his preferences / desires. Take myself, for example. I couldn't care less about malarial drug research, as the probability of either myself or a loved one suffering from malaria is minuscule. I would much prefer to donate to causes which would more directly benefit myself, such as anti-AGW causes, and my preference is eminently rational.

      It seems to me you are taking a lot for granted here.

    4. Clarification:

      I meant to say donate to causes which combat anthropogenic global warming and climate science ignorance in general.

    5. >However, he may order his desire to donate to testicular cancer research higher than his desire to see the overall amount of suffering amongst sentient beings decreased, or vice versa.

      Of course he "may"--such a trivial thing to say. Based on the point that Ian is making, the context you ignore, it's likely that he wouldn't. And Ian's point stands because that is the case with most people. Put a thousand anonymous children with malaria in front of them, and hundred men with testicular cancer, and tell Gallant that he has to choose which group is gased, and I think you'll see his unbiased desires (as well most anyone that purports to desire saving the people with testicular before their scope insensitivity is removed).

      >This strikes me as base assertion.

      No, everyone's decisions are manipulated by cognitive biases; there's even a bias that leads people to believe they aren't manipulated by cognitive biases.

      >I would much prefer to donate to causes which would more directly benefit myself, such as anti-AGW causes, and my preference is eminently rational.

      Not "necessarily". Hurr hurr hurr...

      >It seems to me you are taking a lot for granted here.

      You're ignoring Ian so you can rant about yourself.

    6. Eid,

      If you want to discuss the issue in a rational manner, we can do that. But if you instead would rather carry on about this and that cognitive bias and rant about making this or that inference from characters in an imaginary discussion, you can do that by yourself.

      Now on to more substantive matters.

      Whether Gallant orders his preferences so that he ranks donating to testicular cancer research above donating to anti-malarial research and other similar preferences, or vice versa, is not trivial; it is the very basis upon which we can judge his preference rational or not.

      Re: "Put a thousand anonymous children with malaria in front of them, and hundred men with testicular cancer, and tell Gallant that he has to choose which group is gased, and I think you'll see his unbiased desires (as well most anyone that purports to desire saving the people with testicular before their scope insensitivity is removed)."

      I fail to see how this imaginary situation is germane to the issue under discussion. It is not sufficiently similar to the decision of which research programme to donate to. Choosing to gas a group of people (just like choosing to push someone onto train tracks) is akin to actively murdering. It is entirely plausible that Gallant is a deontologist or a rational egoist / contractarian (pace David Gauthier or Thomas Hobbes) and thus would on good meta-ethical grounds deny the underlying utilitarian assumption upon which you seek to draw a similarity between the situations. Ergo, you are taking a lot for granted here.

      For myself, I understand full well that thousands of children are dying from hunger and hunger related diseases, but I also reject that I have a moral duty to intervene. Now, do you care to address my view on this matter, or would you rather content yourself with asserting that the reason I fail to agree with you is that I am suffering from some cognitive bias?

    7. Eid
      Are you making predictions about what people WILL choose... or claims about what they OUGHT rationally to do?

    8. Eamon: I understand full well that thousands of children are dying from hunger and hunger related diseases, but I also reject that I have a moral duty to intervene.

      I accept that it's morally praiseworthy to relieve human suffering, but then I'm a sentimentalist at heart (in the Humean sense that "Reason, being cool and disengaged, is no motive to action"), and numbers are not nearly as motivational as the thought or mental image of someone - especially a loved one - suffering.

    9. DJD,

      Yes, I'm predicting that if the emotional and physical distance from the two groups were removed, most people would choose differently. And if they would choose differently, then we can say they were biased in their original choice and that it was irrational. They are free to be irrational, however.

    This dialogue reminds me of a biology professor whom I once knew, whose job (when he wasn't teaching) was to conduct research into a cure for malaria. For similarly utilitarian reasons, he expressed a certain degree of resentment towards HIV/AIDS charities & research labs for being better funded, since malaria kills more people annually than HIV/AIDS. (This was the mid-'90s, but I assume that the relevant facts are roughly the same.)

    I see it like this: People naturally assign greater priority to finding solutions to problems that impact their lives (or have done so in the past or stand a good chance of doing so in the future) - even when the problem seems trivial when compared to, say, the problems routinely experienced by the world's poor. (The image of a rich white male suffering from erectile dysfunction comes to mind.)

    That said, perhaps Gallant should fight this tendency and move on, focus his charitable efforts on more rampant social/epidemiological problems.

    But, then again, perhaps Gallant isn't usually one for charity to begin with, and the only reason that he's motivated to participate in this case is because his uncle's death awakened him to the fact that he or some other relative is also at risk.

    Granted, this is a somewhat less altruistic Gallant than the one Mia appealed to, and perhaps a better informed Gallant will have learned that testicular cancer should be the least of his worries. My point here is that we have to work with humans as they are, not as how we wish they would be. And most of us are not so utilitarian in our thoughts and actions.

    Replies
    1. > My point here is that we have to work with humans as they are, not as how we wish they would be.

      This is mind-numbingly stupid. The fact that humans are naturally shitty doesn't mean that they should stay that way. The point of Ian's essay is to help humans see how they're being shitty and hopefully inspire them to change--like he probably already has.

      In experimental philosophy studies, philosophers choose utilitarian outcomes to a much larger degree than morally ignorant people. So, obviously, we don't have to work with humans as they are. We can hopefully educate them, like Ian did with this essay (which you ignored).

    2. Re: "The point of Ian's essay is to help humans see how they're being shitty and hopefully inspire them to change--like he probably already has."

      No, Ian presumes Gallant is "being shitty" and proceeds to make no argument to substantiate the meta-ethical assumptions upon which that presumption is based.

    3. Despite your rudeness, I'll respond this once to you, in case you didn't understand me (even though Eamon apparently did).

      My point was that, if Gallant's impulse is truly more self-interested than his replies to Mia reveal (e.g. if he acts under evidence-based medical advice that he and his close male kin are also at risk of testicular cancer), then her attempts to impose her own altruistic will on him may very well fall flat.

      As my postscript above suggests, I don't necessarily blame Mia for trying - such is the nature of rational persuasion, and I've got nothing against genuine altruism - but I also don't necessarily see a self-interested Gallant as irrational or immoral for pursuing his own interest - especially if the consequences are better for others like myself (albeit, sub-optimal from a utilitarian standpoint) than if he had never felt motivated to do charity work to begin with.

      As for what the experimental data in moral psychology has to say about the moral stature of utilitarians (or at least strong, consistent ones), consider this finding:

      "Participants who indicated greater endorsement of utilitarian solutions had higher scores on measures of psychopathy, machiavellianism, and life meaninglessness. These results question the widely-used methods by which lay moral judgments are evaluated, as these approaches lead to the counterintuitive conclusion that those individuals who are least prone to moral errors also possess a set of psychological characteristics that many would consider prototypically immoral."

    4. >Mia: Sorry to hear about your uncle. Why raise funds for testicular cancer, though?

      >Gallant: Because that’s what my uncle died of.

      Given that, it's fair to presume Gallant is ignorant and being shitty--it's the default human behavior given ignorance. Moreover, it's a short didactic dialogue on a blog, not a treatise; you don't get to interpret it in the worst possible light.

    5. >>he expressed a certain degree of resentment towards HIV/AIDS charities & research labs for being better funded, since malaria kills more people annually than HIV/AIDS.<<

      Just by way of random tidbits, I took a population econ seminar where we learned that from an economic/"utilitarian" perspective, HIV programs are by far the most underfunded of major diseases. The reason is partly that preventative measures are so cheap that to maximize lives saved we'd want to allocate more money to them than we do (it's so underfunded we'd still get far more lives for the buck than the alternatives). But it's also because HIV/AIDS has subtle spillover effects: among other things, it hits people in their working age (vs. other diseases that hit the population more uniformly or hit mainly the elderly), meaning African economies that invested in the victims' education basically lose their investment, making HIV an important contributor to African poverty, which in turn hampers efforts against other diseases.

      Take all of this with a grain of salt - I poked around a bit on Google Scholar and couldn't find an inter-disease cost-benefit analysis of the kind I remember from the class.

      --Also, mufi, I'm sorry to say I'd dispute your (implicit) interpretation of the study you cite. More to the point, so would the authors. The issue is that the trolley hypothetical classically does not look at self-sacrifice, as Bartels touches on.

    6. >My point was that, if Gallant's impulse is truly more self-interested than his replies to Mia reveal (e.g. if he acts under evidence-based medical advice that he and his close male kin are also at risk of testicular cancer), then her attempts to impose her own altruistic will on him may very well fall flat.

      That's why I said you were ignoring the essay (i.e. you're ignoring the point it makes and being hypercritical about something else).

      >but I also don't necessarily see a self-interested Gallant as irrational or immoral for pursuing his own interest - especially if the consequences are better for others like myself (albeit, sub-optimal from a utilitarian standpoint) than if he had never felt motivated to do charity work to begin with.

      It is both irrational and immoral if it conflicts with his unbiased desires and he was capable of making a better decision--I'm referencing agency more than free will. Yes, he may have never done charity work otherwise, but then in his new desire to do charity work you have an opportunity to redirect his biased desires to better fit his unbiased desires.

      >As for what the experimental data in moral psychology has to say about the moral stature of utilitarians (or at least strong, consistent ones), consider this finding:

      Irrelevant; it's a group of lay people. The high scoring ASPDs could be right for the wrong reasons.

    7. Eid,

      Re: "Given that, it's fair to presume Gallant is ignorant and being shitty--"

      It is not fair to presume Gallant is ignorant and being shitty.

    8. I'll admit this much: My interpretation is not the plainest one in that it assumes that Gallant at times answered Mia in ways that do not always reflect his true motivations accurately. (I actually think that's more realistic, given certain lessons from experimental psychology and cognitive science, but it's not usually a working assumption in philosophy.)

      But what's an "unbiased desire"? And why should anyone assume that Mia's more altruistic impulse will necessarily lead Gallant to a "better decision"? Better for whom?

      You seem to be assuming way more than you've actually taken the time to present and defend rationally (if that's even possible). I suspect that's also why you so quickly rejected that study on utilitarians that I cited, but your response was too cryptic to explain, so I'll just let that one go.

    9. Timothy: Thanks for the polite feedback.

      I do think that utilitarianism is undermined as an ethical theory by that study insofar as it demonstrates how overtly utilitarian-like acts (e.g. those which would sacrifice the few for the many) can be misconstrued as being motivated "out of genuine concern for the welfare of others." But I admit that it's more of a technical flaw than a logical one.

      Also, my recollection of the trolley experiments is that most subjects refuse to push an innocent stranger onto the tracks, so as to save more people. (In other words, they shift into a non-utilitarian mode.) I cannot adequately defend the majority decision rationally, but nor can I adequately defend the minority (utilitarian) one rationally. The best that I can do is to non-rationally assert that I approve (at a visceral level) of the majority decision and disapprove (at a visceral level) of the minority one. What's more, I've yet to encounter a philosophical argument that can prove that the majority decision is wrong.

    10. Mufi,

      >Mia: What I’m saying is that the bad thing here that we should want to prevent is suffering and death for people like your uncle, not testicular cancer per se. If testicular cancer didn’t lead to suffering and death, there would be no need to worry about it very much. So any action that prevents suffering and death is a great way to honour your uncle. Also, you presumably want to honour your uncle because he was a good person. So any action that does good in the world is also a great way to honour your uncle, even if it has nothing to do with testicular cancer in particular — maybe it’s as unrelated as funding a school in a developing nation! Just do the most good stuff, or prevent the most bad stuff, that you possibly can.

      That is the thrust of Ian's point. The characters should be interpreted in that light. Gallant is the foil in this dialogue; realize that.

      >But what's an "unbiased desire"?

      In this context, a biased desire is one where you would desire something else if the distance between you and the beneficiaries were removed. For example, your father died of testicular cancer, and now you want to fight against it to the exclusion of other things (like saving dying children). But suppose you're presented with a room (in front of you) with a thousand dying children and another room with a hundred men dying of testicular cancer, and you now choose to save the children. If that were the case, your desire would have been biased by the anchoring effect (or sphere of immediate awareness, Dunbar's number, the expanding circle, scope insensitivity, and so on) of your father's death.

      >And why should anyone assume that Mia's more altruistic impulse will necessarily lead Gallant to a "better decision"? Better for whom?

      Mia is revealing to the audience Gallant's bias, but I'm starting to think Ian underestimated the inferential distance in his audience.

      > I suspect that's also why you so quickly rejected that study on utilitarians that I cited, but your response was too cryptic to explain, so I'll just let that one go.

      Your simplified claim is that utilitarianism is immoral because immoral psychopaths are utilitarians in specific instances. That's false, because they could simply be right for the wrong reasons. I.e., they have a broken 'moral module' that gives them an intuition for utilitarianism in that situation. However, that doesn't mean the conclusion is wrong; it only means the intuition is wrong. By saying the conclusion is wrong because the intuition is wrong, you're making a kind of fallacist's fallacy.

    11. >It is not fair to presume Gallant is ignorant and being shitty.

      I already explained to you why it is fair to presume that.

    12. Eid
      You seem to be totally convinced that you know both what is morally right and wrong and what is rational and best for mankind. You are glaringly self-deceived on both counts. The lack of a foundation for your claim to moral objectivity or absolutism begs the question of "What god told you that?" or "What source of authority do you adhere to?" And your belief regarding what is rational is arguable. Do you really think it is a good policy to have all contributions be given to one and only one effort to overcome diseases? Would you mandate that 100% of all govt. spending on disease eradication be spent only on African diseases of childhood? If not, what method of apportionment would you recommend?

      In this context, a biased desire is one where you would desire something else if the distance between you and the beneficiaries were removed

      I must have missed where you explain what's morally wrong about a biased desire. I'm still missing it.

      Your simplified claim is that utilitarianism is immoral because immoral psychopaths are utilitarians in specific instances.

      Even non-psychopaths apparently behave like utilitarians in specific instances. What that study suggests is that the psychopaths apparently behave like utilitarians more consistently - i.e., even in the active-push scenario, wherein non-psychopaths reject the solution (presumably because it feels like murder).

      The point is that it's hardly a feather in the cap of utilitarianism if moral psychologists identify the minority decision of psychopaths and Machiavellians as "utilitarian." Even more to the point, remember what prompted that reference: your claim that "philosophers choose utilitarian outcomes to a much larger degree than morally ignorant people." Yeah, and so do psychopaths. So what?

    14. >I must have missed where you explain what's morally wrong about a biased desire. I'm still missing it.

      Because I didn't claim biased desires were immoral; I claimed they're irrational.

      > The point is that it's hardly a feather in the cap of utilitarianism if moral psychologists identify the minority decision of psychopaths and Machiavellians as "utilitarian."

      > even in the active-push scenario, wherein non-psychopaths reject the solution (presumably because it feels like murder).

      Although, around 50% of the people who reject the solution would still push a brother off to save five brothers, while still claiming that it's an immoral act. Maybe people derive 'moral' and 'immoral', in certain cases, from law or religion instead of intuition, whereas someone who's philosophically trained isn't going to assume that what's lawful is moral. I wonder how 'legalize weed' potheads or libertarians would perform in the trolley experiment.

      >The point is that it's hardly a feather in the cap of utilitarianism if moral psychologists identify the minority decision of psychopaths and Machiavellians as "utilitarian."

      Well, yeah it's not a feather in utilitarianism's cap, but it's also not a strike against it. It's irrelevant, like I said.

      > remember what prompted that reference: your claim that "philosophers choose utilitarian outcomes to a much larger degree than morally ignorant people." Yeah, and so do psychopaths. So what?

      Keep up; I used that example to show you that educating people can severely alter their moral decisions, not that it's more moral to be utilitarian because philosophers are utilitarian. Although I might argue for that, it would be a digression.

    15. DJD,

      >You seem to be totally convinced that you know both what is morally right and wrong and what is rational and best for mankind.

      I was careful not to do that. I do know, however, that it's irrational to save 100 men with cancer (when you're anchored to them emotionally) rather than 1000 children, when you would do the opposite (as I suppose most people would) if the emotional and physical distance from the potential beneficiaries were removed. E.g., you're standing in front of a room with 100 men with cancer and a room with 1000 sickly children, and you have to decide which room gets gassed.

      >You are glaringly self-deceived on both counts.

      You're actually glaringly unable to comprehend what people intend--a habit of yours.


      >begs the question

      *raises the question

      >And your belief regarding what is rational is arguable.

      Go on then.

      >Do you really think it is a good policy to have all contributions be given to one and only one effort to overcome diseases? Would you mandate that 100% of all govt. spending on disease eradication be spent only on African diseases of childhood? If not, what method of apportionment would you recommend?

      You want a static answer for a dynamic problem. There isn't one. The dynamic answer is to simply aspire as best you can to be rational in your decisions. If giving aid to African children isn't achieving your unbiased goals, then it isn't rational.

      It's actually a kind of virtue ethicist approach, only the goal is to be a rational agent rather than virtuous. (Definition from Wikipedia, if you're not aware: a rational agent is an agent which has clear preferences, models uncertainty via expected values, and always chooses to perform the action that results in the optimal outcome for itself from among all feasible actions.)

    16. Eid
      One cannot argue or choose "rationally" from a vacuum. You seem to think that it is a given that everyone would care more about the African children than about a few less Jewish or Asian children. That's quite an assumption. Also, why would anyone want to put aside the emotional and physical distance between themselves and any others? You have chosen certain values over others... then assumed them as the goal that everyone has or ought to have... then you argue that it would be irrational to make a choice that would not lead to that goal.

    17. Eid
      >"Because I didn't claim biased desires were immoral, they're irrational"
      Biased desires are not irrational. Most desires are biased desires. Perhaps all desires are biased. I desire chocolate ice cream rather than vanilla. Is that a biased desire? By the way...morality is a cultural construction....every bit as much as regulations, statutory law, and rules of baseball. People formulate and promulgate these moral rules....they don't exist before we invent them. You don't go looking for morality....you create it as a tool to bring about order and security.

      Because I didn't claim biased desires were immoral; I claimed they're irrational.

      A claim that I reject for more or less the same reasons that Eamon does.

      I'll just add here that you seem to work under the (unrealistic and outdated) assumption that reason (or Reason) exists out there in the ether, disembodied and free of human desire, motivation, and bias.

      If so, and a majority of philosophers do indeed share that assumption (and thereby "choose utilitarian outcomes to a much larger degree than morally ignorant people" under your terms), then I say: so much the worse for philosophy.

  4. Mufi,

    In large part, well said.

    Re "And most of us are not so utilitarian in our thoughts and actions."

    I simply fail to find utilitarianism, whether of the rule or act variety, rationally compelling.

    This is from a different angle from the comments thus far, but it is where my mind went. The implications of advocacy on the individual level (as in this example) are different from those on a societal level or from a public policy perspective.

    Since there are many diseases that are more "costly" in terms of death and suffering (in comparison to the testicular cancer example), even if all people acted just as Gallant does, this would result in the most "costly" diseases getting the most people contributing. In other words, Gallant's actions would be balanced out by others using the same approach, and the more damaging and common diseases would "motivate" more people. In reality, however, advocacy is much more organized, and many diseases get more funding due to this organized advocacy (e.g. breast or prostate cancer vs. pancreatic cancer).

    There is another aspect of human nature that we shouldn't overlook when it comes to advocacy. Without the motivation of his uncle's death, Gallant might have been sitting on his couch at home eating Cheetos. So, we can compare what he is doing to what better things he could be doing, or we can compare it to what he would be doing without this motivation. What was Mia doing while Gallant was at the fundraiser?

    Quite amusing - we are currently watching old Bones episodes, and in my mind Mia and Gallant quickly took on the voices of Brennan and Booth arguing in the car. In this context it would be okay to describe Gallant as ignorant (i.e. not questioning his motivations), but if anyone is shitty here it is Mia (in the sense of being insensitive, i.e. not getting his motivation in the first place).

