By Massimo Pigliucci
Julia has recently made a good case for transhumanism, easily rebuffing some of the most popular, and somewhat inane, objections to the “movement.” She started her post by reminding us that transhumanism is “the idea that we should be pursuing science and technology to improve the human condition, modifying our bodies and our minds to make us smarter, healthier, happier, and potentially longer-lived.” She immediately acknowledged my retort during a recent Rationally Speaking podcast that if one defines transhumanism that way, we don’t need a movement, because we have been doing just what transhumanists advocate since the dawn of civilization. In fact, what transhumanism is about is the promotion of radical alterations to human bodies and minds, through a variety of futuristic technologies that include genetic engineering and “mind uploading” (whatever that means).
Julia takes on what she sees as the three major objections to transhumanism (it robs life of meaning; it's dehumanizing; it's hubris) and rightly ends up making fun of arguments proposed by people like Francis Fukuyama, the famed author of The End of History (man, is there a more famous writer who got so many things so wrong during the past couple of decades?). Even there, though, Julia is not quite fair to Fukuyama and like-minded critics: “Our ancestors were less knowledgeable, more tribalistic, less healthy, shorter-lived; would Fukuyama have argued for the preservation of all those qualities on the grounds that, in their respective time, they constituted an ‘essential human nature’?”
No, he wouldn’t. There is a fundamental difference between improving the human lot through medicine, agriculture, and other technologies on the one hand, and permanently and radically altering the human genetic makeup on the other hand. It is the latter that Fukuyama and other critics of transhumanism are objecting to, not the former. That objection cannot be dismissed simply on the basis that there is no such thing as human nature (really? Do you mean to say that there are no fundamental, if quantitative, differences between us and chimpanzees? And yet I just haven’t felt like having sex with a chimp lately), or that it is an ill-conceived and somewhat perverse longing for a tragic view of the human condition (“If we all live long healthy happy lives, [transhumanism critics’] favorite poetry will become obsolete”).
First off, there are many more types of criticism (and numbers of critics) of transhumanism than one would glean from Julia’s post. (A surprisingly good summary with informative links can be found here.) This is not the place to examine them one by one, so let me give you the rundown of my own objections instead.
First, transhumanism is irrelevant. We do not need a “movement” to tell us that we need to pursue technologies that will improve and/or radically alter the human condition. It will happen anyway, because human nature (oh yeah, baby) favors whatever can stave off disease and death as much as possible, at pretty much any cost — the consequences be damned.
Second, transhumanism is simply another version of futurism (which is itself a form of what I call techno-optimism, in turn an outgrowth of scientism — is that ‘nough “isms” for ya?). And if there is one thing we know about futurists, it is that they are almost always both spectacularly wrong and absolutely inconsequential to humanity’s actual technological developments.
Third, there are potentially serious issues of hubris at play. I am by no means a Luddite, but we already have enough trouble managing a number of complex systems that we have been recklessly playing with (the environment and our own ecosystem come to mind, not to mention the economy), so one would think that when it comes to irreversible changes to the human genome we might want at the very least to tread lightly.
Fourth, just like with any other technology, there are serious issues of access, fairness, and protection from abuse. And if the technology really promises to dramatically extend human life while also improving its quality, it is far too easy to imagine scenarios under which a privileged few would have control and access, or ways in which the technology could be used for nefarious purposes (from exploitation of other human beings to military applications).
Fifth, there is the issue of priorities. We have limited resources in terms of funds to invest in scientific research and technological development, as well as in terms of the number of competent scientists to do the work. If the goal is to reduce human suffering, should we not therefore prioritize, oh I don't know, mundane and admittedly not techno-sexy things like already eradicable diseases that kill millions? Or famine? I'm just saying.
Sixth, there are the (potentially disastrous) ecological consequences of a humanity bound to a single and very finite (in terms of space and resources) planet, and yet capable of both reproducing and living forever. You do the math: unless transhumanist immortality comes with mandatory castration (try to sell that to your fellow human beings!), it would take a very short number of years to cause a planetary collapse due to overpopulation.
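To put rough numbers on that "you do the math": here is a back-of-the-envelope sketch only; the growth rate, starting population, and threshold are illustrative assumptions, not demographic projections.

```python
# Back-of-the-envelope: population growth when mortality is near zero.
# All numbers below are illustrative assumptions, not projections.

def years_to_reach(start, target, annual_growth):
    """Years for a population growing at a fixed annual rate to hit a target."""
    years = 0
    pop = start
    while pop < target:
        pop *= 1 + annual_growth  # compound growth, no deaths to offset births
        years += 1
    return years

start = 7e9    # roughly today's population
target = 70e9  # an arbitrary tenfold threshold
growth = 0.02  # assumed net annual rate with negligible mortality

print(years_to_reach(start, target, growth))
```

At an assumed 2% net annual rate, the sketch puts a tenfold increase at a bit over a century; the point is the compounding, not the exact rate chosen.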
(I have a seventh, more Jon Stewart-inspired argument: do you really want someone like Sarah Palin or Karl Rove to live forever, or for their “minds” to be uploaded and replicated in countless computers? That would really scare the shit out of me.)
I do share Julia’s bewilderment at the Christian-sounding arguments that some secularists have brought to bear on this debate. I do not think that human suffering (or that of any other species, for that matter) is a good thing. I do not support Mother Teresa-style bullshit about the positive effects of sharing in the suffering of our Lord Jesus Christ. But that doesn’t mean that there are no extremely serious issues with what the transhumanists wish to do, and given humanity’s track record so far in dealing with complex problems, I’m not exactly an optimist on this one either.
Fortunately, I don’t really think transhumanism is a threat to anyone, just like no futurist has ever been. These movements are populated by naive optimists with a fairly high degree of narcissism, but they are otherwise mostly harmless.
On behalf of the transhumanists: Ouch.
On behalf of Julia (who can correct me if I'm wrong): I think we agree on the fundamentals of my argument.
You say a movement is not necessary, but then list a bunch of criticisms that are basically unanswered questions. Wouldn't a movement around this issue be well placed to try and answer those questions?
Also, you said, "really? Do you mean to say that there are no fundamental, if quantitative, differences between us and chimpanzees?"
That seems like a false choice. Even acknowledging the existence of human nature, whatever differences would exist between "human" nature and "transhuman" nature, I'm sure neither would be the same as chimp nature. If we were to change human nature, we'd still be unique from other species.
Also, I don't think we can say that transhumanism is simply a different version of futurism. And I definitely wouldn't describe it as an art movement, as your wiki link does. At best, we could say that transhumanism is a specific subset of futurism. And if the two are not identical, what's the problem with having a name for the subset?
As per your last paragraph, I think you could have stopped with "they're just another type of futurist". I'm not sure how productive it is to speculate on the practical and ethical difficulties of an implausible technology... heh.
I think your sixth criticism is fairly weak. You're predicting a Malthusian catastrophe and ignoring how wrong similar predictions were in the past. So, for example, during the Green Revolution there were predictions of an impending catastrophe because of a huge acceleration in population growth as a result of increased food output. These predictions failed because they were ignorant of societal phenomena such as the demographic transition and the accelerating nature of technological advancement. Now, forty years after the Green Revolution, we're seeing decreasing birth rates in industrialized countries and obesity epidemics. Some populations are still starving, but this isn't because we don't have the resources. Golden rice, a technological advancement, could feed these people.
Your criticism makes similar mistakes:
1) You assume the world's population is accelerating. The United Nations Population Fund estimates that human population will peak around 2050 and then start declining. Also, the UN Food and Agricultural Organization predicts world food production will be in excess of needs by 2030.
2) You assume we'll be bound to our planet. We're already capable of interstellar travel, see: Project Orion, Project Daedalus, beamed propulsion and magnetic sails. The problem with these technologies is lack of motivation and our short lifespans. Both of which will be cured in your scenario.
3) You assume castration is the only way to effectively reduce population growth. Social changes, government programs, better contraceptives will all have significant effects.
4) You assume the human species isn't adaptive. I don't think I need to debunk this one.
"do you really want someone like Sarah Palin or Karl Rove to live forever, or for their “minds” to be uploaded and replicated in countless computers?..."
The objections to transhumanism are the same as the objections to any other proposed utopia so far: truly religiously inspired, making promises of human redemption and an age of happiness.
Techno-enthusiasts, having little time for history and the nuances of the social and political sciences, are naturally prone to root-of-all-problems naiveties and dismiss laws of unintended consequences, skeptical distrust of political power, and so on.
Even if their predictions could be taken seriously, I wouldn't put my trust in anyone with such hypothetical technology in their hands, least of all transhumanists.
Since Julia didn't address it, and due to your 'harmless' comment, I once again have to harp about the lunatic transhumanists who are in fact trying to facilitate technocratic hegemony based on Gibson-esque masturbatory fantasies. Which, yes, you mention, BUT...! What's up with the pinky dismissal, as if all they were after bordered on mere annoyance?
Another Transhumanist Fascist
Can you watch this (roughly 12 minute) video, and call people like him harmless? (There'll be an ad at the beginning masquerading as the actual content)
And no this isn't some speculative youtube bullshit, in case you don't want to waste your time because 'this ain't your day job,' the man is a "legitimate" scientist... sorry to default to a less than academic source but here's his wikipedia page Kevin Warwick
You were so spot on... what happened?
As a longtime transhumanist and someone who's been active in the leadership of our community for years, I've written an article responding to Massimo's points - http://ieet.org/index.php/IEET/more/treder20101004
Complete agreement with the gist of it - the most optimistic and radical elements of the movement come across as naive and unrealistic, and the less radical ones are a movement to promote and celebrate things that will happen anyway.
I wonder, however, why you need to see techno-optimism as an extension of scientism. Should the latter not refer to the position that science can give us all relevant answers about the world, rather than to the position that science can find solutions to any problem? The first is a claim that sees the humanities as superfluous and/or mere sophistry, the second is simply exaggerated optimism, but has nothing to say about the usefulness of moral philosophy, gender studies, ethnology etc.
You're predicting a Malthusian catastrophe and ignoring how wrong similar predictions were in the past.
Let me rephrase that to make your logic a bit clearer:
The Malthusian catastrophe did not happen immediately when first predicted, therefore we will never run out of any resources whatsoever and do not need to worry about reproducing and increasing our consumption to whatever level we want to.
Re Mike Treder:
Have read the defense now, and it seems that the most thoughtful and reasonable elements of transhumanism have covered all the bases. In that sense, I understand most of Massimo's criticism to be directed at the surprisingly vocal and numerous representatives who very surprisingly do not follow the TRANSHUMANIST FAQ!!!one!!!1!! party line in all details at all time. I see your "take on the most sophisticated elements of our movement" and raise you one "No True Scotsman".
More importantly, I cannot help but notice that your defense never really addresses the "superfluous" part of the criticism, although it ends on a note that seems to pretend that it did. Perhaps you imagine the vile Luddite forces to be strong enough to suppress brain implants and the eradication of Alzheimer's disease where they somehow failed to stop artificial limbs and in vitro fertilization? Or that it needs a big movement with conferences and blogs to promote the invention of super-AI, because otherwise nobody would research it, like, say, the telephone and car would never have been invented without a similar activist movement promoting them decades beforehand in the 19th cent... wait a second...
First, there isn't a single Malthusian catastrophe. There are many such predictions in response to certain events. The Green Revolution was just one. Second, reasoning that an event will not cause a catastrophe, DOES NOT presuppose that we will "never run out of any resources whatsoever and do not need to worry about reproducing and increasing our consumption to whatever level we want to."
Also, please do not attempt to reform and skew my thoughts for me while pretending it's what I really meant.
Alex, nice to see us on the same side of the fence, once in a while. You know, "truth springs..." etc. ;-)
As for the connection between techno-optimism and scientism, I think it's pretty natural, although certainly not logically necessary. Techno-optimists see science as the answer to all (practical) questions. Scientistically oriented individuals see science as almost all-powerful. Connect the dots.
hmm, a Malthusian catastrophe is inevitable if we allow the population to grow indefinitely, unless we expand on other planets. The likelihood of the latter option, given present knowledge and technology, is perilously close to zero.
The entire point of my comment is that the world's population most likely won't continue to grow. So, I want to ask you why you think the world's population will grow indefinitely and what evidence you have to back up this claim.
Also, you say "[...]unless we expand on other planets. The likelihood of the latter option, given present knowledge and technology, is perilously close to zero." I provided you with examples of interstellar travel that are currently possible, so why do you think expanding to other planets is so unlikely?
hmm, interstellar travel is NOT currently possible. Moreover, where exactly would you go?
As for my comments on population growth, they are in the context of transhumanism's search for immortality, so current trends are only marginally relevant, at best.
Interstellar travel is currently possible. A nuclear powered spacecraft like Project Orion would be capable of traveling at 0.1c. This is akin to saying fusion is currently possible. The science is already there and has been for a while. As far as where we'd go, I can only speculate that given several centuries (which I think we have), we'd be able to find several habitable planets within range. And at that point, we'd probably be able to use fusion to power the ships. If not, we'd still have other options available.
On population growth, current trends are still important. In 40 years, population growth will have leveled off and started decreasing because of demographic transitions and countries moving from developing and poor to developed (see Hans Rosling). This means the world's population growth will already be under control as these increases in lifespan come about, giving us plenty of time to adapt rather than having to deal with a population explosion on top of a population explosion.
Really, I think this all boils down to a simple question: why don't you think the human species will adapt to increased lifespans (eventually immortality)?
We probably agree on many more issues than not, I just tend to write more when I disagree.
Many Malthusian catastrophes have happened in the past, just never on a global level. Why do you think ancient civilizations have collapsed? Some were destroyed by neighbors, but many had simply grown to the point where they overused their soils and chopped down too many trees. Or they just about managed to sustain themselves for some time at the point of maximum growth, and then come four bad harvests in a row - cue breakdown of the political system, civil war, mass emigrations, all the good stuff.
This comes up again and again: look, our growth rate is slowing a bit, thus we do not need to worry about being too many. This is simply not true. For starters, even if population growth were lowered to 0.1% per generation, we would still grow to the point of starvation eventually. I am also convinced that we are already several-fold too many now; you have to think in centuries here, not just for your lifetime, and at that scale it becomes clear that we are already using much too much freshwater and mined resources, and deteriorating too many soils for our population density to be long-term sustainable. We need several generations of one-child families, and not everybody alive today becoming ageless. If that happens, most of us will die anyway, only of starvation and violence instead of old age.
Seriously, this is about priorities again. We face severe challenges: Global Warming, either overpopulation or aging societies (take your pick, solving one causes the other), limited resources, how to build a stable and sustainable global economy, etc., and some grown-ups actually think what is most pressing now is to avoid a super-AI enslaving us, or personally living long enough to upload your mind. It defies description.
Thank you Massimo for coming back to the subject of transhumanism again after your first criticism in
I commented already then (July 17, 2009, 7:44 PM) that I find the divide between "traditional" and trans-humanists regrettable. Really it should be the same movement! Which does not mean, of course, it has to be the same organisations. It is (or ought to be) just different focuses within the same (in a broader sense) movement. (Compare the division between "organised skeptics" and "organised secular humanists", where (in my opinion) a similar discussion applies.)
I can see at least three reasons why there is need for transhumanist advocacy. The first and most important is to resist the quite widely held opinion (most notably among Christians, but they are not alone) that it is hubris for man to revolt against her (by god or nature or whatever) given lot of suffering and poverty. And this should also be a natural starting point for making better contact between traditional and trans-humanists. The second is to argue against status quo bias. People often have a tendency to tacitly assume that the future will be approximately the same as the present, with just some small incremental changes. But a lot of things, not least technological development, can make it qualitatively different. Related to this comes the third point: technological precaution, which is actually a very important part of transhumanist discussion. A radically different future could be for the better or for the worse. The transhumanist position is that we should study these issues in order to make it as much as possible for the better.
Actually I agree with commenter Eric above. In your original post you actually have implicitly stated the rationale for a transhumanist movement (or at least a transhumanist "submovement" in the humanist movement at large). You state a lot of questions and potential problems, isn't that motive enough for having a movement that addresses those questions?
Your bold assertion that we are confined to planet earth I suspect is a typical example of status quo bias. It seems reasonable that we should be able to send a manned mission to Mars within a few decades and start a colonisation within a few centuries. Of course it is impossible to say when it will be economical to colonise, and also we do not want to disturb Mars before we for research purpose have investigated and documented the virgin planet. But to say we are forever confined to Earth seems to me naive.
Now of course, as in most movements, not all participants are equally careful in their thinking (and even more common, the same person can have a good point on one question but be careless in other analyses). Certainly the kind of reckless techno-optimism you criticize does exist in transhumanism. But it is not correct to judge the transhumanist movement from that. The kind of questions you ask are indeed addressed by them.
An interesting example is on your question of priorities, see this dialogue between transhumanist Mike Treder and futurist Jamais Cascio
Here Treder argues that priority has to be given to fighting global warming, also in order to have a better chance of handling the more distant problem of possible negative side-effects of artificial intelligence. (I will try to come back to Treder's response in this comment thread after I have read his link more carefully.)
a) "...permanently and radically altering the human genetic makeup on the other hand...one would think that when it comes to irreversible changes to the human genome we might want at the very least to tread lightly."
From what I understand of the science, depending how you define this it is either trivial or nearly impossible with even a modicum of care. Please be more specific as to how this could happen.
b) "And yet I just haven’t felt like having sex with a chimp lately..."
Keep us posted?
c) "We do not need a “movement” to tell us that we need to pursue technologies that will improve and/or radically alter the human condition. It will happen anyway."
"...just like with any other technology, there are serious issues of access, fairness, and protection from abuse. And if the technology really promises to dramatically extend human life while also improving its quality, it is far too easy to imagine scenarios under which a privileged few would have control and access, or ways in which the technology could be used for nefarious purposes..."
How would you label people who think about the consequences of the technology you call inevitable, particularly if they solve such problems and therefore favor the technology?
d) There are the (potentially disastrous) ecological consequences of a humanity bound to a single and very finite (in terms of space and resources) planet, and yet capable of reproducing over five fold every generation. You do the math: unless humanist ethics comes with mandatory castration (try to sell that to your fellow human beings!), it would take a very short number of years to cause a planetary collapse due to overpopulation.
Massimo: would you agree with the above argument?
Hmm, to say that interstellar flight is as feasible now as cold fusion makes my point: neither is anywhere near to actually being feasible. Whether and how they might be in the future, and how distant that future is, is irrelevant to this discussion.
Population growth would shoot up dramatically if we extended life significantly, for the simple reason that mortality rates would drop. Which makes this a qualitatively different problem from the standard issues regarding population growth (about which, by the way, I agree with Alex's take).
Brian, I'll keep you posted about my lack of sexual desire for chimps. As for the rest, I have already clarified what I think of these arguments.
tonyf, on the contrary, I think there are profound differences between (secular) humanists and transhumanists, and in fact I am beginning to think that the transhumanist movement is actually anti-humanist. Humanists are concerned with human equality and suffering. Transhumanists seem oblivious to the very likely social inequality that would be caused by radical life-extension technologies (yeah, yeah, they say they are thinking about it...), and they are a narcissistic bunch who don't seem to understand that there are way too many other priorities in terms of ameliorating human suffering before we can even think about devoting substantial resources to their quest for the holy grail.
I would love to have a slice of the optimism that is palpable among some of us here.
Sure we could reverse population growth to Italy or Germany levels if we make everybody as wealthy, educated and socially secure as the citizens of those countries. Just a tiny issue there: if seven billion of us consume as much as my compatriots or the Italians, then we go down even faster. If we don't raise everybody to that economic level, growth rates won't easily go down. It is a catch-22 and rather depressing, but shutting your eyes and pretending it isn't helps nobody.
And I cannot believe we are even discussing interstellar travel! At our current level of understanding, and yes, that is better than most people assume, any spaceship that travels fast enough will be torn apart by the force of individual atoms in the not-entirely vacuum hitting it with nuclear-bomb-like force; any spaceship that travels slow enough will arrive an empty and dead shell, as over long enough time periods gases will diffuse even out of the tightest material imaginable. And that is not even mentioning cosmic radiation, or the unknown scarcity of habitable planets.
Of course, those who are so inclined will immediately start hand-waving all those objections away with "technology will deliver, just you wait". Well, firstly we should not behave as if these solutions are available until we can be damn sure they are. Secondly, what techno-optimists and cornucopians alike never grasp is that there are in fact things that are technically impossible and will always be. Sure, we can go to the moon now with a rocket (at tremendous risk and expense). But Jules Verne's moon cannon was never invented, because its technological principle is guaranteed to kill its passengers, simple as that. Some things are not possible yet, and some things are impossible in principle, and it would be foolish to the point of madness to assume that everything we would like to have and do falls into the first category.
Massimo, I'm confused about your claim that transhumanists' arguments are "irrelevant." Your reasoning is that it's already inevitable that we are going to do everything we can to pursue the goals of staving off death and disease. But the whole point of my post was that there are plenty of influential people who believe we should NOT be trying to stave off death.
And you yourself argue against transhumanist goals for other reasons, citing potential negative repercussions, and arguing that our money would be better spent on other things. I don't think you can have it both ways -- you can disagree with transhumanist goals, but then their efforts to change your mind aren't "irrelevant."
Julia, who says I can't have it both ways? That's how I run my life! ;-)
Seriously, I think there are good arguments (including, but not limited, to mine) for why transhumanism is a bad idea.
But I also do think that those arguments are irrelevant because human motivation is what it is, and people will naturally try to extend their lives as much as technologically possible. I still don't think that will bring about the transhumanist scenario, and I still don't think it's a good idea, but a movement in that direction is in fact irrelevant (and anti-humanist in nature, see my previous comment).
"For starters, even if population growth were lowered to 0.1% per generation, we would still grow to the point of starvation eventually."
This is true, but it assumes we're bound to our planet and that we wouldn't lower our birth rate enough to stagnate our population growth. We could also conceivably have such a low population growth rate that the planet would become uninhabitable before overpopulation became a problem.
About priorities, I agree that this should be on the backburner and haven't said otherwise. Also, I'm not a transhumanist, and "having my mind uploaded" would result in my death and a doppelgänger floating around in cyberspace. I don't want that.
I certainly didn't mean cold fusion and I never said cold fusion. We definitely don't have the science required for that. But we do for fusion (see http://en.wikipedia.org/wiki/ITER) and we do for a nuclear powered spacecraft. There are only engineering hurdles in both cases.
Yes, population growth will shoot up dramatically if mortality rates dropped *while* birth rates remained the same. However, I don't see any reason why birthrates wouldn't decrease.
Population growth will reverse. See the United Nations Population Fund, which estimates the world's population will peak around 2050 and then decline.
"[...]travels fast enough will be torn apart by the force of individual atoms in the not-entirely vacuum hitting it with nuclear-bomb-like force[...]"
I'd like to see a source for that, please. Cosmic radiation is more of a problem for traveling within our solar system at low speeds. We also can't provide adequate shielding because this would significantly increase a shuttle's weight, making it even more costly. Building a spacecraft on the moon or in orbit would solve this problem.
It's not that I think technology will deliver, it's more fundamental. I think the human species will adapt in potentially unforeseen ways. My examples are just possibilities to show ways we could adapt. I also don't think we should behave as if these solutions are available until we're sure they are—isn't this obvious? Pigeonholing people into labels like "techno-optimists" reeks of tribalism.
It's been said before: transhumanism is basically an addendum to humanism, motivated by the need to oppose life-hating bioconservative ideas - ideas shared, sadly, even by many humanists!
Massimo and others say there are too many bigger priorities to be met in terms of human suffering before we set off in search of the "holy grail." Unfortunately the "priorities" argument is one that very few people can pull off without being just a wee bit hypocritical - after all, it's not clear that promoting rationality & critical thinking creates more utilons than (say) writing a nice big cheque to StopTB, and yet here we all are... because for various idiosyncratic reasons, we have a particular interest in this topic.
I don't think colonization of Mars is all that far fetched (from a technological perspective at least). We should go there if for no other reason than to avoid the Great Filter. That it helps in overpopulation is a nice side benefit.
Charles, I have no idea what the Great Filter is, but Mars isn't going to solve an earth overpopulation problem, short of terraforming. Talk about far fetched technologies that don't have a prayer of being realized any time soon.
ian, actually, I think a very good argument can be made that more critical and rational thinking would have immense benefits for humanity. Far greater than a quixotic quest for immortality.
hmm, sorry, didn't mean "cold" fusion, just fusion. It is not feasible now, people have tried for decades, without success. That doesn't mean it's impossible, but that wasn't my point. As for why wouldn't the birth rate decrease if the death rate went down: it's called biology, buttressed by religious fanaticism about having sex only the natural way, to create more children of god, etc. etc.
You've changed the qualifier from currently possible to feasible (which to me means possible to do easily). And fusion is currently possible: the National Ignition Facility is gearing up for the first fusion ignition after a successful test (see http://www.sciencedaily.com/releases/2010/01/100129121823.htm). The point I'm making is that there is no gap in scientific knowledge holding us back from being able to do these things, in contrast with cold fusion or strong AI, where there's a huge gap.
"As for why wouldn't the birth rate decrease if the death rate went down: it's called biology, buttressed by religious fanaticism about having sex only the natural way, to create more children of god, etc. etc."
Circadian rhythms are also biology; it isn't natural for a diurnal species to sleep during the day and work at night, yet we do it. It isn't natural for a species to put off reproduction long after puberty, yet we do it. You've fallen into the is-ought problem.
I have a challenge of sorts for all the non-transhumanists here: what do you want for the long-term (~1 000 year) future of humanity? Make your assumptions about technology as conservative as you please.
What do you think of Treder's answer to your critique of transhumanism? I think it is quite convincing, but I will try to return with a more detailed comment after I have studied the FAQ.
"Charles, I have no idea what the Great Filter is, but Mars isn't going to solve an earth overpopulation problem, short of terraforming. Talk about far fetched technologies that don't have a prayer of being realized any time soon."
The Great Filter is a hypothesized answer to the Fermi paradox. (Why have we not seen any "aliens"?) The hypothesis is that a high-technological civilisation (or at least anything but a very short-lived one) is somehow extremely improbable in our universe. (It was originally formulated by the physicist turned economist Robin Hanson, who is a prominent person at the edge of the transhumanist movement, though I think not exactly a transhumanist. Hanson is, by the way, a very interesting and important thinker. An example of the type "has some very good ideas but is not a very careful analyser in general" that I mentioned in my previous comment.) Hopefully humanity has the critical thresholds already in its past, but we cannot trust that that is the case. It could well be, e.g., that almost all attempts to build a high-technological civilisation result in building some technology so dangerous that it kills us all.
Of course terraforming is needed for a real colonisation of Mars. We have ideas of how to do that. Of course we need much more research, but I have seen no good arguments why it should not be possible. I definitely think we should do it. And an important part of the strategy should be (I think) not to force it too much, but to wait until scientific research and technological development in general have given us safer and (very importantly) cheaper methods, rather than attempting it too soon. ("Just waiting" of course also includes continuing to send better and better unmanned missions to Mars and other interesting places in our neighbourhood.) But as I said above, I think we could and should do it rather soon. A few centuries is a very short time in human history. Of course, colonising Mars is not the solution to overpopulation. For that, contraceptives need to be more widely available; we already have the technology, and here religion is the real problem.
"hmm, sorry, didn't mean "cold" fusion, just fusion. It is not feasible now, people have tried for decades, without success."
We have had fusion working for about half a century already, unfortunately only in the form of the hydrogen bomb. But controlled fusion was done in the JET experiment, although more energy was needed to run the machine than was released by the fusion. ITER, now under construction in France, will probably give more fusion energy than the energy needed to run the machine (during the experiment itself, maybe not in total). But the problem is economics: it will probably remain much too expensive a source of energy for a long time.
Projects (actually only sketches) like Orion and Longshot could probably be built with technology developed incrementally from today's. But it would be very expensive. And they are unmanned missions. Manned interplanetary travel can almost certainly be done, but not by just an incremental increase of today's technological capability.
tony, I can't give you a point-by-point rebuttal of Treder (that would take another post, and I think RS has spent enough time on transhumanism for now), but here are some highlights:
* He mentions but doesn't even try to address my contention that regardless of what is wrong with transhumanism, it is irrelevant (Alex noticed the same above).
* About human genetic engineering, he resorts to the cheap rhetorical trick of saying that I should know better (while accusing me of cheap rhetorical tricks). The fact is that no matter what the "official" transhumanist FAQ (really, there is an official one?) says, a major tenet of transhumanism IS the permanent alteration of the human genetic line. On what basis are these people going to evaluate the consequences of such a move?
* To say that transhumanist leaders are "seriously grappling" with issues of fairness sounds nice. I just don't think this "grappling" can be done seriously unless one agrees to postpone transhumanist goals and devote time and energy to fairness issues first.
* Treder can be as "insulted" as he wishes by my criticism that there are much more important priorities than pursuing transhumanism goals, but that's no response at all. You can't "do both," as nice as it sounds, in any meaningful way, because the problems currently facing humanity are so huge and difficult to tackle already.
* I find it positively hilarious that he dismisses my overpopulation objection on the ground that it is "tired" and that, you know, science fiction writers have dealt with it already!
Treder goes on with the cheap rhetorical trick of accusing me of not having done my homework, apparently without stopping for a moment to even consider the possibility that I have in fact read about transhumanism, and have found their answers to several types of criticism wanting.
Enough said for now?
Why, that is nice; the EU, G7 and UN have never asked me for my opinion.
I guess my priorities would be, in order of descending importance:
- everybody on the planet being much better educated than today, rationality being a virtue, and believing things without good reason being a sign of unfitness for any responsible task or position. This is likely a prerequisite for the rest to be possible.
- gentle reduction of global population below a billion or so (to all those who will be horrified: get it into your heads that this reduction will happen no matter what we do, only then without the "gently" - that is what non-sustainable means)
- the general realization that since about the 1960s the industrialized countries have been in a situation where everybody can live extremely comfortably; the further realization that just perhaps we should not have a large number of unemployed people sitting next to another large number of highly qualified people working 40-60 hours weekly to drive an economic engine producing ever new gadgets (giant plasma screen? robotic vacuum cleaner-cum-manservant?), but that increases in productivity should perhaps be used to reduce the workload to, say, 20 weekly hours for everybody, so that everybody feels useful but can spend the copious free time doing the garden, taking the family out to a public bath, volunteering at the local nursing home, or taking a belly dance course. You know, things that make you human instead of a cogwheel.
- the further realization that we have, again since about the 1960s, all the resources and productivity to eradicate poverty worldwide, and that we should finally do that. Everybody should get the very basics needed for life - lodgings, food, clothes, dignity - so that they do not have to rely on their clans or churches when things go bad.
How is that for starters?
Massimo, I'm with Julia on this one. I have "many" objections to your post, but I'll just note a few:
Zeroth: Your comment about chimps is a logical fallacy. I "can" imagine some trans-humanized species with which I "would" like to have sex, even though she (it?) lacks human nature. Not all non-humans are (or will be) chimps.
Second: Perhaps futurists have been mostly wrong, so what? They could get it right this time, it's been known to happen, see Jules Verne!
Third: The economy is a human creation so we haven't been "playing" with it, we created it! And it has been enormously beneficial to us. And we haven't destroyed the environment, not yet anyway.
Fourth: Life has never been "fair"; you live a much better life than most people, even here in the US. Also, any technology can be used for nefarious purposes, like the weaponization of anthrax. Does that mean we should not be involved in medical research?
Fifth: do you know how much R&D money and brains went into your iPad? Should we prioritize that away too? How about fertility drugs? Acne medications? ...
Sixth: Of course there is danger, but there is danger in ANY technology. Somehow we haven't nuked ourselves yet. And probably never will.
Oh, boy, I have so much more, but I'll stop here, at least for now.
Alex: a damn good start.
Out of curiosity, whence the 1 billion number for earth's population? It seems low to me, but I've never researched the topic.
Do you think modern humans are capable of achieving item #1? (Not a rhetorical question, I really wonder).
"I have a challenge of sorts for all the non-transhumanists here: what do you want for the long-term (~1 000 year) future of humanity? Make your assumptions about technology as conservative as you please".
I want energy and food independence. By this I mean renewable energy sources that can be implemented at the individual or small community level. The same with food. I want sustainable food production that can be managed by individuals or small communities (I do not mind having robots helping with food production).
Hopefully it will not take 1000 years to achieve this. Once we have that, I'll take everything else: laser vision, mind uploading, massive life extension, whatever.
You might remember a series of papers on the major fundamental transitions in the history of life that I sent you about a year or so ago (Nature Precedings). Well, I want to surprise you with another ‘transition’ that might be within the scope of your discussion. Please visit a recent website that I released at: http://www.ihumansproject.org. I hope you’ll take some time to reflect on this extraordinary project and that you’ll find it inspiring! In any case, it will be great to hear your thoughts.
Since you're kind of asking a 'what if you had a million dollars' type question, that's how I'll answer.
Transhumanism is at best a fanciful projection of natural desires for health, happiness, and longevity onto our relationship with technology. But I think it makes several crucial mistakes.
1. Technology is deterministic
2. Technology is inherently good, only people make it good or bad
3. Related to 2, technology is at the top of a hierarchy of utility.
4. Technology carries no cultural values
5. Related to 4, technology doesn't need socio-economic filters (that's a
6. Deskilling isn't an issue
7. Death is inherently bad
So, in a thousand years I'd like to see humanity recognize their inseparability from technology without falling into any of the related traps listed above.
I'd also like to see us come to an agreement about how to live in respectful relationships with each other, animals, ecosystems etc. while keeping up discourses on technology, science, art, etc.
The sad thing is, and I genuinely don't mean offense, that you probably don't see technology's separability from science, or art for that matter... or to put it better, its current, previous, or future mutability.
Do you even know the why of the how you came to be on a computer (presumably) having a discussion of sorts with people from across the globe?
And since apparently no one looked at it, I would suggest checking out the video link I posted above. Mr. Warwick's views are the very reason why opposing even innocuous transhumanism is as important as opposing religion.
I apologized on this blog for my comparison between religion and transhumanism before, but that was a mistake. It is a religion without a deity, but with none of the patience of Buddhism.
I'd like to hear transhumanists' perspective on the feasibility of brain implants. I know transhumanists have a vision of mind uploading. But any ideas on how we would actually do it?
Benny, my friend, there are so many things wrong with your response... Here is a sampler:
1) The chimp analogy is not a fallacy (let's not throw that word around too easily, it has a precise meaning). The question is not whether you wouldn't mind having sex with a "superior" genetically engineered individual, it's whether she would. And I bet she would, simply in virtue of her self-perceived superiority. (As I said, I consider transhumanism a stupendous exercise in narcissism....)
2) Yes, futurists might get it right some time. But it is rational to look at their track record, just like you would do in picking a dentist. And given their track record, I wouldn't entrust your teeth to that bunch, if I were you. Besides, as Alex pointed out above, Verne was actually wrong: we didn't go to the moon using a cannon; and we will never go to the center of the earth. (Incidentally, read Verne's last novel, Paris in the 20th Century, for a much bleaker view of the future by the French novelist.)
3) Of course the economy is a human creation (where did I say otherwise?). But if you don't think we have been playing dangerously with it of late you have not been paying attention.
4) Of course life is not fair. It is an ethical duty of human beings to make it slightly more so. Unless you are a moral relativist or a selfish bastard, and I know you are neither.
5) Ah, the iPad. Benny, I'm not as naive as to imagine that it would be possible - or indeed even desirable - to prioritize all our effort, no matter what the field. Besides, I love my iPad. But suppose we were talking about a huge amount of resources to be pumped into the search for extraterrestrial intelligence, or a trip to Pluto. We would (and should!) be having that discussion. And what the transhumanists want to achieve - assuming that it is even possible - is on a much larger scale than that.
6) Of course there is danger in every technology. Does that mean we shouldn't be having a conversation about it? If someone proposed to put billions into developing a type of weapon that could wipe out life from the planet, and that technology had no other application, wouldn't we want to talk about it? Oh, right, too late...
Massimo, you said: "I think there are good arguments (including, but not limited, to mine) for why transhumanism is a bad idea. But I also do think that those arguments are irrelevant because human motivation is what it is, and people will naturally try to extend their lives as much as technologically possible."
OK, I see; I didn't realize you were including your own arguments under the "irrelevant" heading too. So if I'm understanding you correctly, it sounds like you believe *any* argument about life extension is irrelevant because people are going to do it anyway.
I'm skeptical of that -- do you really think that if the majority of the President's Council believes life extension would be a bad thing, that has no impact on what the administration decides to fund/allow?
Our government frequently makes decisions about what kind of research to allow, or not to allow (see: stem cell research), and individual scientists make decisions about what kinds of research to devote their careers to. You don't think that public argument about the desirability of life extension affects those decisions? I'm skeptical of that, but at least I understand your argument better now.
Julia, you are presenting my statement as a caricature. No, of course I don't think that every argument about transhumanism is irrelevant, and certainly not my own!
I just think that IF some of what transhumanists want to do is possible, it will happen anyway, one way or the other. Your examples are good ones: stem cell research is still going on, in other countries, or in the private sector. So, yes, what Congress or the President think is obviously relevant, but only locally.
Besides, a lot of the research relevant to transhumanist goals is basic research that is being conducted by independent labs, many that are in fact federally funded. For instance, research on life extension and aging. It is the transhumanist movement that is irrelevant to all of this, not the arguments or the research.
Massimo, I truly didn't mean to caricature your argument -- I was just reading directly from your quote. You said "I think there are good arguments (including... mine) for why transhumanism is a bad idea. But I also do think that those arguments are irrelevant."
Did you forget the rest of that quote? If you stop it at "irrelevant" it makes no sense, if you wait for the explanation, it's a bit more intelligible... ;-)
No, I didn't forget the rest of your quote -- I included it in my previous comment, the one at 11:18.
You argued that stem cell research is still going on in other countries, or in the private sector. Sure. But you have to admit that prevailing attitudes about stem cell research in our country reduced the rate of progress in the field. Transhumanists would say the same about life extension research (and given how fast people are dying, they would argue that even small delays matter a lot).
Julia, I'm not making any absolute statements here. Yes, of course if the US Government opposes a particular type of research then progress in that area will be slower. But my basic point is that a movement in this area is (largely?) inconsequential, not that the US Government doesn't have any effect. Ironically, this is because people are already doing the sort of research that transhumanists advocate, and transhumanists themselves don't bring anything tangible to the table - as far as I can tell - in addition to what is already being done in countless biology and computer labs.
I do not know what precisely would be the maximum number that is sustainable. On the other hand, it seems sensible to stay at least a bit below it to buffer against crises. Also, the less numerous our descendants are, the more comfortable can their individual lives be for the same ecological footprint.
As for your other question, I currently have a much dimmer view of human nature than I did in my youth, and many who know me would describe me as a pessimist. But then, there seems to have been a lot of progress towards increased education and rationality over the last few hundred years, so maybe that pessimism is unwarranted.
It is really hard to make predictions about the future, which is also why I would concentrate on desirable social developments instead of expecting specific technologies to be developed. Bringing us back to the original post, of course.
After rereading Julia's and Massimo's posts, as well as the comments and, in particular, taking a look at Treder's response, the Transhumanism FAQ, and some of the discussions Treder posted (such as the one between him and Jamais Cascio), it's become evident to me that no further breakdown of Massimo's arguments is necessary.
Massimo, with all due respect, and I say that sincerely as a fan of your new book and of your blog, you are straightforwardly wrong on this one, and most of the arguments you've presented are so weak I'm not even sure they deserve the dignity of being called "arguments" - several of them aren't even recognizable as criticisms of transhumanism!
I think it's worth quoting one portion just to fully consider it:
“You do the math: unless transhumanist immortality comes with mandatory castration (try to sell that to your fellow human beings!), it would take a very short number of years to cause a planetary collapse due to overpopulation.”
This is a false dichotomy at its worst: exhorting the reader to reject transhumanism because to accept it would necessarily entail also accepting either "planetary collapse" or "mandatory castration" - if insisting the only options open to transhumanists are coercively hacking off everyone's genitals or else face the apocalypse isn't hyperbolic absurdity at its finest, I don't know what is; and to evoke a particularly emotionally charged false dichotomy is especially loathsome for a philosopher. This is sophistry, and you do a disservice to everyone holding a PhD in philosophy to brazenly commit so egregious a violation of reputable discourse.
This sort of nonsense is beneath an otherwise respectable and serious philosopher and defender of reason. It's one thing to simply present bad arguments in a sincere effort to offer a critique, but it's especially deplorable when they're presented with the sort of cavalier attitude that permeates your post. Indeed, your entire post comes off as smug, condescending, dismissive, and, to be frank, half-assed. Why bother bringing up a topic for discussion if you don't even intend to treat it seriously and fairly? Massimo, you owe an apology to all those readers who expect a person running a blog called “Rationally Speaking” to...speak rationally.
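For what it's worth, the arithmetic behind the quoted overpopulation claim can be made concrete with a toy projection. This is not from either post: the starting population and per-capita rates below are illustrative assumptions, and the model (constant rates, no age structure) is a deliberate simplification — it only shows how removing mortality while holding the birth rate fixed changes the growth trajectory.

```python
# Toy population projection (illustrative assumptions, not from the post):
# compare growth under a normal mortality rate versus zero mortality
# ("immortality"), with the birth rate held constant in both scenarios.

def project_population(pop, birth_rate, death_rate, years):
    """Project population forward with constant per-capita annual rates."""
    for _ in range(years):
        pop += pop * birth_rate - pop * death_rate
    return pop

start = 7e9            # rough world population (assumption)
birth_rate = 0.019     # ~1.9% births per person per year (assumption)

normal = project_population(start, birth_rate, death_rate=0.008, years=50)
immortal = project_population(start, birth_rate, death_rate=0.0, years=50)

print(f"after 50 years, normal mortality: {normal / 1e9:.1f} billion")
print(f"after 50 years, zero mortality:   {immortal / 1e9:.1f} billion")
```

Under these made-up numbers the gap after 50 years is a few billion people, not an immediate collapse — which is roughly the point of contention between the two sides: the direction of the effect is simple arithmetic, but its severity and timescale depend entirely on the assumed rates.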
With all due respect, I don't owe anybody any apology. I think my arguments are both strong and pretty convincing, and that there is no logical fallacy involved. You disagree, fine; I'm not expecting an apology from you, am I?
Massimo wrote ...."Fortunately, I don’t really think transhumanism is a threat to anyone, just like no futurist has ever been. These movements are populated by naive optimists with a fairly high degree of narcissism, but they are otherwise mostly harmless."
I think you are wrong. It's a threat to digital art students like me who want to study the techniques and methods of making art (or take some relevant computer science courses). It's another kind of academic fashionable nonsense, like post-modernism, that will waste students' time and money. Some professors will push BS courses into the curriculum.
See an example here: a call for post-humanism here in Chicago.
Of course, if you don't subscribe to this BS movement, there's a chance that your work will never be seen or heard.
I read your post and I agree with some of the points you made. The followers of this so-called Transhumanist movement may be optimists indeed, but the pioneers spearheading this operation are much more than optimists. There is a lot of money backing these optimistic dreams, and for that reason I believe it is plausible that some of their future predictions will appear in our lifetime. I am actually speaking a lot about the subject on my own blog: http://reedshakinginthewind.blogspot.com.br . I am not a luddite myself, but I am a person who can connect the dots and follow the money. If you do those things and you see the sincerity in the hearts of transhumanists and the pioneers, you can paint a different picture, to say the least. Check out my blog and comment if you wish. Thanks. (Reed shaking in the wind)