James Miller has an interesting-looking new book out, Singularity Rising: Surviving and Thriving in a Smarter, Richer, and More Dangerous World. I haven’t had a chance to pick up the book yet, but I did listen to a very engaging conversation about it at Surprisingly Free. Miller is a true believer in the Singularity: the idea that at some point, anywhere from the next quarter century to the end of the 21st, our civilization will give rise to a greater-than-human intelligence that will rapidly bootstrap itself to yet higher orders of intelligence, in such a way that we are unable to see past this event horizon in historical time.
A few hundred thousand years of human evolution is arguably worth risking for billions of years of humanity’s future. Also, I’d be happy to informally discuss the ethics with you. The poor unfortunately will lag behind, but look at cellphones in developing nations; they will join us surprisingly quickly in these incredible new technologies. As for the biosphere, we place it on par with the thousands of other issues which can and will be solved by stronger minds. I, and some smarter folks than myself over at Less Wrong, would argue that the Singularity, done right, is ethically the best thing humanity has ever done. Hard not to be a little proud of fighting to make it happen. Party on!
Posted by SHaGGGz on 01/12 at 04:07 AM
This article is riddled with such simplistic and ridiculous caricatures of what singularitarianism entails that it’s hard to believe it was written by a sf author who is presumably familiar with the concept and movement. This is particularly puzzling when considering that he is published on IEET, the foremost proponent of democratic transhumanism (which recently published a poll finding roughly equal parts liberal and libertarian ideologies among transhumanists), yet asserts that singularitarians “are libertarians” uninterested in “the painful process of debate, discussion, and compromise,” ignoring countless organisations such as this with exactly those goals. Aside from a few fringe cases, they do not “take it on faith that everything will work out as it is supposed to.” Even the poster boy Kurzweil, always accused of unrealistic optimism, acknowledges the need for significant debate, that there will be problems, and only gives humanity a “better than even” chance of surviving. “Would you risk the continued existence of the entire human species if the payoff would be your own eternity?” is not the right question to ask. These technologies are coming regardless of whether you feel they are worth the risk, and sticking your head in the sand because their implications make you uncomfortable is not going to help matters any.
Respectfully, the problem I see with the singularitarians is precisely their willingness to risk not just their own but all of our futures in pursuit of their goals. Your “few hundred thousand years of human evolution” is the life of not just myself, but my daughters, indeed all human generations to come, should the most dismal predictions about the Singularity come to pass, predictions which, as evidenced in my post, are often blithely brushed aside. At some point in the future I would be more than glad to debate and discuss these issues with you. Perhaps it is something we could do over Skype?
I am not sure how to square your statement that “These technologies are coming regardless of whether you feel they are worth the risk…” with your assertion that transhumanism is a democratic movement. Certainly all sorts of transhumanist goals, such as neural implants for the physically healthy, can, if we so choose, be subject to public regulation just as drugs or steroids are today. If I were “sticking my head in the sand” I would not be arguing that we need to have a serious public debate on these issues, as I hope we are engaged in right now.
Posted by SHaGGGz on 01/13 at 08:55 PM
You can square those statements by recognizing that regardless of whether we develop these technologies through democratic means, the self-amplifying nature of the technologies and the game-theoretic dynamics of our global civilization mean that it is very unlikely that these technologies will be suppressed. Also, I didn’t assert that transhumanism is a democratic movement; I rebutted your claim that it is a libertarian movement. It is a diverse movement and will only get more so as it creeps ever further into the mainstream.
I am crossing my fingers (and doing my small part) to help make transhumanism a more democratic movement in the sense that it should be open to dialogue with those who do not share its premises.
Above all, I would like us to move away from this false technological determinism in which we are somehow fated to arrive at one destination or another. We make our choices and decide our future, at least for now.