Singing the Singularity
Mike Treder   Jul 16, 2009   Ethical Technology  

Like many a useful concept, the Technological Singularity has become over-invested with emotion, ideological leanings, and tangential agendas. Can its value be recovered?

On October 3, 2009, the fourth annual Singularity Summit will convene, this time in New York City. Among the speakers featured in the two-day event are IEET fellows Ben Goertzel and Aubrey de Grey, along with Ray Kurzweil, Anders Sandberg, Robin Hanson, Eliezer Yudkowsky, Greg Benford, and many others.

So what’s it all about?

Humming the Tune

Organizers of the Summit describe the Singularity as “an ‘event horizon’ in the predictability of human technological development beyond which present models of the future may cease to give reliable answers, following the creation of strong AI or the enhancement of human intelligence.”

Our IEET Encyclopedia of Terms and Ideas offers this definition: “The Singularity is a theorized future point of discontinuity when events will accelerate at such a pace that normal unenhanced humans will be unable to predict or even understand the rapid changes occurring in the world around them.” It is assumed, usually, that this point will be reached as a result of a coming “intelligence explosion,” most likely driven by a powerful recursively self-improving AI.




In his book The Singularity is Near, inventor and futurist Ray Kurzweil says:

What, then, is the Singularity? It’s a future period during which the pace of technological change will be so rapid, its impact so deep, that human life will be irreversibly transformed. Although neither utopian nor dystopian, this epoch will transform the concepts that we rely on to give meaning to our lives, from our business models to the cycle of human life, including death itself. Understanding the Singularity will alter our perspective on the significance of our past and the ramifications for our future.

And, finally, the Summit organizers write: “While some regard the Singularity as a positive event and work to hasten its arrival, others view the Singularity as dangerous, undesirable, or unlikely. The most practical means for initiating the Singularity are debated, as are how, or whether, it can be influenced or avoided if dangerous.”

Dubbed by some, disparagingly, as a “Rapture of the Nerds,” the concept of the Technological Singularity is often derided as wish fulfillment for perpetually adolescent (and typically male) sci-fi dreamers.

Unfortunately, such dismissive criticisms, deserved or not, tend to draw attention away from the beneficial aspects of futuristic speculation.

We’ll explore those benefits, but first let’s consider the downsides of ‘singularitarianism’, defined by one of its leading advocates as the belief that “the Singularity is possible and desirable.”

Hitting the Wrong Notes

Insisting on the possibility—or, even more strongly, asserting the inevitability—of an uncertain and debatable but incredibly momentous event leaves proponents vulnerable to a charge that they lack rigor and discipline in their thinking, that they have fallen prey to hopefulness and shed any semblance of healthy skepticism. If they cannot restrain themselves from heartily endorsing an unprovable proposition, then what credibility have they for other declarations or recommendations they might make?

This is, of course, a familiar dilemma for many futurists (especially transhumanists like myself) who choose to speculate about matters both unclear and controversial, and who can’t do what we do without risking criticism and even mockery. The fact that these areas are still hazy should not restrain us from trying to shine a light ahead into the mist, but at the same time an emphasis on clear thinking and scientific rationality is greatly advisable.

More troubling is the suggestion by some (all?) singularitarians that the outcome they seek is not only possible but desirable. Given the substantial amount of uncertainty—which they themselves admit—surrounding the nature and impacts of such an occurrence, it seems imprudent to stamp the Singularity as unquestionably “a good thing.” Worse yet, some who proudly say they’re working to bring about the Singularity have the temerity to proclaim that they alone hold the keys to making it a “friendly” event. Besides sounding childishly naive, such a claim also invokes the specter of technocracy: if only all the big issues of this world were left to the few really smart people to solve, everything would turn out fine. It’s implied, moreover, that meddling democrats and pesky government regulation will only slow things down and might even prevent the smart singularitarians from saving the day.

In view of all the damage done by a self-labeled “best and brightest” during the last few centuries, it’s surprising that this kind of doctrine still attracts adherents.

Those who promote the idea that a Technological Singularity is not only possible and desirable but that its advent can be hastened through our efforts must be aware of the obvious parallels between their own beliefs and those of Christian Millenarians. And in addition to courting ridicule, this “bring it on” position tempts an assumption that whatever problems we have today will be fixed soon enough, so why worry about them now? Poverty, disease, malnutrition, war, pollution, climate change—all these will be simple matters for the superintelligence that we (or, rather, they) are about to create.

Finally, the proposal that a Singularity can be managed for “friendliness” seems hopelessly hubristic. Prometheus did indeed steal fire from the gods and we have used it in countless beneficial ways for humanity, but we have also wielded its power to kill millions of people down through the centuries and continuing today. Nearly every significant technological advance has been dual-use, providing both opportunities and threats. We have not yet learned how to tame ourselves and our impulses well enough to prevent horrible things from happening with the aid of technology, so why should we assume that we will do so in the near future?

Getting Back in Key

Having said all that, it’s important for me to state that: 1) I do think it’s reasonable to expect some sort of intelligence-driven discontinuity, probably before the end of this century; 2) we are well-advised to learn as much as we can about this possibility; and 3) we should seek whatever ways there might be to influence events toward a mostly positive outcome. I differ from the singularitarians in my skepticism about our chances of keeping complete control over what will happen, and I certainly try to separate myself from those who offer a blithe assurance that since future technologies will easily solve our big problems, we can comfortably ignore them today.

If, however, futurists are able to restrain themselves from unbridled techno-optimism (as well as from cynical techno-pessimism), if they can maintain a healthy skepticism toward unsubstantiated claims, if they can promote and demonstrate reliance on scientific rationality, if they will assiduously resist the siren call of technocracy, if they understand and emphasize the differences between their own work and the fantasies of religious rapturists, if they strive for humility and an honest recognition of their own human limitations, and most of all, if they will recognize that nothing is certain and that working to relieve suffering today is every bit as important as chasing the promises of future technological potential, then discussion of a Singularity and its pros and cons is a worthwhile effort.

Mike Treder is a former Managing Director of the IEET.



COMMENTS

Really interesting, but a healthy skepticism is always necessary, because the BIG change maybe is gonna be a really sloooow change. If you know what I mean…

What I really doubt is that the Singularity will be wholly ‘intelligence-driven’; intelligence is only one aspect of cognition. The dogma of Sing Inst is that intelligence alone is sufficient to drive recursive self-improvement, but I think this assumption should be questioned more, Mike.  What other cognitive elements may be required?

(My own strong suspicion is that reflection and consciousness may prove to be important).

As Michael Vassar amusingly suggested on ‘Less Wrong’:

“Almost everyone really is better than average at something. People massively overrate that something. We imagine intelligence to be useful largely due to this bias. The really useful thing would have been to build a FAS, or Friendly Artificial Strong. Only someone who could do hundreds of 100 kilogram curls with either hand could possibly create such a thing, however. (Zuckerberg already created a Friendly Artificial Popular)”

People should probably focus on the aspect of cognition that they are best at - for me personally, that’s not rational intelligence, but rather things like analogy, narrative, ontology, and reflection.

For the super-high IQers it’s probably appropriate to focus on IQ, but what they should not do is make the mistake of thinking that IQ alone gives them the keys to history.

Hi Mike, great article. I will try to attend the Singularity Summit in NYC this year, will you be there? I look forward to seeing you guys.

Would a Singularity be a “good thing”? Well, from the point of view of a child, is growing up a good thing? I think in most cases it is—basically life is what we make of it, and some people screw it up, but I (wish to)

Edited excerpt from a comment I wrote on Max More’s blog:

http://strategicphilosophy.blogspot.com/2009/06/how-fast-will-future-arrive-how-will.html

I never believed too much in a hard takeoff, exponential Singularity. As an engineer, I know that the real world is not simple, clear and pristine like a mathematical equation, but often messy, chaotic and greasy. So while I do expect an overall exponential trend, I expect one with roadblocks, false starts, backsteps, etc., which will result in a fractal curve rising at a rate halfway between linear and exponential.

Concerning optimism, rapture etc.

For the reasons above, I always found Ray Kurzweil’s predictions way too optimistic and “clean”. But I also find them refreshing against the often overcautious and defeatist attitude of other writers. I am not as optimistic as Kurzweil, but I see him as the bard, the storyteller, the provocateur who shows wonderful possibilities to the rest of us. This is something good and, of course, things will be what we make of them.

I am sometimes troubled by the casualness with which some people talk about the potential of a singularity. First and foremost I would like to remind everyone that if a singularitarian “transitional phase shift” is possible (and self-generating) it will be diverse. In dramatic times a small fractional difference will have amplified results; likewise I fully anticipate any potential singeys to be fully random.  It’s like rolling four dice and multiplying them, where so far we have been adding them up.

It gets very weird and very wild results.
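As a purely illustrative sketch (a toy model, assuming ordinary six-sided dice), a quick simulation of summing versus multiplying four dice shows how much wilder the multiplied outcomes are:

    import random

    # Illustrative sketch of the dice analogy: summing four dice clusters tightly
    # around the middle, while multiplying them gives a wildly skewed spread.
    random.seed(0)

    N = 100_000
    sums, products = [], []
    for _ in range(N):
        dice = [random.randint(1, 6) for _ in range(4)]
        sums.append(sum(dice))
        prod = 1
        for d in dice:
            prod *= d
        products.append(prod)

    print("sums range:    ", min(sums), "to", max(sums))          # 4 to 24
    print("products range:", min(products), "to", max(products))  # 1 to 1296
    # typically well over half of the products exceed 24, the largest possible sum
    print("share of products above 24:", sum(p > 24 for p in products) / N)

Addition on a bounded scale stays tame; multiplication compounds small differences into enormous ones, which is the point about amplified, unpredictable outcomes.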

Think about that for a second. We have no way of knowing whether, or which, values we cultivate today will be translated beyond the singey. I think they will be translated in very stark terms.

I think that if we have fundamentally hardwired values in our economies, hardware architectures, defined goals and paradigms, these will not only be retained by the systems beyond the singey, but also self-amplified into potentially grotesque and nightmarish shapes.

Imagine if Vista becomes a permanent fixture of the postsingularity world!

Can we afford to enter a singularitarian era with free-market capitalist Darwinism as the ruling paradigm? Can we afford to unleash a take-off on a world where excluding billions from any level of humane existence is the norm? Can we even dare to beg for consideration from whatever is in charge after a singularity when right now we have a unique talent for spamblocking the screams for help in the current era? Can we afford to have something like the Pentagon - or Big Oil - or polemic, feudal, dualist, divisive, corrupt politics in general - shape transcendence?

As a simpler example - how many things could have gone slightly differently when we were developing the internet? A sick day, another job interview, a slightly different stock exchange rating - all these things could have aborted or delayed the emergence of the global internet, or could have given us a completely different kind of network interaction.

There is no reason to assume that right now we live in the best possible world the Copenhagen interpretation allows us. Sure, in this timeline we have World of Warcraft, but we also have ringtones…  what roads did we miss?

I do not want to end up in a world where petty envy, tribal contempt, racism, avarice, corporatism, competition all go nova and raise demons.

I expected more of Mike Treder than either being

A) unaware of what singularitarians actually think

or more likely

B) misrepresenting what singularitarians actually think since he finds misrepresentation politically useful.


Insofar as by singularitarians we mean the folks holding SIAI in high regard, I’m not aware of a single person assuming the singularity will necessarily be a good thing. Not one!

It’s quite weak that Treder is quoting from an outdated declaration Eliezer Yudkowsky wrote 10 years ago as a teenager, and claims that it represents the opinions of “a leading advocate”. Eliezer has made it very clear he does not agree with his teenage self, who indeed managed not to see some risks. (At one point, Treder implicitly shows he knows this.)

Really, it’s hilarious that the definition used to characterize singularitarians in this article comes from a teenager from 10 years ago. Couldn’t find anything more up-to-date?


At the very least, it would be nice if Treder managed not to blame the very same people (yes, you are referring to the same people) for

(1) assuming the singularity will *necessarily* be a good thing

and

(2) being hubristic in claiming that if carefully managed, the singularity *can be* a good thing.


Politics truly appears to be the mind-killer, since Treder is sinking this low.

“What, then, is the Singularity? It’s a future period during which the pace of technological change will be so rapid, its impact so deep, that human life will be irreversibly transformed. Although neither utopian nor dystopian, this epoch will transform the concepts that we rely on to give meaning to our lives, from our business models to the cycle of human life, including death itself. Understanding the Singularity will alter our perspective on the significance of our past and the ramifications for our future.”—Kurzweil

If one were to list factors that would irreversibly transform the concepts that we rely on to give meaning to our lives, and our perspective on the significance of our past, I would put advanced technology much further down the list than our religious beliefs (or lack thereof).

Mike, I must say that I don’t see many traces of this “dogmatic singularitarianism” in current writings. Most people are saying that a Singularity _may_ happen, and that it _may_ be a good thing. Two statements I can agree with (though I think it may not be that easy, as in my previous comment).

In these comments there has not yet been mention of Michael Anissimov’s thorough reply to Mike Treder:

http://www.acceleratingfuture.com/michael/blog/2009/07/making-sense-of-the-singularity/

Recommended reading for those who want to know what singularitarians actually think.

@Khannea Suntzu:

The concept of willing exchange of goods and services does not imply avarice, heedlessness, or viciousness. These negative qualities arise from shortcomings in individuals and are seen just about anywhere in the world you look, although their prevalence is affected by cultural norms.

Our drives and instincts, and what might be called the moral “set point” within us, need adjusting. As we come to understand the brain better, this possibility gradually begins to sound less far-fetched. If this change can take place—willingly, of course—I have no doubt that we will see far more peaceful cooperation, and without the need for elaborate constraints on freedom, whether intellectual, medical, political or economic.

Michael Anissimov’s thorough reply to Mike Treder

Too defensive imo. Also, Mike does have some valid points, though I would agree with you that he is basing his assessment on outdated material.

Seems to me that there is a real lack of self-awareness in most of the defensive S^ hand-fluttering about Mike’s comments.

In the first place, about 10,000x more people are fans of “The Singularity” because they read about it in Kurzweil’s work than because they participated in some SIAI chat forum. So to dismiss criticisms of the popular appropriation of Kurzweilian ideas and insist that they do not address the alleged sophistication of the 30 people in your local cell is like some member of the Rosicrucians insisting that Dawkins didn’t take them into account in his dismissal of Christian credulity.

Secondly, it doesn’t matter that Eli discovered that being an uncritical fan of the Singularity - whether it meant apocalypse or paradise - was stupid, because the vast majority of fans of the Singularity have not yet come to that realization, and his own fanboys are still extremely confused about the point. Anissimov excepted, since Michael has wrapped his mind around the idea that global technological regulation and bans are in fact warranted if the alternative is the end of intelligent life on Earth. The vast majority of fans of the Singularity are simply that, fans of the Singularity. They want to “bring it on!”, and they generally dismiss the possibility of public policies to mitigate AI risks.

But the much bigger problem is that the whole Singularity debate is weighed down with parochial assumptions about how, where and when >H AI might occur.  For instance, Ben Goertzel says that it is possible that >H AI, or at least forms of a-life, might emerge from the undesigned ecology of the global Intertubes. If that is the case, friendly AI is at best a strategy for beating the emergent a-organisms to the evolutionary punch.

There is also the presumption of uncontrolled AI boot-strapping, which assumes that every a-life will have the same impulses to global mastery that the average 20-year-old male Singularitarian has.

>H AI is a possibility, both for good and ill. But the existing “Singularitarian” community is very narrow and parochial in its worldview. It can benefit from a great deal more dialogue with adjacent tech policy fields, such as - first and most importantly - the cybersecurity community.

The Summit looks male-centric and not actually cross-disciplinary.

(yawn)

Yep.  The Summit is a boys’ treehouse and a non-event in terms of biology.  My non-quantum microtubules are non-excited.

Good question, but no, I don’t think so.  An empirical study will reveal that the Singularity is saturated with maleness. 

Nonetheless, I can easily think of several womanly voices that have enough chutzpah to bring an exciting dimension to the venue, including myself, and of course Athena.  The Singularity has far more curves than the straight, erect line it is trying so darn hard to expose.

In fact, for the Singularity to approach a wider discourse and public appreciation (which obviously is what those in charge are trying to do with all the business executives, etc.), it would be wise to include more rounded voices.

You know what the official line will be: “We couldn’t find any qualified ‘females’.”  Considering how many women people like this tend to know, they may well be telling the truth. Then again, looking at some of the participants, you wonder how high the qualification bar was. 

As for the gung-ho Singularitarians, they’ll grow up.  Or not.  If not, they’ll look more pathetic at 50 than they already look at 20.

@James: I think we should make a difference between “uncritical” and “enthusiastic”; it is not the same thing. I can be very enthusiastic about a planned camping trip to a cold wild region, but not so uncritical that I don’t take warm clothing and survival gear. Looking forward to a challenge with drive and pleasure does not necessarily mean not acknowledging the difficulties.

@Natasha and Athena: what do you propose to do about it?

On 20- and 50-year-olds: I am over 50. While I am probably wiser than when I was 20, I am certainly much less energetic, open, enthusiastic and vital. I am not sure the overall change is good. If energy without wisdom is useless, wisdom without energy is also useless. What I wish, of course, is to recover the energy without losing the wisdom. I submit that we should also wish the same as a group. I often think that 20-year-olds are always right by definition.

The harsh conclusion anyone - everyone - will face is that, while life is good for some, it is unacceptably bad for far too many. Sure, dismiss that simple fact, but everyone will have to face the dark end at one time, and when you do, you’ll be shamed into silence.

So don’t! Hope for a singularity might be akin to hope for collective technological electroconvulsive therapy - frankly, the world as it is has very little to lose. People have very little to lose - hell, even the ecosystems have very little to lose. It isn’t a picnic for most of life. So there is a huge number of people who, when faced with a collective reset, would assume they’d have it better than they currently do.

So all of the idealists here - do not respond to this with that same SSRI/cocaine smile extropian optimism I have seen so often. Instead, look into the dark end and try to sustain that Scientology veneer of Stepford borg optimism in the African Congo, walking past a market selling gorilla bushmeat and lip-amputated and gang-raped refugees. And then tell me “existence is wonderful” without flickering eyelids.

I don’t care if a singularity is bad. Even if we’d have a terminal one tomorrow, “75% likely to kill everyone” if I pressed this big red button here, I’d have hammered it minutes ago.

I say bring it on.  I’d rather have an existential reset than more of the same.

So face it - there are people with a completely unorthodox and potentially terrifying approach to a singularity. In 15-25 years time, when the idea of a ‘powderkeg transition’ is a lot more plausible, humanity will have to live with a minority who will prefer gung ho over status quo.  What are you going to do, outlaw desperation? Introduce color coded warning charts over despair? Outlaw favelas?

At some point no amount of well-meaning and sincere Anissimovian warnings will save us, and someone in a garage will throw that switch. The only way to make sure they don’t is to make sure they have something to lose.

The future will be written by the Niko Bellics of the world.

Giulio, I think it only comes across as defensive because you disagree with some of the points.  How could one not be defensive at being called “childishly naive”?

Mike and Natasha, we are seeking a female speaker, but it’s a simple fact that there aren’t that many female researchers in the fields of artificial general intelligence or decision theory.  Would you rather we picked a token female with nothing to say than one with the proper credentials and original ideas?  It’s worth noting that out of the three current full-time SIAI researchers, one is female, and that the orchestrator of the SIAI Summer Fellows Program is female (Anna Salamon).  I myself know many female researchers, but these fields (AGI/decision theory) are mostly male-dominated, and that isn’t our fault.

James, the majority of people seriously devoted to the Singularity idea take a much more nuanced view than the Kurzweilian one.  It’s false that the majority of “Singularity fanboys” in the (large) circle of SIAI supporters are against any form of policy or regulation.  SIAI intern Carl Shulman recently presented on the idea of “arms control and intelligence explosions” at the European Conference on Computing and Philosophy, for instance.

As for “uncontrolled AI bootstrapping”, that idea is based on technical arguments that really have nothing to do with politics.  Read Omohundro’s “Basic AI Drives”, for instance.  It would make sense to criticize these arguments on their own grounds, but dismissing hundreds of pages of game theory-based arguments as being derived exclusively from hidden political submotives is merely a reflection of your own tendency to view everything as politically based—it’s not really the way that people who study game theory or decision theory necessarily think.

It seems like there is some ageism with regard to the political-transhumanism and Singularity-transhumanism clique differences.  The average age among the former seems to be 50 and among the latter, the late 20s.  The difference might be that one group views transhumanism in the lens of the politics of their youth and the other views politics in the lens of the transhumanism of their youth.  I am an unqualified supporter of the latter.

Natasha, let me clarify that I don’t think that you or Andrea would be token speakers, it’s just that your fields are not part of the focus for this Summit.

Khannea, you seem to have a consistently dismal view of the current world and the future, which seems to be mostly related to your personal situation, which I admittedly don’t know a lot about, but it seems to be dismal since you keep saying so.

And Mike, of course we have tried to secure a female speaker.  It’s disappointing that you don’t even have that confidence in us.  Out of the 6 official paid SIAI employees (Yudkowsky, Vassar, Anissimov, Rayhawk, Salamon, and Isaac), 33% are female.  That is a higher ratio than the percentage of IEET fellows who are female.

@Michael: actually I agree with most of the points in your post, but you do sound defensive, to my ears at least. Mike disagrees with Nick on some issues, so what? I am also in the IEET and at times I disagree with both. Friendly disagreement is better than artificial consensus imo.

For example, I often criticize the current wave of Kurzweil-bashing and Singularity-bashing popular among “mature” and “responsible” transhumanists. If naive uncritical enthusiasm is extreme, over-cautiousness without enthusiasm is also extreme.

@Mike: “The point is that the world is a messy place, and technology alone—no matter how superior it might seem—is only one factor among many that will determine our fate.”

On this point, I agree with you 100%.

The posts that followed mine made my points so eloquently that further elaboration from me would be superfluous.

Regarding the SIAI, all seven directors are male, all but one of its eight advisors are male and only one of its five board members is female.  To say that a third of its coffee-makers are female only highlights the power discrepancy.

Ditto, by the way, for the IEET: All five directors and both senior fellows are male, and only one-quarter of its sixteen non-senior fellows are female.  Neither SIAI nor IEET even meets the notorious “one third” rule.

And it’s Athena, Micheal.  I realize it may be hard for unenhanced mortals to write and pronounce it correctly, but then what’s a superior IQ for?  As for the Summit, I never wanted to participate.  In my case, it would be like an astronomer appearing in an astrologers’ convention.

I thought the gender war was a thing of the 80s.

I tend to be gender blind. To me, the type of genitals a person has is irrelevant in most situations, actually in all situations except one.

Michael should have written your name correctly—but also your astrology comment does not sound nice.

There were many “things of the 80s” that became unfashionable, Giulio, when Bush et al took the helm and brought the world back to the brink of medievalism.  Why not take this discussion up with Barack Obama?  He, like me, seems to think that women aren’t quite there yet.

You should see what happens when reviews are made truly gender-blind.  Examples: orchestra players auditioning behind a curtain; grant research plans, fiction submissions and university admissions judged with names removed.  I’ve had direct experience of the latter three, by the way, not just statistical knowledge.

First outcome: The number of women immediately skyrockets, giving the lie to arguments that “there just weren’t enough qualified female candidates”.  At Harvard, the student ratio went from 1:7 to 1:3 within ONE year.

Second outcome: Comfortably ensconced insiders get either genuinely furious (“I don’t give my money to Harvard so that it can spend it on a bunch of girls”) or pseudo-concerned (“Implementation of affirmative action could diminish the institution’s reputation” or “Such concerns are passé—we must focus on real meritocracy.”).

Bottom line: we may muddle through as a species while treating women and other Others as less than fully human—after all, we have so far.  But we will never truly thrive as long as we do.  As for me, hope springs eternal, as you can see from my snachismo essay.

Michael, I would hope that your lens through which you view transhumanism has evolved since your youth.

@Athena: then, blind reviews are the way to go.

I have been in many selection boards and, as far as I can remember, the m/f ratio of the candidates that I have recommended has been more or less equal to the m/f ratio of total applicants. If for a given position there are 5 male and 5 female applicants, I think my recommendation would be m or f with equal probability.

I don’t like AA though. That _does_ hurt women. Of course you know that whenever a competent woman is selected for a management post over male applicants, they will say that she only got there because of AA.

In most cases there are more male applicants than female. Perhaps the problem is not “enough qualified female candidates” but “enough female candidates”?

Look, Anissimov, if the way in which you have defined your “movement” means that you have excluded more than half of the human race (i.e., both women and people of color), then maybe, just maybe, you ought to redefine the boundaries that constitute your movement rather than whinging that there are no qualified women who can speak to what you see as the future of humankind. Because otherwise, you look ridiculous.

As for the supposed ageism, I see it emanating from you, not from people like James Hughes. It seems important to point out here that most 50-year-olds have had life experiences, including raising families, that give them important and nuanced views on issues that most 20-year-olds tend to lack. Fifty-year-olds can look back at having been you once, and have figured out the difference between what you know and what you only think you know.

To Jamie: Indeed!

To Giulio:

“I don’t like AA though. That _does_ hurt women. Of course you know that whenever a competent woman is selected for a management post over male applicants, they will say that she only got there because of AA.”

Oh, yes?  Have you heard of legacy admissions or insider hires?  I don’t see the white male recipients of that largesse being embarrassed or hobbled, even though they only got the positions because their dads had the money or pulled strings.

And then we have Peter Thiel, a participant in this Summit (maybe it should be called The Cavity?) who in a recent essay declared the following:

“The 1920s were the last decade in American history during which one could be genuinely optimistic about politics. Since 1920, the vast increase in welfare beneficiaries and the extension of the franchise to women - two constituencies that are notoriously tough for libertarians - have rendered the notion of “capitalist democracy” into an oxymoron.”

If attitudes are still this primitive (and dumb, despite the purported high IQ), we’re not even in the 1980s—we’re in the 1450s and you’re strengthening my point about how far we have to go.

Hard-wired social attitudes hurt women, not affirmative action.  It’s imperfect, of course, and can be abused like any other rule or institution.  But if not for it, most of the world would still resemble Wahhabi Saudi Arabia—as witness the words and actions of people who profess to be “progressive” thinkers of “high intelligence”.

And if we believe the self-righteous bromide of “high qualification criteria”, it’s unclear what a manager of a failing hedge fund has to contribute to a self-proclaimed futurist conference in terms of original thinking. 

I have an idea that will take care of all these issues!  Given the general mindset of most of the participants in the Singularity Pit, why not invite Sarah Palin? She’s as qualified as several of them, her views on the future largely jibe with theirs, she’s all woman (no passé crap like feminism for her) and she’s just become available.  But make sure to get hold of her early, before her fee exceeds Kurzweil’s.

Come to think of it, the SIAI et al should go for Ann Coulter.  She’d fit the agenda and milieu even better.  See?  No shortage of qualified women!

I am qualified to speak on the future, both on technological change and the effects of technological change on the human species.

I am qualified to discuss issues which affect the singularity and are affected by the singularity.

I am qualified to discuss human enhancement, and am a strong voice on human enhancement and the set of sciences and technologies which will be meaningful to the singularity.

I am qualified to argue why the singularity, as proposed by Ray and others, needs a clarifying lens, which would better situate it within the broader context of humanity.

I am qualified to speak on issues resulting from the proposed singularity which form a gap, causing a disciplinary disconnect.  The bridge across that gap is technological aesthetics.

If you would like to see my credentials, I will gladly supply them to this forum.

I was an Advisor to the Singularity University and I doubt I would have been included if my skills were not at least in some way beneficial to the educational track.

And lastly, my work was mentioned and referenced in Kurzweil’s book, _The Singularity is Near_ (the reference shows 3 pages).

Maybe I have a different viewpoint on the singularity than many people, or the Singularity Institute, but so does Aubrey de Grey, and he is here with us in Arizona at a conference and we are speaking in the same track.

Giulio wrote:

“I think we should make a difference between ‘uncritical’ and ‘enthusiastic’; it is not the same thing.”

Enthusiasm does not have to lack critical thinking, but can certainly have blinders to red flags which would warn a person to stop and check his/her/its game plan and provisions.

According to the singularity enthusiasts, future forecasting, as a discipline within which futurists exist, does employ critical thinking.  But this strain of critical thinking is not based in philosophy or necessarily “truth-seeking” as a doctrine.  It is more a social science where the critical thinking is a softer science and given some poetic license because it is tethered to “preferred futures”.  Mind you, that is not one absolute future, but a set of possible futures.  Here the onus is on forecasting as an art, so to speak, and that forecasting includes different dynamics and variables, which shift.  For the singularity, the focus is on a major paradigm shift and it is unsaid how or when it will happen, leaving the door ajar, which it has to be or it would not be a future event.

Thus, the singularity community is composed of advocates who are enthusiastic, but also critical.  But there is a strain of enthusiasts who lack a search for “truth”.  This is a major distinction between futurology and philosophy.  Not everyone wants to see the hard facts; some are more interested in the process of future studies and the outcomes.

How do people dig through the material to find what makes a successful future?  What characteristics cause a project to fail?  What would cause the singularity to fail?  Why is this important to the singularity?  Why not step back and ask what good reasons there are to doubt it?

Often futurists are excited about analyzing the future, while those unexcited about the future may be watching the clock tick, and this further propels the excitement.

But I would not say that the singularity enthusiasts are not critical thinkers; perhaps they just practice a somewhat softer kind of critical thinking.

Wow, now you smart critics of singularitarians have progressed to presenting insights on our physical appearance:

“they’ll look more pathetic at 50 than they already look at 20.”—Athena Andreadis

Keep up the good work! I hope you reach your goal of attracting “true intellectuals” like the above individual to your camp instead of the singularitarian camp.

To Aleksei:

If your comprehension is so poor that you think my statement referred to physical appearance, I suspect that you don’t understand any of the jargon that you parrot, either.

This is an entirely sensible piece about the technological singularity but isn’t it self-defeating to apply common sense to the prospect of an event that, were it to occur, would entirely transform the basis of that sense?

Mike: Giulio, I can think of at least two good reasons to pay attention to gender diversity (and the same reasons apply to cultural and ethnic diversity)...

Of course. This is part of what I mean by gender-blind and color-blind. I (try to) pay attention to all persons with something interesting to say.

Including white males.

You and I are both white males, and I don’t think we are the worst persons in the world. Certainly we are not the best, but also not the worst. And our point of view is as valid as the point of view of others.

It looks like my extensive post was somehow lost, but I will try my best to replicate it:

Mike A’s argument that “we are seeking a female speaker, but it’s a simple fact that there aren’t that many female researchers in the fields of artificial general intelligence or decision theory” is flawed and/or disturbing in several respects:

First, several of the speakers at the conference are not in the field of AGI or decision theory: Gary Wolf, Robin Hanson, William Dickens. I’m sure they are all great speakers, but what about Donna Haraway? Cynthia Breazeal? Martine Rothblatt?  They could have added as much to the dialogue as Wolf, Hanson, or Dickens.

Secondly, why wouldn’t Mike A have asked here at IEET for some recommendations for speakers? I could certainly name several women “with the proper credentials and original ideas” who are excited and positive about the Singularity.  And so, thirdly, the comment that the field is male-dominated and “that isn’t our fault” smacks of Larry Summers’ comment in which he suggested that women’s under-representation in the top levels of academia is due to a “different availability of aptitude at the high end.”

Fourthly and finally, I seem to remember that previous Singularity Summits were more balanced in this respect—so what happened this time? Similarly, I was rather dismayed that the faculty at the Singularity University only had one woman and she was one of Ray’s business partners who stood to make a bundle off the tuition. 

Now having said all that, I’m still planning on attending.  However, this issue needs to be more carefully considered, otherwise the Singularitarians risk losing the support of half the species.

More like eighty percent of the species, considering not only the gender but also the color of the speakers.

Yes, white men’s opinions are (potentially) as valid as anyone else’s—unless theirs is the sole opinion allowed expression while pretending to be inclusive, as is patently the case here.

Also, let me point out that the percentage of SIAI employees that are female (33%) exceeds the percentage of IEET fellows who are female.  Our summer intern program is also led by a female researcher, Anna Salamon.

Athena: I agree, though I am not sure about “is patently the case here”.

More comments (posted also to your blog):

http://cosmi2le.com/index.php/site/women_and_biologists_are_not_infallible/

Michael, I value you as a friend and colleague but let me state that an employee is quite different than a Fellow or invited speaker.

Most of these types of organizations have women who are employees (usually administrators, bookkeepers, etc., while the men are in the positions of authority).

Extropy Institute and Foresight are the only organizations that showed a sense of inclusiveness where women are concerned.  WTA/H+ has been far more inclusive of women as well.

Yet, the Singularity lags behind in regard to valuing transdisciplinary and multidisciplinary rigor, vision of the future, and including the voices of the social, environmental, economic, political, theoretical, and aesthetic domains.

Now, granted, one gentleman is speaking at the Summit. He is talking about aesthetics.

Aesthetics?  Does that relate to social values and philosophy?  Yes indeed.  And the arts of robotics, immersive design, electronics, digital venues, cyborgs, the transhuman and human enhancement?  Yes.
