Facebook turns ten this year. Yes, only ten, which means that if the company were a person she wouldn’t even remember when Friends was a hit TV show, a reference meant to jolt anyone over 24 with the recognition of just how new the whole transparency culture, of which Facebook is the poster child, really is. Nothing so young can be considered a permanent addition to the human condition; it may prove a mere epiphenomenon, like the fads and fashions we foolishly embraced and have now left behind, the mullets and tie-dyed jeans lost in the haze of the stupidities and mistakes in judgment of our youth.
My daughters and I just finished Carlo Collodi’s 1883 classic Pinocchio, our copy beautifully illustrated by Robert Ingpen. I assume most adults, when they picture the story, have the 1940 Disney movie in mind and associate the name with noses growing from lies and Jiminy Cricket. The Disney movie is dark enough as films for children go, but the book is even darker, with Pinocchio killing his cricket conscience in the first few pages. For our poor little marionette it’s all downhill from there.
For most of our days and most of the time we live in the world of Daniel Kahneman’s experiencing self. What we pay attention to is whatever is right in front of us, which can range from the pain of hunger to the boredom of cubicle walls. Nature has probably wired us this way, the Stone Age hunter-gatherer still in our heads, for whom failure to focus on the task at hand came with the risk of death. A good deal of modern society, and especially contemporary technology such as smartphones, leverages this presentness and leaves us trapped in its muck, a reality Douglas Rushkoff brilliantly lays out in his Present Shock.
The last decade or so has seen a renaissance in the idea that human beings are something far short of rational creatures. Here are just a few prominent examples: there was Nassim Taleb with his The Black Swan, published before the onset of the financial crisis, which presented Wall Street traders caught in the grip of optimistic narrative fallacies that led them to “dance” their way right over a cliff. There was the work of Philip Tetlock, which showed that the advice of most so-called experts was about as accurate as chimps throwing darts. And there were explorations into how hard-wired our ideological biases are, such as Jonathan Haidt’s in his The Righteous Mind.
Last week the prime minister of Ukraine, Mykola Azarov, resigned under pressure from a series of intense riots that had spread from Kiev to the rest of the country. Photographs of the riots in The Atlantic blew my mind, like something out of a dystopian steampunk flick. Many of the rioters were dressed in gas masks that looked as if they had been salvaged from World War I. As weapons they wielded homemade swords, Molotov cocktails, and fireworks. To protect their heads some wore kitchen pots and spaghetti strainers.
The problem I see with Nicolelis’ view of the future of neuroscience, which I discussed last time, is not that I find it unlikely that a good many of his optimistic predictions will someday come to pass; it is that he spends no time at all talking about the darker potential of such technology.
I first came across Miguel Nicolelis in an article for the MIT Technology Review entitled The Brain Is Not Computable: A leading neuroscientist says Kurzweil’s Singularity isn’t going to happen. Instead, humans will assimilate machines. That got my attention. Nicolelis, if you haven’t already heard of him, is one of the world’s top researchers in building brain-computer interfaces. He is the mind behind the project to have a paraplegic wearing a brain-controlled exoskeleton make the first kick of the 2014 World Cup, an event that takes place in Nicolelis’ native Brazil.
Lately, there have been weird mumblings about secession coming from an unexpected corner. We’ve come to expect hangers-on to the fallen Confederate States of America, or Texans hankering after their lost independent Republic, but Silicon Valley? Really? The idea, at least at first blush, seems absurd.
Religions, because they in part contain mankind’s longest reflections on human nature, tend to capture this tragic condition of ultimately destructive competition between sentient beings with differing desires and wills, a condition which we may find is not only possessed by our fellow animals, but may be part of our legacy to any sentient machines that are our creations as well. Original sin indeed!
For people in cold climes, winter, with its short days and hibernation-inducing frigidity, is a season to let one’s pessimistic imagination roam. It may be overly deterministic, but I often wonder whether those who live in climates that do not vary with the seasons, where it is almost always warm and sunny or always cold and grim, experience the full spectrum of human sentiments less often over the course of a year, and so end up either too utopian for reality to justify or too dystopian for those lucky enough to be here and have a world to complain about in the first place.
Back in what now seems a millennium ago, when I was a senior in high school and a freshman in college, I used to go to yard sales. I wasn’t looking for knickknacks or used appliances, but for cheap music and, mostly, for books. If memory serves, you could usually get a paperback for 50 cents, four of them for a dollar, and a hardcover for a buck.
Whatever little I retain from my Catholic upbringing, the short days of winter and the Christmas season always seem to turn my thoughts to spiritual matters and the search for deeper meanings. It may be a cliche, but if you let it hit you, the winter and the coming of the new year can’t help but remind you of endings, and sometimes even of the ultimate ending of death. After all, the whole world seems dead now, frozen like some morgue-corpse, although this one, if past is prologue, really will rise from the dead with the coming of spring.
However interesting a work it is, Eric Schmidt and Jared Cohen’s The New Digital Age is one of those books where, if you come to it as a blank slate, you’ll walk away with a very distorted chalk drawing of what the world actually looks like. Above all, you’ll walk away with the idea that intrusive and questionable surveillance was something those other guys did, the bad guys, not the American government, or US corporations, and certainly not Google, where Schmidt sits as executive chairman.
If we look back to the early days when the Internet was first exploding into public consciousness, in the 1980s, and even more so in the boom years of the ’90s, what we often find is a kind of utopian sentiment around this new form of “space”. It wasn’t only that a whole new plane of human interaction seemed to be unfolding into existence almost overnight; it was that “cyberspace” seemed poised to swallow the real world, a prospect some viewed with hopeful anticipation and others with dread.
There has been some ink spilt lately at the IEET over a new movement that goes by the Tolkienesque name, I kid you not, of the Dark Enlightenment, its adherents also called neo-reactionaries. Khannea Suntzu has looked at the movement from the standpoint of American collapse, and David Brin within the context of a rising oligarchic neo-feudalism.
If someone on the street stopped and asked you what you thought was the meaning behind Oscar Wilde’s novel The Picture of Dorian Gray, you’d probably blurt out, like the rest of us, that it had something to do with a frightening portrait and the dangers of pursuing immortality, and, if you remembered vague details about Wilde’s life, you might bring up the fact that it must have gotten him into a lot of trouble on account of its homoeroticism.
Last week was my oldest daughter’s 5th birthday, in my mind the next “big” birthday after the always special year one. I decided on a geology-themed day, one component of which was a trip, with her and my younger daughter, who’s 3, to a local limestone cave that offers walk-through tours.
When our most precious and hard-fought successes give rise to yet more challenges, life is revealing its Sisyphean character. We work as hard as we can to roll a rock up a hill only to have it crush us on the way down. The stones that threaten us this time are two of our global civilization’s greatest successes: the fact that children born today are very likely to live into old age, and the fact that we have stretched out old age itself so that many, many more people are living into ages at which, in the past, the vast majority of their peers would be dead. These two demographic revolutions, when combined, form the basis of what I am calling the Longevity Crisis. Let’s take infant mortality first.
Percy’s epic poem Prometheus Unbound is seldom read today, while his wife’s novel Frankenstein; or, The Modern Prometheus has become so well known that her monster graces the boxes of children’s cereal and became the fodder for one of the funniest movies of the 20th century.
It is interesting at least to wonder what the scientific revolution would have looked like had it occurred somewhere other than in the West. What latent goals and assumptions might the systematic and empirical study of nature have had if it had arisen somewhere in what were at the time more technologically and scientifically advanced civilizations: in the lands of Islam, in Confucian-Daoist-Buddhist China, in the Hindu lands of southern India?
Rebecca Rosen over at The Atlantic has a fascinating recent article about how the MIT Media Lab is using science fiction to help technologists think through the process of design. Not merely to think up new gadgets, but to think iteratively and consciously about the technologies they are creating, to try to prevent negative implications before a technology is up and running. It is an idea that gets us beyond the endless dichotomy between those who call for relinquishment and those urging, risks be damned, full steam ahead.
What especially distinguishes human beings from other animals is the degree to which they seek out and invent ways to leverage the basics of their biology to reach ever more complex levels of thought and action. Early human beings leveraged their fragile and limited bodies with tools, including fire, leveraged their own natural psychology using naturally occurring drugs and religious rituals, and used music to obtain a deeper emotional connection with one another and the world.
There have been glowing reviews at the IEET of Zoltan Istvan’s The Transhumanist Wager. This will not be one of those. As I will argue, if you care about core transhumanist concerns, such as research into pushing out the limits of human mortality, little could be worse than the publication of Istvan’s novel. To put it sharply in terms of his so-called First Law of Transhumanism, “A transhumanist must safeguard his own existence above all else”: Istvan, by creating a work that manages to disparage and threaten nearly every human community on earth, has likely shortened the length of your life.
What the current crisis in and over Syria makes painfully clear is the extent to which the international system, the way global affairs have been organized since at least the 19th century, when it became possible to view the various human communities scattered across the landscape of the earth as part of one world, is failing. The system is failing whatever the outcome of current debates in the UN and US over military strikes against Syria.
I finally had the chance to see Elysium this week. As films go, the picture is certainly visually gripping, and the fight scenes are awesome, if you are into that sort of thing. But as a film about ideas it left me scratching my head, and I could only get a clue as to the meaning Neill Blomkamp, Elysium’s screenwriter and director, intended by looking elsewhere.
The IEET is pleased to appoint contributor Rick Searle as an Affiliate Scholar. Rick is a writer and educator living in the very non-technological Amish country of central Pennsylvania along with his two young daughters.
If we take what amounts to the very long view of the matter, it’s quite easy to see how both the tradition of human rights and transhumanism emerge from what are in effect two different Christian emphases on the life of Christ. Of course, this is to look at things from the perspective of the West alone. One can easily find harbingers of both human rights and transhumanism outside of Christianity and the West, in non-Western societies and religious and philosophical traditions such as Islam, Hinduism, Buddhism, or Taoism, among others.
How is this for a bold statement: the ultimate morality or immorality of transhumanism rests with the position it will take on the question of human rights, and more specifically with its adoption or denial of the principles of one document little discussed outside the circle of international lawyers and human rights activists: the Universal Declaration of Human Rights of 1948.
Something I think we are prone to forget in this age of chattering heads and two-bit pundits is that ideas have consequences. Anyone engaged in public discourse has some responsibility to wrestle with the ethical implications of their thought, and this is as much the case for the Rush Limbaughs of the world as it is for that disappearing class of thinkers who once proudly went by the name of intellectuals. In a way, artists have always had an easier time than those engaged in more discursive lines of thought.
Human beings have a very limited attention span, a fact amplified a thousandfold by modern media. It seems the “news” can consist of only a handful of widely followed stories at a time, and only one truly complex narrative. This is a shame, because the recent breaking of one substantial news story was followed by the breaking of another, which knocked the first out of the field of our electronic tunnel vision. Without some narrative connecting the two, only one can really hold our attention at a time. Neither of these stories has to do with Kate Middleton and the birth of Prince George.