The police response to protests and riots in Ferguson, Missouri was filled with images that have become commonplace all over the world in the last decade: police dressed in once-futuristic military gear confronting civilian protesters as if they were a rival army. The uniforms themselves put me in mind of nothing so much as the stormtroopers from Star Wars. I guess that would make the rest of us the rebels.
There’s a condition I’ve noted among former hard-core science-fiction fans that, for want of a better word, I’ll call future-deflation. The condition consists of an air of disappointment with, and detachment from, the present, which emerges because the future one dreamed of in one’s youth has failed to materialize. It was a dream of what the 21st century would entail, fostered by science-fiction novels, films and television shows, a dream that has not arrived and seemingly never will, at least within our lifetimes. I think I have a cure for it, or at least a strong preventative.
Human beings seem to have an innate need to predict the future. We’ve read the entrails of animals, thrown bones, tried to use the regularity, or lack of it, in the night sky as a projection of the future and an omen of things to come, along with a thousand other kinds of divination few of us have ever heard of. This need to predict the future makes perfect sense for a creature whose knowledge is biased towards the present and the past. Survival means seeing far enough ahead to avoid dangers, so that an animal that could successfully predict what was around the next corner could avoid being eaten or suffering famine.
I just finished a thrilling little book about the first machine war. The author writes of a war set off by a terrorist attack where the very speed with which machines are put into action, and the near light speed of telecommunications whipping up public opinion to do something now, drives countries into a world war. In his vision whole new theaters of war, amounting to fourth and fifth dimensions, have been invented. Amid a storm of steel huge hulking machines roam across the landscape and literally shred to pieces the human beings in their path. Low-flying avions fill the sky, taking out individual targets or helping calibrate precision attacks from incredible distances. Wireless communications connect soldiers and machines together in a kind of world-net…
Lately, I’ve been enjoying reruns of the relatively new BBC series Sherlock, starring Benedict Cumberbatch, which imagines Arthur Conan Doyle’s famous detective in our 21st century world. The thing I really enjoy about the show is that it’s the first time I can recall that anyone has managed to make Sherlock Holmes funny without at the same time undermining the whole premise of a character whose purely logical style of thinking makes him seem more a robot than a human being.
So I finally got around to reading Max Tegmark’s book Our Mathematical Universe, and while the book answered the question that had led me to read it, namely, how one might reconcile Plato’s idea of eternal mathematical forms with the concept of multiple universes, it also threw up a whole host of new questions. This beautifully written and thought-provoking book made me wonder about the future of science and the scientific method, the limits to human knowledge, and the scientific, philosophical and moral meaning of various ideas of the multiverse.
If you get just old enough, one of the lessons living through history throws you is that dreams take a long time to die. Depending on how you date it, communism took anywhere from 74 to 143 years to pass into the dustbin of history, though some might say it is still kicking. The Ptolemaic model of the universe lasted from 100 AD into the 1600s. Perhaps more dreams than not simply refuse to die; they hang on like ghosts or ghouls, zombies or vampires, or whatever freakish version of the undead suits your fancy. Naming them all would take up more room than I have, and would no doubt start one too many arguments, all of our lists being different. Here I just want to make an argument for the inclusion of one dream on our list of zombies, knowing full well that the dream I’ll declare dead will have its defenders.
Over the spring the Fundamental Questions Institute (FQXi) sponsored an essay contest on a topic that should be dear to this audience’s heart: How Should Humanity Steer the Future? I thought I’d share some of the essays I found most interesting, but there are lots, lots more to check out if you’re into thinking about the future or physics, which I’m guessing you might be.
Prophecies of doom, especially when they’re particularly frightening, have a way of sticking with us in a way more rosy scenarios never seem to do. We seem to be wired this way by evolution, and for good reason. It’s the lions that almost ate you that you need to remember, not the ones you were lucky enough not to see. Our negative bias is something we need to be aware of, and, where it seems called for, lean against. But that doesn’t mean we should dismiss and ignore every Chicken Little as a false prophet, even when his predictions turn out to be wrong not just once but multiple times. For we can never completely discount the prospect that Chicken Little was right after all, and it just took the sky a long, long time to fall.
If you wish to understand the future you need to understand the city, for the human future is an overwhelmingly urban future. The city may have always been synonymous with civilization, but the rise of urban humanity has occurred almost entirely since the onset of the industrial revolution. In 1800 a mere 3 percent of humanity lived in cities of over one million people. By 2050, 75 percent of humanity will be urbanized. India alone might have six cities with populations of over 10 million.
The bold gamble of the Brazilian neuroscientist Miguel Nicolelis, to have a paralyzed person kick a soccer ball during the World Cup opening ceremony using a brain-controlled exoskeleton, has paid off. Yet, however important the research of The Walk Again Project is to those suffering paralysis, the less-than-two-second display of the technology did very little to live up to the pre-game media hype.
For anyone thinking about the future relationship between nature, man and machines, I’d like to make the case for the inclusion of an insightful piece of fiction in the canon. All of us have heard of H.G. Wells, Isaac Asimov or Arthur C. Clarke. And many, though perhaps fewer, of us have likely heard of fiction authors from the other side of the nature/technology fence, writers like Mary Shelley, Ursula Le Guin, or nowadays, Paolo Bacigalupi. But almost none of us have heard of Samuel Butler, or better yet, read his most famous novel Erewhon (pronounced with three short syllables: E-re-whon).
When I was around nine years old I got a robot for Christmas. I still remember calling my best friend Eric to let him know I’d hit pay dirt. My “Verbot” was to be my own personal R2D2. As was clear from the picture on the box, which I again remember as clearly as if it were yesterday, Verbot would bring me drinks and snacks from the kitchen on command: no more pestering my sisters, who responded with their damned claims of autonomy! Verbot would learn to recognize my voice and might help me with the math homework I hated.
A few weeks back I wrote a post on how the recent discovery of primordial gravitational waves provided evidence for inflationary models of the Big Bang. These are cosmological models that imply some version of the multiverse, essentially the idea that ours is just one of a series of universes, a tiny bubble, or region, of a much, much larger universe where perhaps even the laws of physics or the rationality of mathematics differ from one region to another.
Human beings are weird. At least, that is, when comparing ourselves to our animal cousins. We’re weird in terms of our use of language, our creation and use of symbolic art and mathematics, our extensive use of tools. We’re also weird in terms of our morality, and engage in strange behaviors vis-à-vis one another that are almost impossible to find throughout the rest of the animal world.
Has human evolution and progress been propelled by war? The question is not an easy one to ask, not least because war is not merely one of the worst but arguably the worst thing human beings inflict on one another, comprising murder, collective theft, and, almost everywhere outside the professional militaries of Western powers (and there only quite recently), mass and sometimes systematic rape.
For almost as long as we have been able to speak, human beings have been arguing over what, if anything, makes us different from other living creatures. Mark Pagel’s recent book Wired for Culture: The Origins of the Human Social Mind is just the latest iteration of this millennia-old debate, and, as always, the answers he comes up with have implications for our relationship with our fellow animals and, above all, our relationship with one another, even if Pagel himself doesn’t draw many of them.
Big news this year for those interested in big questions, or potentially big news, as long as the findings hold up. Scientists at the Harvard-Smithsonian Center for Astrophysics may have come as close as we ever have to seeing the beginning of time in our universe. They may have breached our former boundary in peering backwards into the depths of time, beyond the Cosmic Microwave Background, the light echo of the Big Bang, taking us within intimate distance of the very breath of creation.
Everyone alive today owes their life to a man most of us have never heard of, and whom I didn’t even know existed until last week. On September 26, 1983, just past midnight, Soviet lieutenant colonel Stanislav Petrov was alerted by his satellite early-warning system that an attack by American ICBMs was underway. Normal protocol should have resulted in Petrov giving the order to fire Soviet missiles at the US in response.
"I believe that we have turned a corner: we have finally attained Peak Indifference to Surveillance. We have reached the moment after which the number of people who give a damn about their privacy will only increase. The number of people who are so unaware of their privilege or blind to their risk that they think “nothing to hide/nothing to fear” is a viable way to run a civilization will only decline from here on in." - Cory Doctorow
Facebook turns ten this year; yes, only ten, which means that if the company were a person she wouldn’t even remember when Friends was a hit TV show, a reference meant to jolt anyone over 24 with the recognition of just how new the whole transparency culture, for which Facebook is the poster child, really is. Nothing so young can be considered a permanent addition to the human condition; it may be mere epiphenomenon, like the fads and fashions we foolishly embraced, the mullets and tie-dyed jeans we have now left behind, lost in the haze of the stupidities and mistakes in judgment of our youth.
My daughters and I just finished Carlo Collodi’s 1883 classic Pinocchio, our copy beautifully illustrated by Robert Ingpen. I assume most adults, when they picture the story, have the 1940 Disney movie in mind and associate the name with noses growing from lies and Jiminy Cricket. The Disney movie is dark enough as films for children go, but the book is even darker, with Pinocchio killing his cricket conscience in the first few pages. For our poor little marionette it’s all downhill from there.
For most of our days and for most of the time we live in the world of Daniel Kahneman’s experiencing self. What we pay attention to is whatever is right in front of us, which can range from the pain of hunger to the boredom of cubicle walls. Nature has probably wired us this way, the stone age hunter-gatherer still in our heads, where the failure to focus on the task at hand came with the risk of death. A good deal of modern society, and especially contemporary technology such as smartphones, leverages this presentness and leaves us trapped in its muck, a reality Douglas Rushkoff brilliantly lays out in his Present Shock.
The last decade or so has seen a renaissance in the idea that human beings are something far short of rational creatures. Here are just a few prominent examples: there was Nassim Taleb with his The Black Swan, published before the onset of the financial crisis, which presented Wall Street traders caught in the grip of optimistic narrative fallacies that led them to “dance” their way right over a cliff. There was the work of Philip Tetlock, which showed that the advice of most so-called experts was about as accurate as chimps throwing darts. There were explorations into how hard-wired our ideological biases are, such as Jonathan Haidt’s work in his The Righteous Mind.
Last week the prime minister of Ukraine, Mykola Azarov, resigned under pressure from a series of intense riots that had spread from Kiev to the rest of the country. Photographs of the riots in The Atlantic blew my mind, like something out of a dystopian steampunk flick. Many of the rioters were dressed in gas masks that looked as if they had been salvaged from World War I. As weapons they wielded homemade swords, Molotov cocktails, and fireworks. To protect their heads some wore kitchen pots and spaghetti strainers.
The problem I see with Nicolelis’ view of the future of neuroscience, which I discussed last time, is not that I find it unlikely that a good deal of his optimistic predictions will someday come to pass; it is that he spends no time at all talking about the darker potential of such technology.
I first came across Miguel Nicolelis in an article for the MIT Technology Review entitled The Brain is not computable: A leading neuroscientist says Kurzweil’s Singularity isn’t going to happen. Instead, humans will assimilate machines. That got my attention. Nicolelis, if you haven’t already heard of him, is one of the world’s top researchers in building brain-computer interfaces. He is the mind behind the project to have a paraplegic using a brain-controlled exoskeleton make the first kick of the 2014 World Cup, an event that takes place in Nicolelis’ native Brazil.
Lately, there have been weird mumblings about secession coming from an unexpected corner. We’ve come to expect hangers-on to the fallen Confederate States of America, or Texans hankering after their lost independent republic, but Silicon Valley? Really? The idea, at least at first blush, seems absurd.