So I finally got around to reading Max Tegmark’s book Our Mathematical Universe, and while the book answered the question that had led me to read it, namely, how one might reconcile Plato’s idea of eternal mathematical forms with the concept of multiple universes, it also threw up a whole host of new questions. This beautifully written and thought-provoking book made me wonder about the future of science and the scientific method, the limits to human knowledge, and the scientific, philosophical, and moral meaning of various ideas of the multiverse.
If you get just old enough, one of the lessons living through history throws you is that dreams take a long time to die. Depending on how you date it, communism took anywhere from 74 to 143 years to pass into the dustbin of history, though some might say it is still kicking. The Ptolemaic model of the universe lasted from 100 AD into the 1600s. Perhaps even more dreams than not simply refuse to die; they hang on like ghosts or ghouls, zombies or vampires, or whatever freakish version of the undead suits your fancy. Naming them would take up more room than I have here, and would no doubt start one too many arguments, all of our lists being different. Here, I just want to make an argument for the inclusion of one dream on our list of zombies, knowing full well the dream I’ll declare dead will have its defenders.
Over the spring the Foundational Questions Institute (FQXi) sponsored an essay contest whose topic should be dear to this audience’s heart: How Should Humanity Steer the Future? I thought I’d share some of the essays I found most interesting, but there are lots, lots more to check out if you’re into thinking about the future or physics, which I am guessing you might be.
Prophecies of doom, especially when they’re particularly frightening, have a way of sticking with us in a way more rosy scenarios never seem to do. We seem to be wired this way by evolution, and for good reason. It’s the lions that almost ate you that you need to remember, not the ones you were lucky enough not to see. Our negativity bias is something we need to be aware of, and where it seems called for, lean against, but that doesn’t mean we should dismiss and ignore every Chicken Little as a false prophet, even when his predictions turn out to be wrong, not just once but multiple times. For we can never completely discount the prospect that Chicken Little was right after all, and it just took the sky a long, long time to fall.
If you wish to understand the future you need to understand the city, for the human future is an overwhelmingly urban future. The city may have always been synonymous with civilization, but the rise of urban humanity has occurred almost entirely after the onset of the industrial revolution. In 1800 a mere 3 percent of humanity lived in cities of over one million people. By 2050, 75 percent of humanity will be urbanized. India alone might have 6 cities with a population of over 10 million.
The bold gamble of the Brazilian neuroscientist Miguel Nicolelis to have a paralyzed person, using an exoskeleton controlled by the brain, kick a soccer ball during the World Cup opening ceremony has paid off. Yet, however important the research of The Walk Again Project is to those suffering paralysis, the less-than-two-second display of the technology did very little to live up to the pre-game media hype.
For anyone thinking about the future relationship between nature, man, and machines, I’d like to make the case for the inclusion of an insightful piece of fiction in the canon. All of us have heard of H.G. Wells, Isaac Asimov, or Arthur C. Clarke. And many, though perhaps fewer, of us have likely heard of fiction authors from the other side of the nature/technology fence, writers like Mary Shelley or Ursula K. Le Guin, or nowadays, Paolo Bacigalupi, but almost none of us have heard of Samuel Butler, or better, read his most famous novel Erewhon (pronounced with three short syllables: E-re-whon).
When I was around nine years old I got a robot for Christmas. I still remember calling my best friend Eric to let him know I’d hit pay dirt. My “Verbot” was to be my own personal R2D2. As was clear from the picture on the box, which I again remember as clearly as if it were yesterday, Verbot would bring me drinks and snacks from the kitchen on command: no more pestering my sisters, who responded with their damned claims of autonomy! Verbot would learn to recognize my voice and might help me with the math homework I hated.
A few weeks back I wrote a post on how the recent discovery of primordial gravitational waves provided evidence for inflationary models of the Big Bang. These are cosmological models that imply some version of the multiverse, essentially the idea that ours is just one of a series of universes, a tiny bubble, or region, of a much, much larger universe where perhaps even the laws of physics or the rationality of mathematics differ from one region to another.
Human beings are weird. At least, that is, when comparing ourselves to our animal cousins. We’re weird in terms of our use of language, our creation and use of symbolic art and mathematics, our extensive use of tools. We’re also weird in terms of our morality, and engage in strange behaviors vis-à-vis one another that are almost impossible to find throughout the rest of the animal world.
Has human evolution and progress been propelled by war? The question is not an easy one to ask, not least because war is not merely one of the worst but arguably the worst thing human beings inflict on one another, comprising murder, collective theft, and, almost everywhere outside the professional militaries of Western powers (and there only quite recently), mass and sometimes systematic rape.
It seems that for almost as long as we have been able to speak, human beings have been arguing over what, if anything, makes us different from other living creatures. Mark Pagel’s recent book Wired for Culture: The Origins of the Human Social Mind is just the latest iteration of this millennia-old debate, and as has always been the case, the answers he comes up with have implications for our relationship with our fellow animals, and, above all, our relationship with one another, even if Pagel doesn’t draw many such implications.
Big news this year for those interested in big questions, or potentially big news, as long as the findings hold up. Scientists at the Harvard-Smithsonian Center for Astrophysics may have come as close as we ever have to seeing the beginning of time in our universe. They may have breached our former boundary in peering backwards into the depths of time, beyond the Cosmic Microwave Background, the light echo of the Big Bang, taking us within an intimate distance of the very breath of creation.
Everyone alive today owes their life to a man most of us have never heard of, and whom I didn’t even know existed until last week. On September 26, 1983, just past midnight, Soviet lieutenant colonel Stanislav Petrov was alerted by his satellite early warning system that an attack by American ICBMs was underway. Normal protocol should have resulted in Petrov giving the order to fire Soviet missiles at the US in response.
"I believe that we have turned a corner: we have finally attained Peak Indifference to Surveillance. We have reached the moment after which the number of people who give a damn about their privacy will only increase. The number of people who are so unaware of their privilege or blind to their risk that they think “nothing to hide/nothing to fear” is a viable way to run a civilization will only decline from here on in." - Cory Doctorow
Facebook turns ten this year, yes, only ten, which means that if the company were a person she wouldn’t even remember when Friends was a hit TV show, a reference meant to jolt anyone over 24 with the recognition of just how new the whole transparency culture, for which Facebook is the poster child, really is. Nothing so young can be considered a permanent addition to the human condition; it may be mere epiphenomenon, like the fads and fashions we foolishly embraced, the mullets and tie-dyed jeans we have now left behind, lost in the haze of the stupidities and mistakes in judgement of our youth.
My daughters and I just finished Carlo Collodi’s 1883 classic Pinocchio, our copy beautifully illustrated by Robert Ingpen. I assume most adults, when they picture the story, have the 1940 Disney movie in mind and associate the name with noses growing from lies and Jiminy Cricket. The Disney movie is dark enough as films for children go, but the book is even darker, with Pinocchio killing his cricket conscience in the first few pages. For our poor little marionette it’s all downhill from there.
For most of our days and for most of the time we live in the world of Daniel Kahneman’s experiencing self. What we pay attention to is whatever is right in front of us, which can range from the pain of hunger to the boredom of cubicle walls. Nature has probably wired us this way, the stone age hunter and gatherer still in our heads, where the failure to focus on the task at hand came with the risk of death. A good deal of modern society, and especially contemporary technology such as smartphones, leverages this presentness and leaves us trapped in its muck, a reality Douglas Rushkoff brilliantly lays out in his Present Shock.
The last decade or so has seen a renaissance in the idea that human beings are something far short of rational creatures. Here are just a few prominent examples: there was Nassim Taleb with his The Black Swan, published before the onset of the financial crisis, which presented Wall Street traders caught in the grip of optimistic narrative fallacies that led them to “dance” their way right over a cliff. There was the work of Philip Tetlock, which showed that the advice of most so-called experts was about as accurate as chimps throwing darts. There were explorations into how hard-wired our ideological biases are, with work such as that of Jonathan Haidt in his The Righteous Mind.
Last week the prime minister of Ukraine, Mykola Azarov, resigned under pressure from a series of intense riots that had spread from Kiev to the rest of the country. Photographs from the riots in The Atlantic blew my mind, like something out of a dystopian steampunk flick. Many of the rioters were dressed in gas masks that looked as if they had been salvaged from World War I. As weapons they wielded homemade swords, Molotov cocktails, and fireworks. To protect their heads some wore kitchen pots and spaghetti strainers.
The problem I see with Nicolelis’ view of the future of neuroscience, which I discussed last time, is not that I find it unlikely that many of his optimistic predictions will someday come to pass; it is that he spends no time at all talking about the darker potential of such technology.
I first came across Miguel Nicolelis in an article for the MIT Technology Review entitled The Brain Is Not Computable: A leading neuroscientist says Kurzweil’s Singularity isn’t going to happen. Instead, humans will assimilate machines. That got my attention. Nicolelis, if you haven’t already heard of him, is one of the world’s top researchers in building brain-computer interfaces. He is the mind behind the project to have a paraplegic using a brain-controlled exoskeleton make the first kick of the 2014 World Cup, an event that takes place in Nicolelis’ native Brazil.
Lately, there have been weird mumblings about secession coming from an unexpected corner. We’ve come to expect that there are hangers-on to the fallen Confederate States of America, or Texans hankering after their lost independent Republic, but Silicon Valley? Really? The idea, at least at first blush, seems absurd.
Religions, because they in part contain Mankind’s longest reflections on human nature, tend to capture this tragic condition of ultimately destructive competition between sentient beings with differing desires and wills, a condition which we may find is not only shared by our fellow animals, but may be part of our legacy to any sentient machines that are our creations as well. Original sin indeed!
For people in cold climes, winter, with its short days and hibernation-inducing frigidity, is a season to let one’s pessimistic imagination roam. It may be overly deterministic, but I often wonder whether those who live in climates that do not vary with the seasons, where it is almost always warm and sunny, or always cold and grim, experience the full spectrum of human sentiments less often over the course of a year, and end up being either too utopian for reality to justify, or too dystopian for those lucky enough to be here and have a world to complain about in the first place.
Back what now itself seems a millennium ago, when I was a senior in high school and a freshman in college, I used to go to yard sales. I wasn’t looking for knickknacks or used appliances, but for cheap music and mostly for books. If memory serves, you could usually get a paperback for 50 cents, four of them for a dollar, and a hardcover for a buck.
Whatever little I retain from my Catholic upbringing, the short days of the winter and the Christmas season always seem to turn my thoughts to spiritual matters and the search for deeper meanings. It may be a cliche, but if you let it hit you, the winter and coming of the new year can’t help but remind you of endings, and sometimes even of the ultimate ending of death. After all, the whole world seems dead now, frozen like some morgue-corpse, although this one, if past is prologue, really will rise from the dead with the coming of spring.
However interesting a work it is, Eric Schmidt and Jared Cohen’s The New Digital Age is one of those books where if you come to it as a blank slate you’ll walk away from it with a very distorted chalk drawing of what the world actually looks like. Above all, you’ll walk away with the idea that intrusive and questionable surveillance was something those other guys did, the bad guys, not the American government, or US corporations, and certainly not Google where Schmidt sits as executive chairman.