“Blue Gold.” Water is becoming dangerously rare and valuable in drought-stricken areas around the globe, including my home in California.
Today, citizens of developed nations each wastefully splash away hundreds of gallons per day. But what if fresh water continues to dwindle? Suppose humans were rationed a meager allotment: 10, or 5, or even 2 gallons per day?
Authors Peter H. Diamandis and Steven Kotler have created just about the perfect handbook for envisioning a technically advanced, democratic, and thriving society. Written in 2012, this book is still an important read for anyone interested in a technological future in which humanity finally rises above the mire it has been stuck in for millennia.
Skeptics of renewables sometimes cite data from the EIA (the U.S. Department of Energy’s Energy Information Administration) or from the IEA (the OECD’s International Energy Agency). The IEA has a long history of underestimating solar and wind, one that I think is finally starting to be recognized.
Excitement is building for the New Horizons mission and its swift flyby of Pluto on July 14. What a terrific way to celebrate Bastille Day! Watch this excellent video - Fast and Light to Pluto - about New Horizons, created by the New York Times.
One integral part of the design we in the Earth Organisation for Sustainability envision is that humanity must use information technology to establish a better overview of the resource flows we draw on, as well as of the planet’s own capacity. More on this can be read in the article “The Three Criteria” on this blog.
We are living in a world with many challenges and even existential risks. Yet only a relatively small number of people seem concerned about this, while others, apparently oblivious, behave in ways that aggravate these challenges - for example, through an environmentally unfriendly lifestyle - in developing as well as developed countries. Very often the reason for this behaviour is not a lack of education, but the wrong education. In many places children are neither properly educated in the sciences nor trained in rational thinking. Instead, in many parts of the world, the curriculum is tied to unscientific ideologies, which pupils indoctrinated in early childhood are prone to believe for the rest of their lives.
At some point technology will allow us to live forever. With billionaires spending millions on research and huge corporations such as Google getting in on the act, we are likely to see rapid advances in life expectancy very soon - with the ultimate aim of radical life extension. All diseases will be cured, and the cellular aging that leads to the deterioration of body and mind will be slowed and eventually reversed, so that everybody can choose how long they want to live.
To create any form of AI, we must copy from biology. The argument goes as follows. A brain is a biological product. So, then, must be its products: perception, insight, inference, logic, mathematics, and so on. By creating AI we inevitably tap into something that biology has already invented on its own. It thus follows that the more we want an AI system to resemble a human - for example, to earn a better grade on the Turing test - the more we need to copy the biology.
I remember once, while on a trip to Arizona, asking a long-time resident of Phoenix why anyone would want to live in such a godforsaken place. I wasn’t at all fooled by the green lawns and the swimming pools; I knew that we were standing in the middle of a desert, over the bones of the Hohokam Indians, whose civilization had shriveled up under the brutality of the Sonoran sun. The person I was speaking to had a quick retort to my east-coast skepticism.
Algorithms increasingly guide our daily lives: Google’s ranking algorithm largely decides which pages we visit, and therefore which information we access; Amazon’s algorithm influences which books we read; dating algorithms shape our sexual lives and possibly our marriages; the smartphone’s navigation algorithm decides which streets we take; Yelp’s algorithm decides where we eat (and it is a simple average!).
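The “simple average” aside is worth dwelling on: an algorithm that steers where thousands of people eat can be sketched in a few lines. This is an illustrative sketch only - the review scores are made up, and the half-star rounding is an assumption, not Yelp’s published method.

```python
# Illustrative sketch: a star rating as a plain arithmetic mean.
# Scores and rounding rule are assumptions for the example, not Yelp's actual code.

def star_rating(reviews: list[float]) -> float:
    """Average the individual review scores, rounded to the nearest half star."""
    mean = sum(reviews) / len(reviews)
    return round(mean * 2) / 2

print(star_rating([5, 4, 4, 3, 5, 2]))  # → 4.0
```

That a formula this trivial can decide where we eat is precisely the point.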
Four years ago I wrote a trio of essays that generated a barrage of hate mail. The feedback I received wasn’t 100% venomous, but it was more than 50% negative, with one essay getting a thumbs-down 80% of the time.
Last Thursday was launch day for Pope Francis’s historic anticapitalist revolution: a multitargeted global revolution against out-of-control free-market capitalism driven by consumerism, against the destruction of the planet’s environment, climate, and natural resources for personal profit, and against the greediest science deniers.
Stanford professor Paul Ehrlich has been studying extinction for decades; he published Extinction: The Causes and Consequences of the Disappearance of Species in 1981. Since then, Ehrlich has seen numbers indicating that the rate of extinction - of vertebrates, including mammals - is increasing.
There’s a graph making the rounds lately that shows the comparative EROIs of different electricity production methods. (EROI is Energy Return On Investment: how much energy we get back for each unit of energy we spend. For solar, this means: how much more energy does a panel generate over its lifetime than was used to manufacture it?)
This graph is being used to claim that solar and wind can’t support an industrialized society like ours.
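The EROI definition above is a simple ratio, which a short sketch makes concrete. The panel figures below are hypothetical round numbers chosen only to illustrate the arithmetic, not measured data for any real technology.

```python
# Minimal sketch of the EROI ratio described above.
# All numbers are illustrative assumptions, not measured data.

def eroi(energy_returned_kwh: float, energy_invested_kwh: float) -> float:
    """Energy Return On Investment: units of energy out per unit of energy in."""
    return energy_returned_kwh / energy_invested_kwh

# Hypothetical solar panel: 300 kWh/year output over a 25-year lifetime,
# with roughly 2 years' worth of output embodied in its manufacture.
lifetime_output = 300 * 25   # kWh generated over the panel's life
embodied_energy = 300 * 2    # kWh spent making and installing it
print(eroi(lifetime_output, embodied_energy))  # → 12.5
```

An EROI of 1 would mean breaking even; the debate around the graph is about where different sources fall above that line.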
In the wake of news that scientists in China modified the DNA of human embryos, a number of scientists and bioethicists have called for a global moratorium on experiments that could alter the human germline. The White House has come out in support of such a ban — for now.
Anyone who has the scientific tenacity to question “common truths” and reach a valid conclusion outside the confines of popular opinion is destined to be branded as someone working in the pocket of some agency. Conspiracy theories run amok throughout society, painting any large corporation as intrinsically “evil”. One corporation stands out most of all: Monsanto!
A worry that is not yet on the scientific or cultural agenda is neural data privacy rights. Not even biometric data privacy rights are in view yet, which is surprising given the personal data streams amassing from quantified-self tracking activities. There are several reasons why neural data privacy rights could become an important concern.
As William Gibson always reminds us, the real role of science fiction isn’t so much to predict the future as to astound us with the future’s possible weirdness. Science fiction writers almost never get the core or essential features of this future weirdness right, and when they do, according to Gibson, it’s almost entirely by accident. Nevertheless, someone writing about the future can sometimes, even deliberately, play the role of an Old Testament prophet: seeing some danger to which the rest of us are oblivious and guessing at traps into which we later fall. (Though let’s not forget about predictions of opportunity.)
Frank Herbert certainly didn’t intend Dune to predict the future, but he was trying to give us a warning.
Sometimes, if you want to see something in the present clearly, it’s best to go back to its origins. This is especially true when dealing with some monumental historical change - a phase transition from one stage to the next. The reason I think this is helpful is that those lucky enough to live at the beginning of such events have no historical or cultural baggage to obscure their forward view. When you live in the middle, or at the end, of an era, you find yourself surrounded, sometimes suffocated, by all the good and bad that has come as a result. As a consequence, understanding the true contours of your surroundings or your ultimate destination is almost impossible; your nose is stuck to the glass.
The question is: are we ourselves at the beginning of such an era, in the middle, or at the end? How would we even know?
The final frontier of digital technology is integrating into your own brain. DARPA wants to go there. Scientists want to go there. Entrepreneurs want to go there. And increasingly, it looks like it’s possible.
You’ve probably read bits and pieces about brain implants and prostheses. Let me give you the big picture.
Read on if you don’t mind some relatively minor spoilers about Avengers: Age of Ultron. I won’t be giving away any big surprises or major plot points, but neither can I promise to reveal nothing new about the content of the movie. So, the choice is yours!
For the first time in U.S. history, a supreme court has granted a writ of habeas corpus on behalf of two lab chimpanzees, effectively recognizing them as legal persons. While the future of the chimps has not yet been decided, it’s a huge step forward in establishing personhood status for highly sapient animals.
Recently the journal Nature published a paper arguing that the Anthropocene - the proposed geological era in which the collective actions of the human species started to trump other natural processes in terms of their impact - began in the year 1610 AD. If that year leaves you, as it did me, scratching your head and wondering what you missed while you dozed off in 10th-grade history class, don’t worry: 1610 is a year in which nothing much happened at all. In fact, that’s why the authors chose it.
Recently, I tuned in to watch a 60 Minutes television story on an experimental cancer treatment being hailed as near miraculous. As I saw the face of one white patient after another white patient cured by injecting the polio virus into a brain tumor, I started to wonder: where are all the black people? Or Hispanics, or Asians? It brought to mind the popular campaign and Twitter hashtag Black Lives Matter.
Many readers here have no doubt spent at least some time thinking about the Singularity, whether in a spirit of hope or fear, or perhaps more reasonably some admixture of both. For my part, though, I am much less worried about a coming Singularity than I am about a Sofalarity in which our ability to create realistic illusions of achievement and adventure convinces the majority of humans that reality isn’t really worth all the trouble after all. Let me run through the evidence of an approaching Sofalarity. I hope you’re sitting down… well… actually I hope you’re not.
Every book on torture that I have browsed is devoted mainly to methods of torture and then to three topics: ethical arguments against torture, utilitarian arguments against torture, and the history of the rejection of torture. I cannot find a neuroscientist or psychologist who has thought of writing about the exact opposite: What were the ethical justifications for torture? What were the utilitarian arguments for it? And what is the history of its widespread adoption?