Widening Divides, or Bridging Them
Mike Treder
2009-05-16

Consider the implications of this next stage of human brain development. Would it widen the gulf between the world's haves and have-nots -- and perhaps even lead to a distinct and dominant species with unmatchable powers of intellect?


This comes from a recent and much-discussed article in New Scientist titled "Will designer brains divide humanity?"

One view is that this is merely the next phase in a process that has been taking place throughout human history. Humans have always played an active role in improving their own brainpower, says Lambros Malafouris of the McDonald Institute for Archaeological Research in Cambridge, UK, who was one of the organisers of the Berlin meeting. It began with inherited gene mutations that gave us uniquely "plastic" brains, capable of changing physically to meet hitherto unassailable intellectual and practical challenges.

More recent changes have been moulded through our interactions with the physical environment, and by the socially created "memes" passed down through culture. Milestones in human brain improvements over the past 2 million years have included the invention of gestures and language to describe our thoughts to others, as well as the written word and our ability to commit everything to permanent records. . .

Today, our minds are even more fluid and open to enhancement due to what Merlin Donald of Queen's University in Kingston, Ontario, Canada, calls "superplasticity", the ability of each mind to plug into the minds and experiences of countless others through culture or technology. "I'm not saying it's a 'group mind', as each mind is sealed," he says. "But cognition can be distributed, embedded in a huge cultural system, and technology has produced a huge multiplier effect." In other words, humans already have minds evolving beyond anything seen before in history.


And what should we look forward to next?

The next stage of brainpower enhancement could be technological -- through genetic engineering or brain prostheses. Because the gene variants pivotal to intellectual brilliance have yet to be discovered, boosting brainpower by altering genes may still be some way off, or even impossible. Prostheses are much closer, especially as the technology for wiring brains into computers is already being tested. . .

It won't be long before "clip-on" computer aids become available for everybody, says Andy Clark, a pro-enhancement philosopher at the University of Edinburgh in the UK. These could be anything from memory aids to the ability to "search" for information stored in your brain. "We'll get a flowering of brain augmentations, some seeping through from the disabled community," he says. "I see them becoming fashion items, a bit like choosing clothing."

Malafouris also believes such augmentation is the next logical stage in human development. "If we accept that tool use was part of the reason we came to develop language, then why should we perceive neuro-engineering as a threat rather than as the new stone industry of the 21st century?"


Of course, it's not all peaches and cream:

Not everyone thinks this is a good idea, however. Dieter Birnbacher, a philosopher at the University of Düsseldorf in Germany, says there are risks in technological self-improvement that could jeopardise human dignity. One potential problem arises from altering what we consider to be "normal": the dangers are similar to the social pressure to conform to idealised forms of beauty, physique or sporting ability that we see today.

People without enhancement could come to see themselves as failures, have lower self-esteem or even be discriminated against by those whose brains have been enhanced, Birnbacher says. He stops short of saying that enhancement could "split" the human race, pointing out that society already tolerates huge inequity in access to existing enhancement tools such as books and education.

The perception that some people are giving themselves an unfair advantage over everyone else by "enhancing" their brains would be socially divisive, says John Dupré at the University of Exeter, UK. "Anyone can read to their kids or play them music, but put a piece of software in their heads, and that's seen as unfair," he says. As Dupré sees it, the possibility of two completely different human species eventually developing is "a legitimate worry".


I find it quite interesting that this new worry should be expressed and debated so actively within the same week that scholars around the world are commemorating the 50th anniversary of C.P. Snow's famous lecture on "The Two Cultures." Here is what my friend and colleague Andrew Maynard says about it on his blog, 2020 Science:

Fifty years ago, long before Richard Dawkins coined the term “meme,” the British scientist, public figure and novelist Charles Percy Snow planted an idea into the collective consciousness that has since grown to have a profound influence on science and the arts in Western society. Sadly, it wasn’t the idea he necessarily wanted to plant. So while the relevance of Snow’s “two cultures”—representing the divide between the scientific and literary elite of the day—has been debated and deconstructed ad infinitum over the intervening decades, Snow’s real passion—tackling material poverty through science and technology—has largely been ignored…

In 1963, Snow wrote a follow-on piece to the 1959 lecture. In “Two cultures: A second look” C.P. Snow addressed the concerns of his many critics. But he also took the opportunity to clarify and expand on what he was trying to convey four years earlier. Freed from the constraints of crafting a short and somewhat simple public lecture, he wrote compellingly on science’s place in society, and the absolute necessity of using it for the social good—something he only saw the cultural divides around him obstructing.


Andrew goes on to quote Snow, who wrote:

We cannot know as much as we should about the social conditions all over the world. But we can know, we do know, two most important things. First we can meet the harsh facts of the flesh, on the level where all of us are, or should be, one. We know that the vast majority, perhaps two-thirds, of our fellow men are living in the immediate presence of illness and premature death; their expectation of life is half of ours, most are under-nourished, many are near to starving, many starve. Each of these lives is afflicted by suffering, different from that which is intrinsic in the individual condition. But this suffering is unnecessary and can be lifted. This is the second important thing which we know—or, if we don’t know it, there is no excuse or absolution for us. . .

We cannot avoid the realization that applied science has made it possible to remove unnecessary suffering from a billion individual human lives—to remove suffering of a kind, which, in our own privileged society, we have largely forgotten, suffering so elementary that it is not genteel to mention it.


In conclusion, Andrew says:

Fifty years on, a lot has changed. Approaches to education are different. There is extensive and productive cross-talk between the science and the arts. And national and global cultures have evolved. Yet the central problem Snow faced remains: we live in a world divided into the rich and the poor; where the majority of people don’t have access to necessary material needs—food, water, shelter, medical treatment; where science and technology are increasingly able to bridge this divide, if only they were used effectively. The unfortunate irony is that, by using the two cultures as a light to illuminate the problems facing society, Snow ended up creating a smokescreen that has, if anything, helped to obscure them.

The reality is that Snow’s 1959 lecture and 1963 essay are even more relevant now than they were 50 years ago—not because of the culture issues they address, but because in a society that is increasingly dependent on science and technology, we still haven’t got a good grasp on how to use them to make life better for the poor as well as the rich.


To drive the point home, consider the inequities described in this essay on "The Real Perils of Human Population Growth":

About forty years ago, the world population was only 3.5 billion, or about half of the present population of 6.7 billion people. Most of us seem to ignore or be unaware of the magnitude of this rapid expansion and the vast changes that it is causing throughout the world. Indeed, the daily and even the annual impacts of this growth go unnoticed. Yet the impacts of the growing world population on land, water, energy, and biota resources are real and indeed overwhelming. . .

According to the World Health Organization, nearly 60 percent of the world population now is malnourished—the largest number and proportion of malnourished people ever reported in history. Further, many serious diseases, like malaria, HIV/AIDS, and tuberculosis are increasing, not only because of worldwide malnutrition but also because the increasing density and movement of human populations facilitate the spread of diseases.


It's a long article, full of facts and statistics, and it makes for fairly grim reading, but it is well worth the time.

We live in a world today of great inequity. By almost every measure you can name -- food, water, shelter, sanitation, health care, income, education, freedom -- the "rich" (or what we in the West would regard as middle class) are far better off than the poor, who still make up the majority of the world's population.

Snow decried this division of resources and opportunities 50 years ago. He wasn't the first, of course, but he did it with eloquence and integrity. Half a century later, have we made much progress?

As powerful emerging technologies come online in the next half century, how much more progress -- if any -- will we make in bridging the divide between the 'haves' and the 'have-nots'?