Objectifying Humans
Daniel J. Neumann
2013-10-14

In Radical Evolution, Joel Garreau postulates that humanity will either a) fulfill the utopian dream of Kurzweil and the other optimistic futurists (what he calls the “Heaven Scenario”), b) succumb to the greatest corruption and ill will humanity has ever experienced (the “Hell Scenario”), or c) rein technology in as a tool of our will, with people deciding their own destiny (what he terms the “Prevail Scenario”). I’ll examine this seemingly even-handed analysis of the exponential curve toward the “Singularity” and its implications from the perspective of Jacques Ellul.



Ellul believed that technique, the efficient methodology of any craft, has become autonomous from individuals. Classical technology, then, is fundamentally different from modern technology. In the classical age, humans designed tools to augment their technique. The device didn’t have a technique designed into it; we supplied the technique. For example, a hunter might sharpen a piece of obsidian to aid in the task of hunting. The blade yields food more efficiently than the former means (perhaps wrestling and choking an animal with bare hands). Compare this to a modern piece of technology, such as the Google search engine.



The search engine emerged from a complex interplay of other modern technologies, such as electricity, computers, and the advent of web addresses. The problem to be solved was not merely finding what you’re looking for, but a human’s inability to navigate the web without an index. Another problem arises from this tango: more Americans, relying on Google as an external memory for much of their time and skimming the surface of an information ocean, are being diagnosed with Attention Deficit Disorder. A new technology arises to fix the human problem: pharmaceuticals like Adderall™. Someday, if Kurzweil is right, neuroscience and nanotechnology may converge to augment the human brain so it can receive data from the internet directly.



Perhaps genetic reprogramming would make us better suited to the technology. The difference between classical and modern technology should thus become clear: in the modern age, technologies created out of a complex web of forces (not contingent on any one inventor’s will) design humans to augment technique. Humans are seen as impediments, as extra variables for failure; just look at Google’s work on autonomous vehicles. Society builds technology so that we can build even better technology, better in the sense of more efficient. Therefore, in the modern age, it’s not the technology that adapts (its one constant quality is an escalation of capability); it’s the human beings. It’s not humans that are autonomous; it’s technique that drives society.



Technology has shifted from being a means to being an end. We’re accommodating our mindset to technology. This starves us of our spirituality, divorcing the why from the how, the reason for existing from the reasoning of thought, the organic from the mechanical, and it stunts our knowledge base, turning a generational work-in-progress into fast-food-style science procurement that stomps on anything not deemed efficient. We’ll begin specializing ourselves to specific tasks, deconstructing, deducing (and doing a lot of destroying), without seeing the overlapping associations that make up a larger picture.



By forgetting the arts and traditional culture, we lose our humanity. It’s like lobotomizing the collective right hemisphere of our brains. It compromises our wisdom. For Ellul, technology isn’t truly advancing humanity; it’s just extending technology’s own reach. It may be limiting our potential as meaning-makers.



“Even if technology is advancing along an exponential curve, that doesn’t mean humans cannot creatively shape the impact on human nature and society in largely unpredictable ways. Technology does not have to determine history” (224). Ellul wouldn’t necessarily disagree with Garreau’s point wholesale. However, even if technology doesn’t have to determine history, we chose long ago to let it do so. By now, he would argue, we’ve already crossed the threshold of no return. The system is firmly in place. The only thing that could stop this train, so to speak, would be an electromagnetic pulse from a nuclear attack or a geomagnetic storm from our sun.



“Almost unimaginably good things are happening, including the conquering of disease and poverty, but also an increase in beauty, wisdom, love, truth, and peace” (130). The false assumption here rests on the definition of these terms. What disease is being cured? Cancer and AIDS may be ailments worth destroying, but what if the disease to be eradicated is an essential component of being human? Garreau may claim that poverty will be conquered, but is poverty defined only by monetary comfort?



What meaning is intended behind “beauty, wisdom, love, truth, and peace”? If we see through only a technical perspective, beauty may be deconstructed into a neuro-aesthetic principle of symmetry; wisdom, into a collection of relevant information; love, into the release of serotonin; truth, into a binary assessment of the absence or presence of predictability, mechanizing abstractions and dehumanizing ideas that people once fought over. Peace could very well be the efficient control of a population by coercion or drugs. Jacques Ellul would feel compelled to ask these questions of Garreau, believing that technology influences our thinking.



Ellul would see the Hell Scenario as the most probable outcome of Garreau’s examination of technology. “Almost unimaginably bad things are happening, destroying large chunks of the human race or the biosphere, at an accelerating pace… Even in the face of such disasters, no agreement is being reached to slow down or stop the spread of these technologies” (184). The technologies surrounding and sustaining our society have become so complex and intertwined that humans will have no choice but to create new technologies to fix the problems of old technologies, guard against the uncertain nature of human behavior, or maintain the growth of technology that keeps our economy (predicated on growth) afloat. It’s no longer our will.



No one is particularly responsible for this loss of human liberty (which means there is no one accountable for it, and no one capable of fixing it), but modern technique now drives society as much as human beings do. Humans are now the tools for actualizing efficiency, according to Ellul, anyway.



References



Garreau, Joel. Radical Evolution.



Ellul, Jacques. “The Autonomy of Technique.”