Relinquishment is the idea that the dangers of Emerging technologies are so great and unmanageable that those technologies must be abandoned, or relinquished. The term was popularized by Bill Joy's famous 2000 Wired article "Why the Future Doesn't Need Us," which argued that the dangers of technologies such as NBIC were so great that the only solution was to stop using them.
KurzweilAI.net defines relinquishment as "To yield or give way to an opposing idea. For example, modern Luddites argue that the best way to prevent possible negative outcomes of future technological development is by relinquishment of all development of new technology in the fields where threats may arise."
While technoprogressives recognize the risks of technology and the need for its Regulation, they argue that even if relinquishment were desirable, it would not be a feasible policy that could actually be implemented.
The chances of all the countries in the world relinquishing technologies that provide substantial benefits to quality of life, survival, and military advantage are essentially nil. The world's governments cannot even effectively coordinate a transition away from a carbon-based economy to combat Global warming, let alone relinquish entire fields of technology.
There are many other arguments against relinquishment, including its unethical and totalitarian nature, as well as the increased dangers of partial relinquishment. If policies were implemented to officially ban certain fields of emerging technologies, development of those technologies would simply go underground without Regulation and with more use by terrorists, rogue nations, or other malicious actors.
Furthermore, even if relinquishment were remotely feasible and safe, the risks and benefits of NBIC technologies must also be weighed against the horrific conditions currently endured by much of humanity. As Max More notes, "Billions of people continue to suffer illness, damage, starvation, and all the plethora of woes humanity has had to endure through the ages."
In arguing against relinquishment, Ray Kurzweil considers the dangers we face every day, such as the existence of enough nuclear warheads to destroy all mammalian life. He points out that, if asked, most people from a few centuries ago might have found facing such risks a ludicrous exchange for improvements to life, yet very few of us today would seriously consider casting aside our current technologies in exchange for the shorter and greatly reduced quality of life of a few centuries ago.
Although relinquishment is often thought of as a relinquishment of all Emerging technologies, it could be more limited in scope. For instance, a particular field or sub-field deemed especially dangerous could be relinquished, but even these somewhat more limited implementations are inadvisable. However, even optimists such as Kurzweil support guidelines for technological relinquishment limited to a specific field, such as the Foresight Institute guidelines for nanotechnology. The Foresight Institute guidelines call for the "relinquishment" of physical entities capable of self-replication in a natural environment, and the banning of self-replicators that contain their own self-replication codes, as opposed to storing them on a secure centralized server.