Application Programming Interfaces Are Not a Substitute for Ethics
Jamais Cascio
2009-09-11 00:00:00

I've been doing quite a bit of work on the impacts of the emerging tools allowing us to manipulate our perceptions of the world (e.g., neurotechnology), our physical environment (e.g., geoengineering), and the building blocks of the material world itself (e.g., synthetic biology and molecular nanotechnology). There's a theme that recurs across all of these arenas: what happens when someone does something careless or malicious with the technology? It's bad enough when the technology in question is an automobile or computer network. These emerging disciplines fall into a category I sometimes call "catalytic innovations," and one characteristic is that the worst-case misuse scenarios can be truly terrifying.


For some, the knee-jerk response is a desire to prohibit the development of these technologies. As appealing as that might sound, it suffers from a fundamental flaw: these technologies do not require a massive industrial base, so surreptitious development would be far harder to detect than (say) nuclear weapons development. Ultimately, the only way to enforce the ban would likely be with constant, unrelenting, global surveillance. Few of us, even those afraid of the potential of these catalytic technologies, would be willing to take that path.

A more nuanced response, and one that I see frequently from the proponents of these various technologies, is that well-designed systems could make catastrophic misuse difficult, even impossible. A synthetic biology lab-in-a-box, for example, might be pre-programmed with a variety of forbidden combinations of bio-components, perhaps with limiting and tracking components built into every synthbio design. A molecular nanofactory could have similar restrictions. Whatever the system, if there's a programming interface, there's the potential for automatic limits on output.
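To make the idea concrete, here is a minimal, purely hypothetical sketch of that kind of interface-level restriction. None of these names correspond to any real synthetic-biology or nanofactory API; they only illustrate the general shape of an output filter that checks a fabrication request against a list of forbidden combinations before the device will run it.

```python
# Hypothetical sketch of an interface-level output limit (all names are placeholders).
# A fabrication request is checked against forbidden component combinations
# before the device agrees to run it.

FORBIDDEN_COMBINATIONS = [
    {"component_a", "component_b"},                  # known-dangerous pairing
    {"component_c", "component_d", "component_e"},   # another known-bad recipe
]

def is_request_allowed(requested_components: set[str]) -> bool:
    """Reject any request that fully contains a forbidden combination."""
    return not any(combo <= requested_components for combo in FORBIDDEN_COMBINATIONS)

# The weakness discussed below: the filter only knows the combinations someone
# thought to forbid. A novel, unlisted combination -- or a harmful design built
# entirely from individually "safe" parts -- passes through.
print(is_request_allowed({"component_a", "component_b"}))   # False: blocked
print(is_request_allowed({"component_a", "component_x"}))   # True: allowed
```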

This is a manifestation of a philosophy I see quite often online across a wide array of subjects, that of "tools, not rules" -- don't try to get people to change their behavior; instead, alter the systems to shape the results of that behavior.

But this model does little to prevent misbehavior arising from novel approaches, or abuses that fall within the system's rules yet are still harmful...
