Does predictability provide an overriding concept and perhaps a metric for evaluating when LAWS are acceptable or when they might be unacceptable under international humanitarian law? Arguably, if the behavior of an autonomous weapon is predictable, deploying it might be considered no different from, for example, launching a ballistic missile. This, of course, presumes that we can know how predictable the behavior of a specific autonomous weapon will be.
IEET co-founder Nick Bostrom, IEET Fellow Wendell Wallach, and Affiliate Scholar Seth Baum are Principal Investigators on projects funded by Elon Musk and the Open Philanthropy Project and administered by the Future of Life Institute.
IEET Fellow Wendell Wallach recently co-published an article with ASU law professor Gary E. Marchant in the National Academy of Sciences' journal Issues in Science and Technology. The piece, entitled Coordinating Technology Governance, explores the need for, and application of, a nimble, authoritative coordinating body, referred to as a Governance Coordination Committee, to fill an urgent gap in the assessment of the ethical, legal, social, and economic consequences of emerging technologies.
“The Terminator” is clearly science fiction, but it speaks to a deep intuition that the robotization of warfare is a slippery slope—the endpoint of which can neither be predicted nor fully controlled. Two reports released soon after the November 2012 election have propelled the issue of autonomous killing machines onto the political radar.
Day Two of the Moral Brain conference at New York University, co-sponsored by the IEET, is largely devoted to a review of the last ten years of research on the neuroscience of moral sentiments and decision-making, with talks by Paul Bloom among others.
In January, IEET Executive Director J. Hughes and IEET Fellow Wendell Wallach met with representatives of the Japanese Consortium on Applied Neuroscience, who visited Trinity College as part of a national tour to meet with American neuroethicists.
Robots with even limited sensitivity to ethical considerations and the ability to factor those considerations into their choices and actions will open up new markets. However, if robots fail to adequately accommodate human laws and values in their behavior, there will be demands for regulations that limit their use. Over the next twenty years, advances in robotics will converge with neurotechnologies and other emerging technologies. We will be confronted not only with monitoring and managing individual technologies, each developing rapidly, but also with the cultural transformations arising from the convergence of many technologies. Technological development can overheat, or it may stagnate. The central role for ethics, law, and public policy in the development of robots and neurotechnologies will be in modulating their rate of development and deployment. Compromising on safety, appropriate use, and responsibility is a ready formula for inviting crises in which technology is complicit.
Wendell Wallach, a lecturer and consultant at Yale University’s Interdisciplinary Center for Bioethics—and recently appointed as an IEET Fellow—has emerged as one of the leading voices on technology and ethics.