Soapbox Post

In the years immediately following World War II, a debate raged among U.S. policy officials over whether to place nuclear weapons – and the technological production systems that made them possible – in the hands of the military. They decided no, instead establishing the Atomic Energy Commission as a civilian nuclear weapons agency. Their goal: to ensure democratic control over the production and use of this most dangerous form of technology.


I wonder, today, whether the United States ought to ask the same question about technologies of human enhancement. Should the production and use of human enhancement technologies for military purposes be placed in the hands of a civilian agency?


The basic argument for a yes answer is relatively simple. The militarization of human enhancement technologies is potentially extremely dangerous – to individuals and to democracy – in part because military organizations are hierarchical and secretive, and in part because the battlefield is arguably the most competitive, coercive, and destructive context on the planet. The most enhanced soldiers are likely to be put on the deadliest missions (indeed, they already are). Unanticipated flaws in, or consequences of, the design of enhancement technologies may create glaring vulnerabilities in conflict settings. Yet the history of militaries around the globe demonstrates that neither the welfare of individual soldiers nor concern for the long-term societal consequences of particular technologies long survives the drive for greater military effectiveness. Witness, for example, recent debates over torture, landmines, and depleted uranium.


The U.S. military clearly believes that both physical and cognitive enhancement of its soldiers hold significant military value. As Jonathan Moreno details in Mind Wars, military research agencies like the Defense Advanced Research Projects Agency have spent untold billions on efforts to enhance human physiological and neurological capacities in search of the next generation of military advantage, all in the name of continued U.S. national security. See also the 2008 National Research Council report, Emerging Cognitive Neuroscience and Related Technologies, commissioned by the Defense Department’s Defense Intelligence Agency.


It is hard to deny the potential military value of enhancements – although if we arrive at the point where we can protect our national security only by transforming ourselves into high-tech battle machines, I will count it a serious failure of humanism and of the human race. Nonetheless, I would suggest several reasons to take decisions about human enhancement research, and its application to soldiers, out of the hands of the military.


First, soldiers are not just instruments, and the military has a poor record of allowing its personnel to opt out of dangerous experiments. It also has a culture of machismo not unlike the one that makes it so difficult to protect NFL players from the long-term dangers of concussions, as well as a hierarchical command structure that brooks little disobedience of orders. Human rights demand stronger protections for soldiers in the realm of human enhancement than may be possible within military organizations.


Second, the long-term civilian implications of human enhancement are too great to be ignored in the development of human enhancement for military purposes. Beyond such blunt concerns as the fear of creating super-humans who would outcompete their civilian counterparts lies a more subtle threat to democracy from human enhancement. The concept of achieved merit lies at the core of modern liberal forms of democratic governance, as John Carson has so brilliantly demonstrated in his comparative history of intelligence testing in the U.S. and France. Yet that idea is potentially fatally undermined by human enhancement technologies.


Decisions about how to proceed with human enhancement – if at all – must be made democratically, with full recognition of the uncertainties that surround these technologies and of their consequences, both for individuals and for the arguably increasingly fragile political underpinnings of democratic societies. That can only happen if those decisions are made explicitly, by democratic institutions, and not by default by military commanders looking for the next battlefield edge.


About the Author:  Clark Miller is associate director of CSPO and associate professor of science policy and political science.
Comments
Clark Miller
Jan 31, 2010 @ 11:06am
These are valuable points, both. Thanks. I am working with Michael Burnham-Fink on a longer piece about this issue (see his reply at his blog: http://wealoneonearth.blogspot.com/). Clearly both problems will need to be addressed in any effort to meet these challenges in the future.
richard m. o'meara
Dec 22, 2009 @ 4:40am
I take your point, and yet I think we are well beyond the military/civilian distinction that existed post-WWII. We saw this in the debate over enhanced interrogation techniques. The military was the primary restraining bureaucracy in 2005/06, and yet the civilian bureaucracies (CIA etc.) continued unrestrained. I like the idea of one agency watching over these emergent technologies, but it has to apply to all parts of government and, where possible, civilian centers of innovation as well. rom
Frank Laird
Dec 19, 2009 @ 9:33am
Clark Miller’s analogy for civilian regulation of human enhancement technologies is flawed. The AEC did not really control nuclear weapons. The agency never said “no” to any military request. In addition, the AEC paid for the weapons, so the military did not even face the financial constraint of having to trade off getting those weapons against some other expense. The AEC itself was set up to be minimally responsive to democratic institutions, with fixed-term commissioners and its own Congressional committee, the Joint Committee on Atomic Energy. The AEC was at least partly insulated from the wishes of the President, so long as it had its Congressional protectors – the role the JCAE saw for itself. The AEC not only pushed the full-on development of nuclear weapons, it also retaliated against insiders who dissented from any of its programs, as it did to J. Robert Oppenheimer when he resisted development of the H-bomb. I don't see any example of restraint there.