
There has been a long-running discussion among technology and society scholars, perhaps mostly in places where national sovereignty feels assured, about the ethics of engaging in defense-related research. Defense has been a common theme in this publication. Previous iterations of this magazine featured scope statements for the IEEE Society on Social Implications of Technology (SSIT). Figure 1 shows scope statements from June 1986 and Summer 2010, both of which included “Peace Technology” as a bullet point in a 12-point list. Before SSIT published the IEEE Technology and Society Magazine, SSIT was the Committee on the Social Implications of Technology (CSIT), and the magazine was a newsletter. Figure 2a shows the front page of the September 1980 newsletter, titled “Machines Don’t Fail, People Do,” which describes the human-centered causes of defense failures [1]. Figure 2b shows a different headline from the same issue, “Defense through Decentralization” [2]. Issues of defense have been integral to our magazine since before it was a magazine.
Figure 1. (a) Scope statement from the IEEE Technology and Society Magazine from June 1986, showing “Peace Technology” as the last bullet point in a 12-point list. (b) Scope statement from the IEEE Technology and Society Magazine from Summer 2010, which included “Peace Technology” as the 11th bullet point in a 12-point list.
Figure 2. (a) Front page of the September 1980 newsletter, titled “Machines Don’t Fail, People Do,” which describes the human-centered causes of defense failures. (b) Another headline from the September 1980 newsletter, titled “Defense through Decentralization,” which suggested infrastructure changes that would make targets less attractive to aggressors.
Can You Be Ethical and Pursue Defense Research?
Still, the notion that engaging in defense research—at all—is essentially immoral is not an uncommon view. I’ve had colleagues express their misgivings about doing any defense-related work whatsoever, including my own work assessing risks from terrorism. So, the thesis of this column is not that engaging in defense-related research is strictly good—or unproblematic. Instead, I want to pose the question: can you be ethical and pursue defense research?
Individual researchers have their own bright lines. A common boundary is rejecting work on weapons systems. It might seem simpler and cleaner to avoid any engagement with defense, or with research projects related to terrorism or public safety. Avoiding such work also makes it easier for the researcher to escape the judgment of their peers.
Why do defense research then? Worldwide defense expenditures are estimated at U.S. $2.4 trillion annually. The United States spends the most in absolute terms (roughly $900 billion), but as a percentage of gross domestic product, Ukraine currently leads the list [3], [4]. Ukraine’s position at the top of that list is a powerful argument for why defense research is worth the serious efforts of researchers. Beyond ensuring that our national funds are spent well and effectively, safeguarding collective safety and liberty is a very persuasive argument. These arguments (the enormous cost, the need to ensure money is spent well, and the vital need for security and peace in an increasingly unstable world) seem like the most logical motivators for engaging with defense research. Engaging in defense is critical to ensuring national security.
The question at the heart of whether engaging in defense research is ethical probably boils down to this: does defense promote peace, or does it promote war?
A common criticism of the defense industry is that the enormous cost of defense actually drives more conflict. The profits available to corporations from war and weapons mean that oligarchs and corporations push for conflict and press governments to pursue more aggressive military agendas [5], [6]. This push for conflict, and the profit-seeking motive behind it, gives some researchers yet more reason to eschew military-oriented research altogether. In my view, the prevalence of this pro-profit, pro-conflict motivation makes the need for ethical, legal, and social implications (ELSI) researchers to engage in defense-related research even more urgent.
This connects to my final argument in favor of engaging with defense issues. What happens when researchers who could work on defense-related issues choose to absent themselves from that conversation because they find it too divisive, too messy, too immoral? Who is left in the room?
The necessity of national defense and social securitization, together with the high proportion of national budgets allocated to defense, ties defense concerns directly to the interests of the IEEE Technology and Society Magazine readership. Defense technology and its deployment are at the heart of the ways that we think about technology and its social implications [7]. This can be at the level of budgetary decision making: how can we best spend funds to promote social good, and who is involved in that process? But socio-technical concerns also govern military resources, including how we manage resources that have reached the end of their physical lifetimes. Examples include unexploded ordnance and landmines, as well as environmental hazards presented by perchlorate and other contaminants from military operations [8], [9]. Defense spending that promotes technological advancement is also important from the perspective of learning how to manage complex dual-use technologies. RFID, ARPAnet (which became the modern internet), robotics, laser technology, global positioning systems (GPS), and many others exemplify the role that defense spending has played in developing consumer technologies that eventually spill over into the public sphere.
What if, instead of being strictly unethical, engaging in defense-related research was a moral imperative? This essay will, of course, not answer this question. But my hope is that the discussions presented in this special issue spark some consideration in the minds of readers.
Security in a Critical Moment
I started this piece months ago. I didn’t think I could predict the future, but the future I imagined then looked much less bleak than it did by the first Tuesday after the first Monday this November. I have friends, family, and colleagues who felt sure of the outcome that actually transpired; I felt sure of the opposite. In retrospect, those with more foresight spent much more time listening to people with ideologies very different from their own. That engagement led them to predict the outcome accurately. Their pessimism, or even realism, protected their hearts from hope. My optimism gave me moments of joy. But neither of these emotional approaches prevented the electoral outcome that has made the world a much less stable place. There is something here about the agency that hope gives us. What I want to suggest is that maintaining peace and preventing war require far more engagement with defense issues by far more members of the public. That public includes us: academics, researchers, and, most urgently, those engaged in ELSI research. Understanding the risks associated with technology and developing appropriate policies and regulation are essential to our collective work; that urgency is only intensified for defense issues.
Following the recent U.S. election, the world faces a radically different security reality. The Bulletin of the Atomic Scientists created the Doomsday Clock as a representation of its collective assessment of the risk of nuclear annihilation. As of this writing, the clock has stood at 90 seconds to midnight since January 2023 [10]. Moving the clock closer to midnight reflected an increased risk of annihilation amid mounting global conflicts. The clock stayed the same for January 2024, but with the recent election, the risk of nuclear war seems to have increased. Nuclear war is invoked as both a risk and a deterrent. In 1987, Gaddis [11] coined the term “the long peace” to describe the period after World War II in which global conflicts were limited. While his use of the term was contested (war still existed), there were fewer conflicts, with fewer deaths, across the world. Some attributed this to the presence of nuclear weapons themselves: mutually assured destruction (MAD) made the use of annihilating tools illogical [12]. In this argument, the existence of weapons with the power of annihilation is a driver of peace. The MAD argument as a driver of peace relies, among other things, on strong controls and wise leadership. A lack of restraint in using nuclear weapons has long been a concern with respect to North Korea, and with respect to rogue governments or terrorists who might access these weapons without exercising sufficient caution or self-control. That this restraint can no longer be relied upon for a major nuclear power is evident and terrifying.
Political Technology
Beyond the theorized role of nuclear weapons in bringing about peace, this technology belongs to a class of technological objects that many STS scholars consider to have consistent social impacts. Winner [13] argued that some technological objects have a political orientation, positing that nuclear weapons, by their very existence, require a hierarchical political order. He and others further argued that even nuclear power requires a security apparatus to control the diffusion of nuclear material into the public sphere.
Nuclear technology, and defense more generally, is inextricably intertwined with academic research on the technology–society relationship. The core policy analysis course in my graduate program in Engineering and Public Policy featured applications of quantitative policy analysis to nuclear risk (in the form of the Nuclear Regulatory Commission’s Fault Tree Handbook [14]) and presented the set of policy tools developed by Bush [15] that laid the groundwork for science policy in the United States at the end of World War II. While military contributions to formal, quantitative methods in technology and society are a fundamental component of understanding technology policy, these formal research collaborations are balanced by active resistance to military research. Norbert Wiener, the father of cybernetics, embodied both direct weapons research and fierce resistance to military pursuits [16]. His work on anti-aircraft targeting systems during World War II is considered fundamental to the development of cybernetics. However, his commitment to ethics and social responsibility eventually led him to eschew all association with military-funded research projects. His chosen path represents one possible response, and one that cost him in career and reputation. I understand and respect this choice. But my hope is that people like him, whose expertise blends the technical and the ethical, will continue to engage in this field to help ensure it remains as ethical as possible.
Good governance requires effective policies, which are grounded in knowledge and insight, and systems for evaluating what works well and what does not. We see the application of expert knowledge and insight to defense reflected within this special issue. The authors who contributed to this special issue provide a framework for setting norms in space, explore the security and ethical implications of repurposing technological objects as weapons, evaluate the social construction of weapons funding choices, and investigate methods for constructing and sharing intelligence information. As these topics show, our authors and readership have immense technical knowledge, along with the ability to provide material assistance in creating a well-functioning and ethical government. While I, like you, long for a world that is peaceful and just, that world has not yet arrived. At least for now, and most likely always, ensuring a functional society that effectively deploys its resources requires careful analysis of security and national defense.
If anything, the intensified uncertainty and security risks of the present historical moment underscore the importance of expert-driven decision making and provide a regrettable lesson in what happens when experts are removed from essential decision-making processes. It is then (perhaps) fortuitous that this issue exemplifies the value of active, critical engagement in defense-related research.
In this Special Issue
Blake et al. [A1] articulate the need to develop collective standards for international behavior and cooperation in space and propose an online repository to facilitate information sharing, collaboration, and commerce, creating more certainty within the space domain.
Lavazza and Farina [A2] outline the risks to traditional understandings of the proper conduct of war when consumer technologies are repurposed for war and terror.
Zafeirakopoulos [A3] explores how cybersecurity can be safeguarded for national security using holistic methods for understanding current and future realities. Using a participatory action research method within a workshop format, the researchers met in interdisciplinary groups to synthesize current research and map a way forward in AI cybersecurity. Using both scenarios and “personas” (fictional characters), study participants were able to explore existing security contexts and, through counterfactuals, develop insight into other potential chains of events. By articulating sense-making as a design practice, broader ways of knowing, including intuition and non-empirical reasoning, are given formal roles within this process. The findings indicate that guided interactions can facilitate information sharing in unclassified contexts. Research like this is critical to developing effective strategies for national security. The SSIT community nurtures this type of critical scholarship; related research was presented in the Social Implications of National Security workshop at the International Symposium on Technology and Society, sponsored by SSIT [17].
Tracy [A4] explores the role of the “hype narrative” in weapons design and the ethical obligation of engineers to properly inform policy makers. By focusing the ethical analysis on the duty to inform, Tracy moves beyond discussions of whether participation in weapons development itself can be ethical and instead asks what role the engineer plays in accurately assessing the technical capabilities of these technologies. This, to me, is perhaps the most compelling argument for ethically oriented researchers continuing to engage with questions of defense. Resisting techno-boosterism and evaluating potential performance accurately, across a range of possible future scenarios, is vital to national security and to the broader ways in which society is collectively run.
All of the articles implicate our narratives and beliefs regarding technology. Indeed, the weaponization of consumer tech bolsters the notion that, perhaps, technology itself is neutral, and it highlights the role and ethics of the humans and organizations who use these technologies. Tracy puts the role of the engineer (and specifically, the defense engineer) into focus when he articulates the “social construction of hype”: engineers can legitimize hype narratives that are then utilized by national defense agencies, politicians, and weapons developers. Technology and society expertise is just as vital to other aspects of security, from safeguarding against current cybersecurity risks to creating an infrastructure for ensuring security in space.
It is exactly the types of analyses shown in this issue that can facilitate better governance, collective responsibility, and more ethical behavior. Returning to the question I posed at the beginning, “Can you be ethical and pursue defense research?,” I answer yes. An important related issue is funding sources. Assuming that engaging in defense research can be ethical, what about accepting defense research funding? Can taking money from defense agencies or weapons manufacturers be ethical? This question is more fraught and intersects with bigger questions about the way that research and academia function. It is possible to conduct this type of research without accepting defense funding or having formal connections to weapons makers or other military-connected organizations. However, by engaging with defense agencies and purveyors, those of us with formal training in technology and society, and especially in ethics, have the possibility of steering these agencies toward more reflection on their impacts, and possibly toward better behavior. This is optimistic, and perhaps naïve. But it is certain that without potential critics, or at least critical scholars, these organizations will face far less pressure to change.
Connecting to SSIT’s Past
A sampling of past defense-related articles published in IEEE Technology and Society Magazine mirrors the ethical tensions inherent in technology and society research on defense quite well. A 2003/2004 article, “The double-edged sword of secrecy in military weapon development,” by Cummings [18], explores the role that political and social factors play in weapons selection. Like Tracy, Cummings evaluates the performance of missile systems and the ways in which the perception of performance is socially mediated. In “Project Hindsight, TRACEs, and What Structured Case Studies Can Say about Innovation,” Pirtle and Moore [19] examined two efforts in the 1960s to investigate the triggers of innovation. Project Hindsight was a DOD project that explored innovation in missile systems; it inspired an NSF project that analyzed the triggers of innovation in five civilian inventions. What causes innovation would seem to be one of the most critical possible areas of study; what struck the authors (and me) as most shocking is how little this original body of research has been explored, particularly in the technology and society literature. I suspect the reason is the same one I articulated at the beginning of this piece.
In “The Human Information Appliance in Combat, Intelligence, and Diplomacy Space,” Burnett [20] highlights the intersecting modes of knowledge that comprise intelligence gathering, and the roles that human and non-human systems have, and will have, in the future of intelligence and diplomacy. This article is a fascinating companion to Zafeirakopoulos’ piece in this issue. In “Autonomous Weapon Systems: Failing the Principle of Discrimination,” Guersenzvaig [21] invokes just war theory to describe the ethical failures of autonomous weapon systems as of 2018 (failures which, in my view, remain today).
The questions raised by IEEE Technology and Society Magazine articles, in previous issues and in this one, are vital and well deserving of more scholarly attention. Writing this piece has led me back to some of the earliest pieces in the magazine’s history and given me the opportunity to explore some of those published between then and this special issue. What strikes me is that our community has remained a critical venue for exploring the complex and challenging ELSI issues that shape societal choices about defense. While the technologies have changed, the essential issues that shape our technological choices have not.
I hope that this editorial and the content in this defense-themed issue can open a conversation among IEEE Technology and Society Magazine readers on the possibility of conducting defense research while remaining ethical.
ACKNOWLEDGMENTS
The inspiration for this piece came from discussions I had at IEEE Ethics 2023, along with many exchanges I have had with peers since I began ELSI-related research. I am indebted to Heather Love for inviting me to participate in the Norbert Wiener and the Future of Work workshop at IEEE Ethics 2023, as well as to the other speakers at both events for their engaging presentations and discussions. I am also indebted to Katina Michael for discussion, insight, and comments related to this piece over the last year, and to Terri Bookman for sharing SSIT lore related to it. Katina and Pamela Tudge provided scholarly insights and, most valuably, suggested counterarguments. All of those mentioned above read and commented on early versions of this piece, strengthened it with their editing, and posed challenging questions. The pieces in this special issue could not have been peer reviewed without the sustained efforts of our associate editors. While these people all made important contributions, the views expressed here are strictly my own.
Appendix: Related Articles
- D. Blake et al., “Emerging space norms: A crucial initiative for global security,” IEEE Technol. Soc. Mag., vol. 43, no. 4, pp. 64–67, Dec. 2024, doi: 10.1109/MTS.2024.3497139.
- A. Lavazza and M. Farina, “The costs and perils of weaponizing consumer technologies (the 2024 pager and walkie-talkie explosions in Lebanon and Syria),” IEEE Technol. Soc. Mag., vol. 43, no. 4, pp. 68–71, Dec. 2024, doi: 10.1109/MTS.2024.3489656.
- M. Zafeirakopoulos, “Sensemaking national security: Applying design practice to explore AI in cybersecurity,” IEEE Technol. Soc. Mag., vol. 43, no. 4, pp. 72–82, Dec. 2024, doi: 10.1109/MTS.2024.3457679.
- C. L. Tracy, “Weapons design, engineering ethics, and the duty to inform: A case study on U.S. hypersonic missile development,” IEEE Technol. Soc. Mag., vol. 43, no. 4, pp. 83–95, Dec. 2024, doi: 10.1109/MTS.2024.3434518.
Author Information
Ketra Schmitt is an associate professor at the Centre for Engineering and Society and an associate member at the Concordia Institute for Information Systems Engineering, Gina Cody School of Engineering and Computer Science, Concordia University, Montreal, QC H3G 1M8, Canada. She is the Editor-in-Chief of IEEE Technology and Society Magazine and serves as a board member for the IEEE Society for the Social Implications of Technology. Email: ketra.schmitt@concordia.ca.