Research to Reality: Scholars as Agents of Change

March 10, 2025

Ketra Schmitt and Pamela Tudge

 

It is a truth universally acknowledged that a society in possession of sociotechnical challenges must be in want of scientific evidence (adapted from [1]).

Jane Austen’s famous opening line may have been about marriage, but we can apply something similar to science and academic inquiry. Like hopeful parents looking for profitable matches for their children, our collective academic dreams of real-world applications of our research are too often disconnected from reality.

A few years ago, this disconnect was dispiriting. As of this submission, a better word might be appalling. By the time this column runs in March, it may well be criminal. We are witnessing the spectacle of government and institutional decision-making disconnected from careful analysis, scientific evidence, or the selection of policies, practices, and tools that promote the most social good.

Was public decision-making always this disconnected from academic research and best practice? Was there ever a time when careful analysis led to decisions that made the best sense for the common good, or came as close as possible within the strictures of political horse-trading?

Like hopeful parents looking for profitable matches for their children, our collective academic dreams of real-world applications of our research are too often disconnected from reality.

Of course, evidence-based practices have taken hold in many areas over the past decades. However, a striking example of a move away from evidence-based policy was the dissolution of the United States Congress Office of Technology Assessment (OTA). Between 1972 and 1995, this tiny office developed hundreds of scrupulously researched, even-handed reports on topics related to emerging technologies and risks [2]. The cost of running the office was modest, and far lower than the money saved by avoiding poor investments. Nevertheless, the OTA was shut down after the Republican Party gained a majority in the U.S. Congress in 1995. Shuttering the office, and with it the OTA's US$20 million annual budget, was part of House Speaker Newt Gingrich's "Contract with America" [3], [4]. Rather than shrinking or fixing government, the move left the U.S. government vulnerable to partisan or otherwise biased efforts that carried the veneer of evidence and research while actually being oriented toward individual gain. After the termination of the OTA, external research was instead sponsored by industry lobbyists with the explicit aim of funding or passing legislation in the best interest of that industry, not the public.

Although the OTA ended in the United States, the office inspired similar agencies around the world. The European Network of Parliamentary Technology Assessment includes 14 agencies as members and 11 more as associates, including agencies in Argentina and Korea [5].

Within the United States, institutions remain that arguably fill evidentiary gaps for regulators and lawmakers, such as the Office of Management and Budget (OMB), the Government Accountability Office (GAO), and the Congressional Research Service (CRS). OMB seems poised to be weaponized at this moment, but this would not be the first time that OMB has been used to prevent regulatory action or other government intervention. Formalized evidentiary practices for policymaking, like instituting cost-benefit analysis for policies, aim to identify policy solutions that deliver the greatest gain for the fewest dollars. However, those approaches have important limits as well. The Army Corps of Engineers, for example, has a strict cost-benefit rule for implementing flood prevention projects: national economic benefits must outweigh national economic costs [7]. This skews approvals toward areas with higher property values, since flood prevention efforts there protect more expensive property and therefore register larger benefits [8].
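To make that skew concrete, consider a minimal sketch of a benefit-cost calculation. The model and numbers below are hypothetical, not the Corps' actual methodology; they simply illustrate how a benefits-must-exceed-costs rule tied to property values can favor wealthier areas even when flood risk and project cost are identical.

```python
# Toy benefit-cost model with invented numbers (not the Army Corps' methodology).
# Benefits are estimated as the present value of expected flood damages avoided,
# which scales with the value of the property being protected.

def benefit_cost_ratio(property_value, annual_flood_probability, project_cost,
                       horizon_years=50, discount_rate=0.03):
    """Present value of expected damages avoided, divided by up-front cost."""
    pv_benefits = sum(
        property_value * annual_flood_probability / (1 + discount_rate) ** t
        for t in range(1, horizon_years + 1)
    )
    return pv_benefits / project_cost

# Two hypothetical neighborhoods: identical flood risk, identical project cost,
# different property values.
wealthy = benefit_cost_ratio(property_value=500_000_000,
                             annual_flood_probability=0.02,
                             project_cost=200_000_000)
modest = benefit_cost_ratio(property_value=150_000_000,
                            annual_flood_probability=0.02,
                            project_cost=200_000_000)

print(f"Wealthy area benefit-cost ratio: {wealthy:.2f}")  # ~1.29 -> clears the bar
print(f"Modest area benefit-cost ratio:  {modest:.2f}")   # ~0.39 -> rejected
```

Under such a rule, the same levee design passes in one place and fails in the other for reasons that have nothing to do with the risk faced by the people living behind it.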

Science-based principles for governance go back further. The tenets and practices for nuclear safety emerged from academic consensus and research. Military thinking on the winnability of nuclear war was profoundly influenced by academic research, including game theory, computer modeling, and collaborations between academic researchers and military leaders that led to live war game simulations [9]. Of course, when Vannevar Bush [11] laid out a strategy for investment and leadership in science education and basic research in "Science, the Endless Frontier," his words and vision resonated with policymakers and the American people [10].

Our mission at TSM is to understand the social–technical complexities inherent in technological innovation and implementation.

The role of this investment in science, and especially in science and engineering education, is perhaps best summed up in the title of an MIT Science Policy Review report: "Federal R&D funding: the bedrock of national innovation" [12]. The benefits of investment in science and technology are enormous [13]. At the federal U.S. level, defense spending led to the development of laser technologies, RFID, the Global Positioning System (GPS), and the Advanced Research Projects Agency Network (ARPANET), which laid the groundwork for the modern Internet. Massive technological investment is not restricted to the public sector. DeLong [14] argues that industrial research efforts and large amounts of capital led to the development of technologies that materially improved the human condition for many, and that this was reflected in exceptional economic growth between 1870 and 2010.

While these technological breakthroughs have been important and transformative, they have not always translated into benefits for the broader public good or into policies that help ensure that the benefits of technology reach everyone. Indeed, we see an intensification of social inequality, and particularly of wealth inequality.

Here, we return to the disconnect between our scientific advancements and social outcomes. Scientists and engineers regularly develop excellent technological innovations that could help transform people's lived experiences, but these technologies are very often not brought to fruition. Yet it is the vision of improving the public good that draws many, if not most, people into engineering and science, as well as into the formal study of the ways these technologies can be implemented within society.

A common gap in these innovations and frameworks is a misunderstanding or neglect of the social–technical relationship or the broader social context.

This is where the IEEE Society on Social Implications of Technology (our parent organization) and its publications, including IEEE Technology and Society Magazine (TSM), can make essential contributions. Our mission at TSM is to understand the social–technical complexities inherent in technological innovation and implementation. Looking back at recent papers, we see important and impactful research that should translate into much better policies and social outcomes. Recent examples include a December 2024 paper that examined the pager attacks and how they deviated from the typical propagation of war [15]. One article in our Fall 2024 special issue on technology and analytics for global development explored how AI can safeguard water resources by analyzing ecosystems, tracking water quality, and detecting contamination [16]. What is more, TSM has explored exactly this issue before. A guest editorial as far back as Spring 1997, titled "Technical Expertise in Public Decisions," introduced a set of papers that laid out the tensions between bureaucrats, experts, and the public [17]. Like this article, Andrews's editorial opened by noting the increasing tension and disconnect between these parties.

Given the urgency of the climate crisis, wealth and income inequality and affordability, and ongoing global conflicts and genocide, along with the erosion of centuries of democratic norms, it is imperative that academics stand up for their research and its practical applications.

These research findings should, or at least could, result in better implementation of technology, more equity, and better overall outcomes. It should be equally obvious that this does not generally, or even often, happen.

The reality is that research adoption is rarely driven by merit alone; rather, political advantage and the promise of maximum capital returns often determine which innovations gain traction and which are ignored. This is not to argue that good things do not come from sociotechnical research, or from science and engineering research in general. However, too often, our sociotechnical work is disconnected from actual policy implementation. Worse yet, we can identify areas in which businesses fail to take advantage of research that could improve both social outcomes and their bottom lines.

This calls into question several of our most cherished assumptions, both about the way that decision-making works (logically, grounded in research and best practices) and about the meaning of our own work.

It does not have to be this way. Our governments and other regulatory and collective decision-making bodies can incorporate research findings and best practices into their decisions. However, getting to this result requires much more active engagement and participation in the regulatory and even political process than many academics are comfortable with.

Given the urgency of the climate crisis, wealth and income inequality and affordability, and ongoing global conflicts and genocide, along with the erosion of centuries of democratic norms, it is imperative that academics stand up for their research and its practical applications.

Recent experience indicates that there is no one else who will champion our work or its implementation in the governance process. We need to find new ways to be advocates—not just as researchers, but as engaged public voices. This means forging alliances with activists, communities, and policymakers, amplifying our work beyond academic circles, and actively participating in the public discourse. If we want our research to drive meaningful change, we must step beyond observation and into action.

 

[1] J. Austen, Pride and Prejudice. London, U.K.: Ruskin House, 1894. [Online]. Available: https://www.gutenberg.org/cache/epub/1342/pg1342-images.html

 

Author Information

Ketra Schmitt is an associate professor at the Centre for Engineering and Society and an associate member at the Concordia Institute for Information Systems Engineering, Gina Cody School of Engineering and Computer Science, Concordia University, Montreal, QC H3G 1M8, Canada. She is also the Editor-in-Chief of IEEE Technology and Society Magazine and serves as a board member of the IEEE Society on Social Implications of Technology. Email: ketra.schmitt@concordia.ca.

Pamela Tudge is a Canadian scholar whose recent research takes an interdisciplinary approach to studying domestic food waste in Canada. She is a research associate at the Systems Risk Laboratory, Concordia University, Montreal, QC H3G 1M8, Canada, and a lecturer in the Department of Sociology and Communication Studies at Alexander College, Burnaby, BC V5H 4T6, Canada. Tudge has a PhD from Concordia University, where she examined food waste systems through the lens of women’s domestic histories, domestic design history, and pedagogical models for critical design approaches to waste. For over 15 years, her work has explored and critiqued cartographic and communication technologies to enhance public literacy on sustainability issues in Canada. She is an assistant editor for IEEE Technology and Society Magazine.
