These remarks were delivered at the close of the 9th Workshop on the Social Implications of National Security (SINS16), Melbourne, Australia, July 12, 2016.
We have heard and reflected here at the 9th Workshop on the Social Implications of National Security (SINS16) on a diverse range of issues, presented by an excellent array of speakers: from Prof. R.E. Burnett’s insightful presentation on the human information appliance schema and Prof. Donna Dulo’s visionary quandary of “NanoTech and the Military: National Threat or National Security?,” to the non-technical aspects of our subject as presented by Dr. Rain Liivoja in his informative talk on “Implantable Technologies in the Armed Forces: Some International Law Considerations,” Tim McFarland’s searching presentation on “Can Implants Be Weapons Under the Law?,” and the challenging views of Dr. Rob Nicholls on “Implanting Military Rights and Wrongs.” My comments here by no means detract from the valuable contributions of the other speakers not mentioned.
As to my own thoughts, I question whether technology has progressed to the point where the human condition is now the limiting factor, both physically and psychologically. I recall a presentation I attended here in 2014 by Prof. David Forbes, Director of the Phoenix Australia Centre for Posttraumatic Mental Health, in which he presented post-traumatic stress disorder (PTSD) as a significant factor for U.S. Air Force pilots conducting drone operations. We discussed then whether the human-in-the-loop could become the limiting factor in such operations. Reflecting on that discussion in the context of this Workshop, I suggest we are facing the emergence of a new condition, which I call “cognizant stress disorder” (CSD): the fear of an expectation to perform, or of unknown effects, as a direct result of being fitted with an implantable technology. This is of course the direct opposite of the alter-ego possibility, in which the “host” takes on a state of “invincibleality” (my word), or a “superhero” (i.e., invincible) mentality, which may manifest itself in unintended reckless or careless behavior. Both of these conditions are symbolic of how human thought processes might react to a host being “enhanced” by an implantable technology.
So I question, are we about to go — or have we already gone — beyond what we refer to as C3 — Command, Control and Communications? I suggest we are fast approaching C9 or greater. Let us consider a C9 schema in the human condition context. If C3 is as traditionally defined above, then:
C4 is Cybersecurity — if we embrace C3, we then also need to ensure that implantable technologies — and their hosts — are secure, especially from cyber-attacks.
C5 is Capability — what is the real capability we are trying to build in the military context; a super-athlete, a super-spy, etc.? This raises a myriad of ethical questions relative to intent.
C6 is Capacity/Condition — there must be a “body solution” to match the technology solution. Having been “enhanced” by the implantable, does the “human appliance” (as R.E. Burnett described the host) have the physical capacity (body strength, muscle tone, etc.) and mental condition to perform to the level expected to achieve the mission? If not, this could give rise to CSD beforehand, failure during the operation (through physical exhaustion or mental stress), or PTSD post-event.
C7 is Consideration (or Consciousness) — where C6 addresses the “body solution,” C7 addresses the “personality solution.” Is the “human appliance” a good match for the implantable and the intended result?
C8 is Compromise — what level of freedom (i.e., autonomy) is given to the “human appliance” to exercise judgement over the effects (and commands) of the implantable?
C9 is Confidence — the “human appliance” must have confidence both in themselves and in the technology of the implantable to achieve the tasked mission, which plays back to vulnerability and the superhero mentality (i.e., C4, C5, and C6).
As we delve into the layers of the onion that represent the socioethical implications of implantable technologies in the military sector, we must question whether we are indeed witnessing the emerging age of the human robot. If so, then we are also rapidly approaching the age of the human robotic soldier, where the boundaries of “enhancement” and “manipulation” become obscure in terms of the level of human autonomy — raising ethical questions about scenarios that are no longer beyond our technological capability. For example, a soldier might voluntarily agree to host an implantable technology that then produces involuntary actions and results.
Other serious issues also need to be addressed — for example, detectable versus undetectable applications. While our home command will want to trace and interrogate coalition soldiers fitted with implantables, for obvious reasons it will not want them to be detected by enemy forces. Because many technology innovations and applications born out of military programs eventually find their way into the public space, we also need to consider other applications, and the ethics and national security implications that come with them, including impacts of scale and manageability, legal accountability of actions, etc.
These are just some of my thoughts stimulated by today’s Workshop, which I trust you enjoyed as much as I did. Congratulations to Katina Michael and her team on organizing this event to explore such a controversial and timely topic.
Philip Hall is Principal Fellow, Department of Electrical and Electronic Engineering, The University of Melbourne, Melbourne, Australia.