The Complexities of Sociotechnical Architectures
Trust is often understood to be a mental state, or a complex attitude, constructed of a tripartite relationship involving trustor(s), trustee(s), and actions/behaviors, with each relevant to the attainment of a goal. In this issue of T&S Magazine, our community explored technologies set against the backdrop of the contextual nature of trust, as well as the impact of technologies on trust construction and trust violation.
Taking into account trust in the environment and in the infrastructure (e.g., the sociotechnical architectures), our community delved into a variety of trust realms [1], [3]. We acknowledged the tensions that arise when actors confront high-stakes events under adverse conditions. We wrestled, once again, with the situational complexities surrounding weaponized technologies that terminate hostile targets. We explored how human and non-human actors could strengthen trustworthy decision-making by taking levels of certainty into account. We also looked to antifragile systems; we learned how microgrids can be used to rescue victims in life-threatening environs, so that trustees can expedite disaster relief efforts.
Our community examined how technologies increasingly impact the construction of trust. We looked to artificial swarm robots to improve the modulation of idea flow for prosocial behavior in social networks. We considered time scarcity: we must sometimes forgo the iterated interactions that so often build trust, and swift trust [2] can become obligatory. We also exposed technologies utilized for nefarious purposes. Social trust is violated when surveillance technologies fuel clandestine campaigns to construct oppressive systems used to control (and punish) members of society. Social trust can also be eroded when trust-abusing actors manipulate reality with deepfake videos, or when we carelessly relinquish the human aspects of caring, thinking, feeling, and dying to non-human actors. We pondered: How will these developments affect our decisions to trust?
As trustors, we are traversing minefields to ascertain the trustworthiness of human and non-human trustees. We work to define the actions/behaviors in rapidly emerging technological spheres in order to determine what (and who) is, and will be, worthy of trust, and at what levels. Thus, we perceive that trust emergence [1], [3] becomes far more complex. Because social constructs such as trust remain essential to a robust society, our community must remain committed to exploring the consequences of technology.
Author Information
Christine Perakslis is Associate Professor in the MBA Program, College of Management, Johnson & Wales University, Providence, RI. Email: christine.perakslis@jwu.edu.