Drones Humanus

Editorial & Opinion, June 29, 2017

Some years ago, a sweet grandma in my (Christine’s) neighborhood was convinced that one of her neighbors was involved in illegal activity. Although my husband and I tried to calm her overactive imagination, she insisted we purchase and deliver binoculars to enable her to perform her civic duty as a self-appointed sleuthhound.

If it had been this year, she could have placed an order online and had a drone deliver the packaged binoculars to her front door [1]. Perhaps next year, she can trade in the binoculars for a perching air drone that will not only fly, but also perform a controlled stall, with actuators allowing its feet to grip a branch of the tree in her neighbor’s yard. The bird-like drone, with motors that can shut down to avoid energy depletion, can sit for long periods of time, recording lots and lots of data [2].

The current environment in which these technologies are emerging causes even the more open-minded members of society to face considerable misunderstandings, exploitation, abuse, and even physical danger, as was evidenced at the Burning Man festival this past year [3]. Not surprisingly, a plethora of issues pertaining to drones arose at the festival, many of which related to privacy. It is apparent that regulations remain fragmented, few, or nonexistent. Yet advances in technology allow for more easily concealed devices [4], revolutionary remote sensing and capture capabilities (e.g., the LIDAR chip) [5], and decreasing costs of acquiring devices [6].

What will we become? We can now buy devices to wear 24/7 that log everything we see and send the data to our lifelog storage in the cloud. Perhaps we are now the bird-like drone, but we move from the sky, to the branch, to the inside of people’s homes, into their workspaces, and alongside them on roads, trains, and planes. We can capture their interactions, their facial expressions, and the intimate aspects of their everyday experiences [7]. There are seemingly no limits [8].

Much as peer-to-peer security has proven effective in reducing disorderly conduct in crowds [9], perhaps people will be paid for drone-like behavior [10]. Perhaps the sweet, civic-minded grandma in the neighborhood, who lifelogs to pass on a heritage to her progeny, will use the same device to capture peer-to-peer data and thereby subsidize her pension. What is the trajectory for society?

If the digital realm plays an ever-increasing role in developing and transmitting social norms, we must consider the many values at stake [11]. The older as well as the younger generations may perceive this as an opportunity to become as fearless as the desert explorers who traversed unknown lands. Only today, the point-of-view #explorers demonstrate their mean feats to a global theatre, using social media to broadcast in real time to their legions of online followers [12]. Suppose lifelogs lead to an environment in which we are fact-checked against the recorded medium [13]. Your interpretation of an event could be refuted; you would be told, “You were never into jazz,” or “That wasn’t such a good time, was it?” [14]. Can synthesized data capture the spirit behind the poetic license one takes when telling a story to achieve a desired effect? Can an algorithm discern the varied contexts within which our behaviors were recorded? We often have different personae that change over time, and it is often necessary for an individual to have one persona for work, one for family, and yet another for the Internet [15].

The human experience cannot be captured and interpreted easily; we are highly complex and astoundingly dynamic beings. This is the great stuff of humanity. We are ever-changing. We embellish to provoke laughter. We create what didn’t exist. We make stuff up. We make up rules so we can play games. We make up institutions so we can coordinate problem-solving collective action. Data, especially when so abundant and extensive, can easily undermine such invention. One only needs to ask siblings to describe their shared childhood experiences: comparing their stories can leave one exceedingly perplexed. Each sibling has created his or her own narrative; each may have invented a slightly different back story. Can algorithms, or a fallible human who chooses how to interpret the synthesized data, appropriately process reality [16]? Moreover, if it can be said that “history belongs to the winners,” then while there exists lifelogging asymmetry (some do, some don’t), perhaps it could also be said, equally cynically, that “personal history belongs to the lifeloggers”? Issues arise with this historical record because a lifelogger could easily omit, misrepresent, or even distort and deceive; he or she can willfully create inaccurate narratives that could go unchecked and unchallenged.

The most delightful aspect of visiting that grandma wasn’t the humor derived from her comedic idiosyncrasies, but rather her rich storytelling. Her husband often had a different take on events. She admitted she chose to forget the painful aspects of the Depression era. Yet she wonderfully verbalized a narrative of her life and times from her perspective. Just as forgetting is an essential part of the human psyche (without which we cannot begin to function), so is the ability to create narratives [17]. In the event of universal lifelogging, could this be lost and replaced with machine-perfected recollections? Without narrative, we have no mythos, and so we have no more explanation for the human condition than logos. We would have much less ability to create a shared sense of community through a commonly told story, and may be stuck instead with a single unalterable persona deterministically crushed by the unbearable tyranny of mundane facts captured through devices.

There are negative ramifications when we allow technology to commodify social concepts and diminish social relations like privacy, friendship, and loyalty. The resultant consequences, such as the fragmentation of communities, the dissolution of trust, and the diminution of our ability to solve collective action problems, are serious enough. However, invasive technologies such as wearables and bearables are doing something else: they could deprive us of the ability to create personae and narratives [18]. We may discover that this obsessive literalism is an axe being taken to the very essence of what it means to be human.

Authors

Christine Perakslis is associate professor in the Alan Shawn Feinstein Graduate School at Johnson & Wales University, Providence, RI, U.S.A.

Jeremy Pitt is a Reader in Intelligent Systems in the Department of Electrical & Electronic Engineering at Imperial College London, U.K.

Katina Michael is an associate professor in the School of Information Systems and Technology at the University of Wollongong, NSW, Australia.


IEEE Technology and Society Magazine, June 2014 issue