Social Robots: The Friend of the Future or Mechanical Mistake?

By Jordan Miller on September 10th, 2022

Isolation can manifest anywhere a person chooses to age, and it is detrimental to mental health; its health impacts have been compared to those of smoking cigarettes and obesity [1]. While interaction with other human beings is preferable, one-on-one interaction is not always possible. Older adults who live far from their families, individuals inside assisted living facilities during the COVID-19 pandemic, and individuals whose social circles have shrunk through retirement or the loss of friends may all feel isolated [2]. Cognitive decline can cause isolation in older adults as well [3]. Social robotics is poised to address this isolation, providing companionship for these individuals and augmenting human interaction when none is available.

Social robotics is poised to impact society by addressing isolation and providing companionship, augmenting human interaction when none is available.

An exciting application of this technology is assisting older adults living with dementia and their informal caregivers. The robot could provide conversational therapy to the person living with dementia, a tool recommended by the Alzheimer’s Association [4]. The robot would engage the person in meaningful conversation and help relieve the pressure to remember [5]. While the person living with dementia interacts with the robot, their informal caregiver could use the time to address their own isolation or mental health. Informal caregivers often struggle mentally and physically under the demands of caregiving, so this would give them an outlet to take care of themselves [6]. Additionally, the robot could serve as a therapeutic tool for the informal caregiver when needed.

Assisting people living with dementia and their informal caregivers is the application our team aims to address. First, we wanted to understand how healthy older adults would accept this technology. We did this through a 2×2 Wizard of Oz experiment in which we investigated how interactive the robot needed to be to make a conversation feel more natural and reciprocal.

While we are still exploring this question, there are some exciting preliminary results, including that healthy older adults will accept a social robot and discuss very personal things with it. Conversation topics included death, careers, and travel over the years. Many participants reported feeling as comfortable as they would talking to a friend, and for some conversation topics, more comfortable talking to the robot than to their friends. Many participants disclosed controversial personal opinions to the robot, and some reported that they told the robot things they had not told their friends.

Healthy older adults will accept a social robot and discuss very personal things with it.

These results point to a promising future for healthy older adults and provide valuable insights for the development of social robots. All participants reported that they would continue to interact with a social robot even though they do not suffer from isolation.

While these preliminary results show promise, they also raise concerns for the future of social robots if precautions and guidelines are not put into place now. One concern is confidentiality. It is easy to imagine a scenario where two people regularly use the same social robot. If the robot repeats confidential, personal information to the other individual, it could be damaging to the person who disclosed it.

Developers should go out of their way to make social robots cost-effective, or access to the technology may be determined by socio-economic status. If we are not careful, social robots could become an elite technology assisting only those who can afford it. In such a scenario, older adults might face tradeoffs between needs such as mental health and physical safety.

It is important to emphasize that this technology should not replace human interaction. It is easy to imagine a scenario where an adult child places a parent living with dementia in a care facility with a social robot and believes it is acceptable to stop interacting with them. Precautions will need to be put in place in care environments to avoid this.

 It is paramount to bring social scientists, medical professionals, and end-users together to discuss safety features and address ethical concerns about this technology.

Social robots should have facial recognition capabilities that support per-user profiles, so that sensitive information is not repeated to the wrong person. The robot should remember conversations from previous sessions to build trust and friendship with the user. These profiles should be accessible only to the person they belong to, much as a personal streaming account remembers preferences but requires a password to access them. The only time the robot should report information to the authorities is if the user discloses an intent to harm themselves or others. Such a standard would ensure that the user’s privacy is maintained during and after their interactions.
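To make the profile idea concrete, here is a minimal sketch assuming a Python implementation; the names (UserProfile, ProfileStore, handle_utterance) and the keyword-based safety check are hypothetical, and a real system would compare face embeddings rather than strings. It illustrates one way conversation memory could be unlocked only for the profile whose face is recognized, with a narrow carve-out for safety reports.

```python
# Minimal sketch, not a description of any deployed system: per-user
# conversation memory that is unlocked only when the recognized face
# matches the profile owner.
from dataclasses import dataclass, field


@dataclass
class UserProfile:
    user_id: str
    face_signature: str                      # stand-in for a real face embedding
    conversation_log: list[str] = field(default_factory=list)


class ProfileStore:
    """Holds each user's private conversation history."""

    def __init__(self) -> None:
        self._profiles: dict[str, UserProfile] = {}

    def enroll(self, user_id: str, face_signature: str) -> None:
        self._profiles[user_id] = UserProfile(user_id, face_signature)

    def unlock(self, face_signature: str) -> UserProfile | None:
        # Only the profile whose enrolled face matches is returned;
        # an unrecognized face gets no access to stored conversations.
        for profile in self._profiles.values():
            if profile.face_signature == face_signature:
                return profile
        return None


SAFETY_PHRASES = ("hurt myself", "hurt someone")   # hypothetical trigger list


def handle_utterance(store: ProfileStore, face_signature: str, utterance: str) -> str:
    profile = store.unlock(face_signature)
    if profile is None:
        return "I don't recognize you, so I can't share any stored conversations."
    profile.conversation_log.append(utterance)      # remembered only in this profile
    if any(phrase in utterance.lower() for phrase in SAFETY_PHRASES):
        return "I'm worried about your safety, so I will alert someone who can help."
    return f"Thank you for sharing. We've now talked {len(profile.conversation_log)} times."


if __name__ == "__main__":
    store = ProfileStore()
    store.enroll("alice", face_signature="face-A")
    store.enroll("bob", face_signature="face-B")
    print(handle_utterance(store, "face-A", "I never told my friends about my first job."))
    # Bob's session cannot surface Alice's history:
    print(handle_utterance(store, "face-B", "What did Alice tell you yesterday?"))
```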

The networks that social robots connect to need stronger protection from cybercrime such as hacking, especially if facial recognition or computer vision is enabled on the robot. If a cyber-attack is carried out against a social robot inside a home, it may leak information that compromises the user. A cybercriminal could record the user without their knowledge, or gain access to information that makes the user vulnerable.
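One small, hedged illustration of this kind of protection: assuming the robot’s software stack is in Python and the widely used cryptography package is available, conversation transcripts could be encrypted before being stored or transmitted, so a breach exposes only ciphertext. The function names below are our own, not an existing robot API, and a real deployment would keep the key in a hardware-backed keystore.

```python
# Minimal sketch, not from the article: encrypting a conversation transcript
# so that a breach of the robot's storage or home network does not expose
# plaintext. Uses the `cryptography` package's Fernet recipe.
from cryptography.fernet import Fernet


def new_storage_key() -> bytes:
    # In practice this key would live in a hardware-backed keystore,
    # never alongside the encrypted data.
    return Fernet.generate_key()


def store_transcript(key: bytes, transcript: str) -> bytes:
    """Encrypt a transcript before writing it to disk or sending it upstream."""
    return Fernet(key).encrypt(transcript.encode("utf-8"))


def load_transcript(key: bytes, token: bytes) -> str:
    """Decrypt a transcript for an authorized session."""
    return Fernet(key).decrypt(token).decode("utf-8")


if __name__ == "__main__":
    key = new_storage_key()
    token = store_transcript(key, "User mentioned a private family matter today.")
    print(token[:16], b"...")            # ciphertext only; useless without the key
    print(load_transcript(key, token))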

Ethical consideration also needs to be given to how much information loved ones and professionals receive from the robot. If a user is diagnosed with dementia, loved ones may want access to the conversations to monitor how the person is progressing cognitively. They may also want the option to “drop in” on their loved one to check on them from a distance. Developers should consider the ethical risks and the potential harm to the person living with dementia that these features could cause. It is necessary to understand how and when these features are used; otherwise, they may result in exploitation of individuals and make them feel they are not in control of their lives.

If developers do not take these precautions and develop this technology without thinking about how it could impact society, it could become one of society’s biggest failures. It is paramount to bring social scientists, medical professionals, and end-users together to discuss safety features and address ethical concerns about this technology. If developers wait until social robots become socially accepted, it may be too late to address these concerns. This technology could benefit society, or, in a worst-case scenario, it could further marginalize the groups who need it most.

Author Information

Jordan Miller is currently pursuing a PhD with Arizona State University, Tempe, AZ, USA, investigating how to make social robots converse more naturally. Email: jlmill41@asu.edu.

________

To view the original version of this article, including references, click HERE.

_________