Emulated Empathy and Ethics in Action: Developing the P7014 Standard

March 28, 2021

Introduction

In 2019, IEEE Working Group P7014 began efforts to develop a ‘Standard for Ethical Considerations in Emulated Empathy in Autonomous and Intelligent Systems’, one of fifteen IEEE P7000 ethical initiatives. Ambitious in scope, the mandate of P7014 is to define a model for ethical considerations and practices in the design, creation and use of “empathic technology”, a catch-all term for systems that have the capacity to identify, quantify and interact with affective experiences, such as emotions and cognitive states. During the past two years, our group has worked to create a rubric for ethical safeguards when designing and testing applications that function in close relation to intimate information about people.

This article explains the work, its importance, our achievements to date and ongoing efforts, but our focus here is mainly on the process. As many readers involved in standards development will know, it is a messy one. With collaboration dispersed across geographies and time zones, even baseline definitions of terminology pose challenges, yet offer scope for fertile discussion.

Importance of the standard

Despite misgivings about modern methods, the premise that quantifying technologies will increasingly engage with qualitative dimensions of human life is not likely to disappear. We believe that systems able to simulate empathy are likely to become a feature of everyday life. This makes the P7014 standard important because, for industry, it provides normative requirements and tools by which developers may assess their own product development. Policymakers are already trying to come to grips with these technologies, so we hope that ethical suggestions within the P7014 standard may inform policy development and governance. Foremost are the efforts to reconcile two distinct objectives: clarifying ethical issues to help protect all stakeholders (including those who typically do not have a voice in how technologies are built and regulated), and establishing frameworks and processes for ethical design and assurance.

Progress and achievements to date and ongoing work

As of December 17, 2020, we had 28 voting members and 85 interested participants. To address emulated empathy, we have Sub-Groups exploring: Use Cases; Ethical Considerations of Technology (covering the main principles of Rights); Stakeholders & Definitions; the creation of an Ethical Explainability Toolkit; and Outreach.

Getting an ethical standard “right”

All involved with standards work will appreciate the role of debate, exchange, ability to explain, and (foremost) listening. For a standard that addresses emerging technology, possible uses, and ethics, getting this “right” is far from straightforward.

The group generally agrees that it is not our place to prescribe a specific set of ethics that are exclusively “correct” in every set of circumstances. However, we also recognise that if our ethical guidance is too open to interpretation, the standard may lack teeth in the real world or be passed off as “ethical whitewashing”. This balancing act has led us to write the standard with a blend of normative guidance and tools for self-assessment.

Lessons learned so far

We offer a few insights and lessons gleaned along the way. Interestingly, not all are focused on the subject matter, emulated empathy.

  1. Listen! When working as part of an interdisciplinary team, listening is productive; it is an asset essential to a robust standard.
  2. Common language. Internally we are developing a dictionary of technical terms needed to collaborate amongst the group. When deployed externally, this common terminology guide should prove useful well beyond the group.
  3. A multi-domain approach. Non-technical people now more fully appreciate the practicalities and everyday realities of working with empathic technologies. Conversely, people with specialties other than ethics have grown to appreciate the diversity of approaches to ethics and their role in the design process.
  4. Specialization and diversity within sub-efforts. Individuals may gravitate toward work that aligns with their ethical or technical specialty. Their leadership on these sub-efforts is important, but we have found that dissenting voices bring value and new perspectives.
  5. Diversity of voices. Being proactively inclusive is “a work in progress” for P7014 and we suspect the same is true for other P7000 Working Groups. This involves active analysis regarding questions of race, ethnicity and identity, but also anticipating physical and cognitive impairments. As an absolute minimum, we need to be transparent about our make-up, so others can judge the norms likely to inform our perspectives. We support and relish participation in any IEEE initiative to increase diversity of representation.
  6. Ideological viewpoints. With members coming from many backgrounds and regions of the world, we each bring not only views, but fundamentally different perspectives. Some are political, some are epistemological. Some members hold that universal ethical answers may be found if only we could clarify questions about the nature of problems. Others hold that ethics are contextual, and understood as local or social matters. These differing viewpoints have not been reconciled, but the fertile discussion identifies and tackles implicit and explicit assumptions made by each perspective.
  7. Use-cases help. Like AI, emulated empathy is a broad concept, involving a range of technologies underpinning systems that sense, learn and interact in novel ways. Given the breadth of applications, an inductive approach beginning with specific use cases can generate valuable insights in relation to who is affected (positively or negatively), how and where.
  8. Target audience and stakeholders. Standards have historically existed to provide agreed principles of practice to select audiences, for example, hardware or software developers. With social or ethical concerns there is a wider set of stakeholders to consider: not just designers but users, policymakers, society at large, and possibly marginalised groups. More broadly, the socio-technological paradigm bequeathed to future generations needs examination.
  9. Testing. Even a list of prescriptions has to be tested for applicability and usability. We have proposed a project to IEEE to convene specific focused user groups (for example, skilled AI/ML developers and data scientists) to fine-tune the standard and improve acceptance once published.
  10. Dissemination of Standard. We learned from other Working Groups that having a plan for disseminating the standard is as important as creating the standard itself. Hence our Outreach subgroup was created, to explain our work, identify collaborative links with other groups inside and outside of IEEE, and engage wider and more diverse input.

Finally, although challenging, the work is highly rewarding. Every member of the Working Group feels that they are contributing to something significant. We hope you will look into our work.

For more information about ongoing activities, please visit https://sagroups.ieee.org/7014/

This document solely represents the views of the P7014 WG and does not necessarily represent a position of either the IEEE or the IEEE Standards Association.

Contributors (alphabetically)

  • Aladdin Ayesh
  • Ken Bell
  • Karen Bennet
  • Ben Bland (Chair, P7014 Working Group)
  • Lubna Dajani
  • Angelo Ferraro
  • Khan Iftekharuddin
  • Faiz Ikramulla
  • Mathana
  • Andrew McStay
  • Tem Olugbade
  • Randy Soper

Image: P7014 Working Group Logo (multicoloured circle represents the diverse voices of the working group, with a grey colour for our new AI partners, who are entering the human circle)

Published by Miriam Cunningham