Smart AI, Private Lives: Can We Have Both?

June 10, 2025

Every time we go online, we are asked to make a small but increasingly frustrating decision: “Do you accept cookies?” It pops up on nearly every website we visit, and though most users click “accept” just to get to the content, these seemingly minor prompts are part of a much larger debate—one about digital privacy and how much of ourselves we unknowingly give away in exchange for convenience.

Debates about privacy rights are long-standing. In the modern age, Warren and Brandeis [1] were probably the first, in 1890, to argue for legal recognition of the individual's "right to be let alone." In that pretelecommunications era, their concern was invasive journalism. With the advent of the Internet and social media, the debate about privacy rights entered a new phase. Tim Cook, CEO of Apple, said in 2018, "I consider data protection to be one of the important issues of the 21st century. We need a bill of rights for the digital world!" Conversely, Eric Schmidt, then CEO of Google, said in 2010, "If there's something you don't want anyone to know, maybe you shouldn't do it anyway." These opposing views reflect the tension between innovation and individual privacy, an issue that has only grown more complex.

Today, that debate has entered a new frontier with the exponentially growing capabilities of large language models and generative AI. Users have little to no visibility into whether their personal information has been used to train these models, or into how their inputs across different interactions may be collected, aggregated, or interpreted.

That is why our society is participating in the IEEE Digital Privacy Initiative, a program under IEEE Future Directions [2]. This initiative “focuses on a user-centric perspective—looking at the digital privacy needs of the individuals rather than the security of data, products, and organizations—such as providing individuals with user-enabled privacy controls and promoting privacy at the outset of product and service lifecycles.” It envisions a future in which the capability exists to enable any individual around the world to privately maintain presence, data, identity, and dignity online. To help achieve this vision, the initiative seeks the following goals.

  • Bring the voice of technologists to the digital privacy conversation, incorporating a holistic approach to address privacy that also includes economic, legal, and social perspectives.
  • Facilitate cross-disciplinary collaboration to advance research, promote standardization and best practices, and create tools and capabilities to support the privacy needs of individuals.
  • Coordinate efforts across and beyond IEEE, with a multicultural lens, among groups working on different dimensions of digital privacy.

I invite all members to take an active role in this important initiative and to help shape a future in which our digital privacy is assured as a fundamental right and our data are used only with our consent and under our control.

Author Information

Murty Polavarapu is the 2025–2026 president of the IEEE Society on Social Implications of Technology. He is also the president of Space Electronics Solutions, Oakton, VA 22124 USA, and the managing director of Virginia Microelectronics Consortium, Fairfax, VA 22030 USA. Email: murtyp@ieee.org.
