“It Sets Boundaries Making Your Life Personal and More Comfortable”: Understanding Young People’s Privacy Needs and Concerns

March 17th, 2023, in Articles, Human Impacts, Magazine Articles, Privacy & Security, Social Implications of Technology, Societal Impact

Rys Farthing, Genevieve Smith-Nunes, Teki Akuetteh, Kadian Camacho, Katja Koren Ošljak, and Jun Zhao


Children and young people are prolific digital users, making up a third of the world’s online users [1] and engaging with the digital world in different and distinct ways. However, their unique understandings and perspectives are often not considered in debates and discussions around privacy and security [2]. This article outlines a youth-centric notion of digital privacy, along with guiding privacy principles developed by young people from Antigua and Barbuda, Australia, Ghana, and Slovenia.

Young people’s privacy concerns are real, but underexamined. While there is extensive research exploring children and young people’s potentially harmful experiences within the digital context, such as dangerous content, grooming, and abuse, or risky online behaviors ranging from cyberbullying to excessive “screentime,” there has been a far more limited focus on children and young people’s privacy concerns [3]. Previous research has often identified that young people hold unique ideas about privacy [4], [5], seeing privacy as a right that they feel is often challenged or threatened [6], and that there is a gap in research on young people’s perspectives in commercial contexts, a gap that matters more and more as the digital environment becomes increasingly commercialized [7].

Young people’s privacy concerns are real, but underexamined.

Understanding children and young people’s privacy concerns regarding commercial digital products (be they online platforms or personal digital gadgets), and what they want to be done about them, is an important task. Children and young people are unique users of the digital world, and there are at least three reasons their privacy concerns may be distinct from adults’. First, the ways that young people are datafied, and the nature of their data footprints, are distinct. Today’s young people are the first to be datafied (to have their lives turned into extracted data points) from even before birth [8]. They carry a whole lifetime’s worth of data (and privacy concerns), shared with a fluid mix of state bodies and commercial companies that young people may be unaware of. Many of these data holders may not be committed to young people’s best interests, which leads to our second distinction: this data can be used to shape young people’s lives and life chances in ways that current adults have not experienced [9]. For example, when high school leaving exams were canceled because of COVID-19 in the United Kingdom, young people’s postcode data (a marker of socioeconomic status) was used to downgrade the university entrance scores of young people from lower income neighborhoods [10]. In this instance the decision was reversed after a public backlash, but the algorithmic governance of future generations from before birth remains possible, if not inevitable. Lastly, the scale and pervasiveness of this datafication have gone hand in hand with the normalization of surveillance [11]. Young people have grown up in a different age of “dataism,” a new social conviction that the flow of data to corporations is normal [11]. Given this, young people may have different perspectives about privacy and data security that are worthy of consideration.

If technologists and policymakers aim to develop solutions that serve all members of the community, understanding how children and young people think about privacy, and what they think should happen with their personal data, is therefore critical. Ensuring effective privacy protections and information security protocols that meet the unique needs and expectations of young people is important for their flourishing and safety, but also to realize a more inclusive democracy for all of us.

This article aims to fill some of this gap, by providing an analysis of privacy, especially privacy in a commercial digital context, from young people’s perspectives around the world. It also makes a critical contribution to relevant knowledge by developing a set of guiding policy principles about privacy from young people’s perspectives that may be important to both technologists and policymakers working on privacy protection and information security.


Using mixed methods, we engaged young people aged 10–18 years old in four locations: 1) Accra in Ghana; 2) multiple places across Slovenia; 3) Saint John’s in Antigua and Barbuda; and 4) Sydney in Australia. These sites were chosen to: 1) allow some international comparisons between young people’s perspectives emerging from Western, Educated, Industrialized, Rich, Democratic (WEIRD) contexts and non-WEIRD contexts; and 2) engage with emerging policy and technological debates about privacy and trustworthy tech that were happening in each location. Impact was prioritized in this research, with the aim of generating insightful, youth-led discussions capable of catalyzing change in each location.

Below, we discuss the common and divergent features of each site, to help understand the nature of the data captured.

Common research methods

This research began from an understanding that the perspectives of young people are valuable but underrecognized. Given this, we used participatory methods, allowing young people to shape the research agenda. We also employed mixed methods, using many youth-friendly activities, like hypothetical and “real” scenarios, which have been used successfully in research on young people and privacy [4], [5].

The ways that young people are datafied and the nature of their data footprint is distinct from adults.

Qualitatively, a series of deliberative workshops were held, with activities designed to culminate in the creation of a set of principles that participants felt should govern the use of their private information. These workshops included four key activities as follows.

  1. Workshopping and discussing their data footprint, including what sorts of personal data about young people, and how much, was collected in their current digital environment. Activities included brainstorming, drawing data heat maps, small group discussions, and often “immediate technical investigations,” such as logging into an app and seeing if it was tracking their location.
  2. Discussions and deliberations about how this data was currently processed, including debates over whether these processes and processors were trustworthy. This often involved “role-playing” and hypothetical scenarios, such as pretending to be the CEO of a new tech startup.
  3. Exploring what it would take to make these processes and processors trustworthy, including developing an extensive list of “dos and don’ts” with their data. These deliberations were informed by suggestions young people had made in the quantitative research and/or at other research sites, and were undertaken using post-it note brainstorming and small group discussions. These “dos and don’ts” were then further refined into a list of principles.
  4. The final process involved discussions, decisions, and actions to connect young research participants with decision-makers in the space, either policymakers or technologists, shifting into the action-research space.

This was supplemented by quantitative polling and surveys.

Antigua and Barbuda

Hybrid workshops were held over two days with 42 young people aged 13–15 in St. John’s. Young people were recruited from schools across Antigua to attend in person, and a school group from Barbuda attended online. An additional 55 young people, aged 13–18, took part in a survey circulated throughout schools.


Australia

Hybrid workshops and interviews were held with 12 young people from New South Wales across a full day, with follow-up online meetings. Participants were 15–17 years old and recruited from a youth work setting. In addition, 506 16- and 17-year-olds were polled by a commercial polling company.


Ghana

Workshops were held in Accra with 21 young people aged 13–17. Participants were recruited from nine different localities and schools, reflecting different socioeconomic situations. The Ghanaian survey of young people had not been completed by the time of writing.


Slovenia

Shorter, one-hour, in-person workshops were held across Slovenia, attended by in excess of 15,000 young people, mostly aged between 11 and 15, with some a little older or younger. These workshops coincided with the release of the popular Slovenian-language teen movie Gaja’s World 2. Given their pace and scale, these workshops followed a compressed method, without role playing and with decisions around actions being researcher led rather than youth led. This was supplemented by a survey of 948 young people aged 14–18, led by the University of Ljubljana.

Privacy and young people

Many common themes emerged across all the research sites, and these are discussed below.

Commercial considerations were part of the criteria young people were using in deciding which flows they felt were (or not) appropriate.

What privacy means to young people

The definitions offered highlighted four key aspects and considerations underpinning the idea of privacy for the young people at the workshops. For these young people, privacy:

  • Involved being able to conceal personal information (e.g., “to protect and conceal our information”). Personal information included identifying data about yourself, especially contact details and location details, as well as information about “naughty things you’ve done.”
  • Was a protective factor against “others” who may want to interfere (e.g., “helps to create boundaries to protect us from interference in our lives”). Discussions suggested that the “others” from whom information may need to be concealed included states, hackers, friends, parents, and companies. Who your data needed to be protected from mattered, including preventing interference from commercial “others.”
  • Created a sense or feeling of safety, security, and healthy boundaries (e.g., “privacy may make you feel a sense of safety and comfortability knowing your info is safe with you”).
  • Was considered as a legitimate expectation or as a right (e.g., it was described as “a right to protect and or conceal our personal information”).

These definitions align with Westin’s [12] conceptualization of privacy as “control over” information. But Nissenbaum’s [13] alternative, contextual definition of privacy (privacy as the appropriate flow of information, as assessed by each individual based on their preferences and perceptions) also slowly emerged.

As discussions around trust and what young people wanted progressed, the understanding that privacy was contextual surfaced. As discussed below, this was particularly visible when young people talked about wanting better things, or to receive more benefits, as a result of the flow of their data. These discussions highlighted that some flows of information were deemed more desirable than others, depending on context. Critically, they also highlighted that commercial considerations were part of the criteria young people were using in deciding which flows they felt were (or not) appropriate.

Do young people trust technology with their privacy?

Despite privacy being described as a “right” and a positive concept, the workshops uncovered deep misgivings held by some of the young people about their experiences of privacy in the digital world. As one young man put it, “the only way I make myself feel private is by just outright lying to myself and telling myself that the information that I get stolen online won’t affect me that much.”

The polling and survey data collected reflected this, highlighting a significant “trust gap” when it came to how young people felt their personal information was handled. We asked young people if they trusted that their personal information would be handled carefully and found that, very roughly, a third trusted that their personal information was being handled with care, a third did not, and the rest were not sure whether they distrusted the way their data was handled (see Figure 1).

Figure 1. Young people’s sense of trust or distrust about whether their personal information was “handled carefully,” by location.

We asked some young people to help us interpret these findings, and they suggested that the normalization and inevitability of privacy intrusions in their digital worlds may help explain this answer. As one young woman explained, “if I was given that question, I’d say ‘Sweet Jesus, like no, I don’t think I trust them with my privacy.’ But at the same time, you know, I’m on every social media that there is so. And I think a large part of that is not caring.” Another young man outlined, “because you rely on it. So it’s not even about whether or not you can, you don’t really have the choice to trust it or not. You just have to use it because everyone else is on it. It isn’t about whether or not you believe in your privacy.” The (broadly) even split in responses may be young people’s way of reconciling the privacy paradox [14] in this survey; as one young Antiguan put it simply, “I trust them because they are things I use on daily.”

What young people want to improve their privacy

Privacy intrusions in the digital world may have felt inevitable or normal, but they were not necessarily what the young participants wanted. Young people were able to give us clear ideas about changes they wanted to see and list specific suggestions as well as guiding principles to improve their sense of privacy in the digital world.

The workshops uncovered deep misgivings held by some of the young people about their experiences of privacy in the digital world.

Across the lists of “dos and don’ts” created by young people, and their guiding policy principles, ten core asks were frequently repeated across sites (along with three additional Slovenian asks).

  1. Control: The most frequent principle described as important was that young people should have control over their data. There was a sense that young people’s information was collected, used, and sold in ways that were currently beyond their control and that young people were entitled to better. Specific suggestions ranged from giving young people “the right to delete our data” to “allowing young people or teenagers to make their own decisions with the available options.”
  2. Transparency and meaningful consent: There was a strong sense that young people were often “pushed or tricked” into personal information collection, from opaque privacy policies to smartphones secretly “eavesdropping” on conversations. The young people in Australia wanted cookies renamed “data grabbers,” and the young people in Slovenia wanted “translations of apps and programs” into Slovenian. There was an overarching belief that young people deserved more honest and clear information about how their personal information would be used, especially from commercial platforms.
  3. Restricting or ending targeted advertising: There was consensus that the current state of play, where young people’s information is used to target them with ads by default, did not meet expectations. Young people in Slovenia, Antigua and Barbuda, and Ghana called outright for a ban on advertising, wanting to “cancel ads,” saying “online ads should be banned,” and requesting companies “stop sending those ads.” In Australia, the young people instead called for control: young people should be able to opt in to advertising and, if they did, should then also have a choice about whether their data is used to target ads at them. Notably, they called for this rather than a full ban because they wanted “realistic” asks; “fundamentally, young people do not want their data used to sell them things,” they said.
  4. Data minimization: A range of suggestions were discussed that cluster around the concept of “data minimization.” This included suggestions to “avoid asking for data,” “avoid accepting cookies,” “to not collect data at all,” and that companies “don’t access our location every thirty seconds.”
  5. Data retention: Young people in each site seemed to agree that young people’s information should not be kept for too long, with requests to “delete young people’s data when it’s not needed” especially after an app is uninstalled.
  6. Excessive sharing and selling: There was a feeling that young people’s personal information was shared with too many people, including families and too many apps, and sold to too many companies. Young people wanted data to be more “concealed.” Requests along the lines of “Don’t sell our data to others” were common, as were suggestions to limit “Family Link” (an app that shares your location and other data) until certain ages or to ban it altogether. Information about young people’s locations was considered especially sensitive to oversharing, and suggestions to “limit who can see your location” were frequent.
  7. Adequate help and support: In each site, young people felt they should have better access to support if something went wrong, from a “Help portal in case of personal profile hacks” to calls for services to “respond quickly to reports and violations of guidelines.”
  8. Security: Security came up as a concern at each site, but at some more than others. Young people in Antigua and Barbuda and Slovenia were very concerned about security, while those in Ghana and Australia were less so. Generally, though, young people felt their data deserved higher levels of security and protection from bad actors than it currently received. The bad actors invoking these security concerns were largely malicious hackers; there was less concern about security as a way of preventing inappropriate data flows to governments or companies. This again speaks to the importance of context in deciding which data flows were deemed concerning and which were not.
  9. A need for more education for young people about their privacy: At each site, young people talked about the need for more education about privacy, often for younger children. Suggestions ranged from “talks in schools” to a specialized school subject on the “safe use of the Internet” to more education “about privacy, their rights and risks” in general.
  10. Doing things for young people’s benefit: In Antigua and Barbuda, there was a cluster of specific suggestions to ensure that young people benefited from the use of their personal information: “If you take my data, at least make the app better” and “protect our data for better uses.” Likewise, in Slovenia, there was support for making all games free and able to operate without data. The capacity of the digital world to provide good services that young people want to use was not overlooked. This speaks to their idea of data and its flows as part of a contextual relationship in the commercial world, or, more precisely, highlights that they want a “better deal” from this relationship.
  11. Limitations and restricted use: Given the vast size of the Slovenian workshops, three unique Slovenian suggestions were generated. The first was a moderate focus on limitations (such as calls for age restrictions or limits on device use) and reduced use (such as days off and time limits).
  12. Content moderation: Some Slovenian young people’s suggestions also called for better or safer content moderation, from banning videos that encourage kids to do dangerous challenges, to deleting negative comments.
  13. Parental supervision: Some Slovenian young people also called for increased parental supervision or oversight, although it should be noted that others called for privacy from their parents.

We asked young people if they felt that their list of principles aligned with what could broadly be considered “children’s best interests,” as the Convention on the Rights of the Child describes it [15]. The reaction was broadly yes, but with qualifications. In Antigua and Barbuda, young people gave a nonchalant “sure” with a shrug of the shoulders, because it seemed a bit too common sense. In Ghana, this was one of the most popular suggestions, and they also included suggestions around “doing no harm” with data. In Australia, they decided that best interests should be the first principle, but with a caveat: “young people need to decide what young people’s best interests are.”

Value of engaging young people

Realizing young people’s best interests requires their participation in decision-making processes. And of all the brilliant suggestions developed by young people, one stood out: “so that children make decisions instead of experts and politicians.” This research actively tried to engage with technologists and decision-makers to achieve this.

In Antigua and Barbuda, the principles developed by young people were turned into a “magazine” for decision-makers, and a workshop was held with teachers to share ideas for curriculum reforms. In Australia, the principles were turned into a submission to an inquiry, and participants gave evidence to Senators. In Ghana, young people identified multiple stakeholders, from ministers to business titans, to whom they are writing. In Slovenia, throughout 2023, the young people’s principles are being used to inform an exhibition about young people’s privacy, with the information commissioner invited to attend.

The reaction from decision-makers was enthusiastic, but the young people’s input sat outside the change-making cycle. For example, teachers welcomed knowing what young people wanted to learn, but were not in a position to change their core curriculum. Senators and regulators thanked us for the information, but were not in an active cycle of rule-making to deliver the changes the young people requested.

But this does not mean these engagements have no value, just that we may not be able to fully track the discrete impact of these young people’s contributions. For example, in Antigua and Barbuda, staff at the Department for Education began to review progress against their 2013 information and communication technology in education policy and made an “ironclad commitment to ensure that the recommendations from this conversation are realized” in a foreword to the magazine. In Australia, the Senate inquiry may feed into an ongoing review of privacy law. And in Slovenia, the information commissioner may engage with the Europe-wide process on children’s data across 2023.

There is an old English adage that suggests that people in positions of power “ignore the youth at their peril.” Ensuring young people’s perspectives are understood and heard is vital to ensuring their right to participate is realized. But it may also be essential to ensure that data privacy and information security practices are improved in ways that meet whole-of-community expectations. If these decision-makers heed these young people’s feedback, it is possible that positive changes could be made for everyone, including young people.

Young people have a clear idea of what their privacy is and what it feels like. Privacy is a legitimate expectation, or right, to conceal information from potentially intrusive “others,” and it creates a feeling of safety and security. Young people did not trust that their privacy was well respected and noted multiple concerns about the way their personal information flowed in the digital world. These inappropriate practices and undesirable flows were normalized, but not accepted; young people were able to identify ten principles they believed should be enacted to improve their privacy, from more control and transparency to greater support and education. Many of these principles take aim at the commercialization of their personal information.

Involving young people in discussions about privacy and information security is important if we are aiming to develop solutions that serve all members of the community. The small-scale action-research components outlined in this article suggest that there is an appetite among decision makers to engage with young people. Whether this engagement is used, in turn, to develop better privacy and information security policies and practices remains to be seen. It rests on the willingness of decision makers to share their power and integrate young people’s thoughts and perspectives into their work.


The authors thank those who supported the young people’s involvement; the Y in Australia; Child Online Africa and Awo Aidam Amenyah in Ghana; VSAK Institute in Slovenia; and Department for Education and DadliBots in Antigua and Barbuda. This work was supported by the Internet Society Foundation.

Author Information

Rys Farthing is a research fellow at the Centre for the Digital Child, Deakin University, Melbourne, VIC 3125, Australia. Her research interests include children’s rights, technology policy, and privacy. Email: rys.farthing@deakin.edu.au.

Teki Akuetteh was the first executive director of Ghana’s Data Protection Commission. She is with the Africa Digital Rights Hub, Accra, Ghana. Her research interests include data protection and privacy, technology policy, and AU policy.
Katja Koren Ošljak is pursuing a PhD with the Faculty of Social Studies, University of Ljubljana, 1000 Ljubljana, Slovenia. Her research interests include children’s rights in the digital environment, curriculum and media education, and EU policy.
Genevieve Smith-Nunes is pursuing a PhD with the Department of Education, University of Cambridge, CB2 1TN Cambridge, U.K. Her research interests include data ethics, children and young people, and computing education.
Kadian Camacho is a research associate with the Department of Education, University of the West Indies, Saint John’s, Antigua and Barbuda. Her research interests include young people, education, and CARICOM policy.
Jun Zhao is a research lecturer with the Department of Computer Science, University of Oxford, OX1 2JD Oxford, U.K. Her research interests include trust and technology, privacy, and children and technology. See http://www.me-ai.org.

