The Currency of the Attentional Economy: The Uses and Abuses of Attention in Our World


We live in an attention economy [1]–[3]. People vie for the attention of allies, customers, followers, and mates [4], with advertisements, memes, and websites acting as proxies [5]. This truth defines the digital age. As Jeff Bezos noted in 1997, “capturing mindshare on the Internet is extremely difficult … Attention is the scarce commodity of the late 20th century” [6]. Like any resource, attention can be monetized. For instance, a single product placement can net a social influencer with 100,000 followers $5,000 [8], [9]. Similarly, ads can be promoted on search engines and social media, with Google estimating that companies receive $2 for every dollar spent promoting an ad [7].


Despite the precious nature of attention, few consider the composition of the coin of the realm. While there are many currencies (divided, switching, or vigilance [10]–[12]), I will consider attention capture and its bidirectional relationship with memory.

Attention and Memory: Silent Partners

Attentional capture is automatic. Attention can be captured with “clickbait” headlines [13] and smartphone notifications [14]. Misinformation and disinformation often grab our attention due to their relative novelty (i.e., a distinctiveness effect), and we can spread them without fully considering the content or source [28].

By capturing attention, we create a memory trace that influences our representation of the world. Like pennies in a jar, these memory traces can accumulate unnoticed over time. They create pathways that reinforce stimulus–response connections with each successive encounter [15], [16]. Crucially, these memory traces guide attention automatically to relevant features of the environment [17]–[19] and free us to perform other tasks [20].


Memory retrieval is also automatic [21]. We never choose to forget our keys, our password, or the answer to a test. Rather, that information is simply not retrieved at the right place and time. We might assume that our most cherished attitudes, beliefs, and values are qualitatively different from everyday memories. To our memory systems, they are not. For instance, failing to attend to appropriate social norms can reduce prosocial behavior [22], while reminders of those norms increase it [23], [24]. And much like any resource, we can deplete our finite supply of attention [25], leading to reductions in prosocial behavior [26].

A partial explanation for these findings is that we tend to be “cognitive misers,” hoarding attentional resources until we are motivated to perform a task. Distributing cognitive processing to mediating technologies facilitates this hoarding: we offload seemingly trivial tasks such as remembering important dates, performing calculations, and retrieving information.

Dark Patterns of Design

Reliance on technology can be costly. The rapid spread of misinformation and disinformation in our time demonstrates this quite clearly. Much of this problem can be attributed to basic features of memory. When presented with information, we tend to store the content and its source as separate representations, leaving us prone to failures of source monitoring [27]. When we fail to attend to the source, disinformation is stored alongside information, making it difficult to distinguish the good penny from the bad penny. And bad pennies always turn up. This can lead to “sleeper effects,” such that information that is neglected now can influence our later judgments [32]. Even when a computer is the source, studies have demonstrated that we can confuse its productions with our own [29]. Search engine use can create a similar effect: when search engines are available, we tend to recall less information ourselves while still assuming that we possess this knowledge, that is, the “Google Effect” [30], [31].


These insights can be used for good or ill. Designs can intentionally exploit our limited attentional resources. “Dark patterns” reflect the ethical disaffordances of technology [33], [34], nudging us to perform actions that benefit others, potentially at our own expense. They include hidden costs, deceptive marketing, questionable testimonials, and the use of privacy policies and end-user license agreements as “click wrap” that gives organizations unrestricted use of our information [35], [36]. This is particularly true of social media applications that are designed to promote personal disclosure. Behind the benign interface, users’ attention and preferences are being carefully mapped. With little warning, data subjects can also quickly lose control over their identities through sudden changes in policy, as a recent TikTok decision to collect face and voice data from its users illustrates [37].

The Cambridge Analytica scandal revealed the potential scope of such dark patterns. Using psychographic techniques that can accurately predict user attributes [38], Cambridge Analytica attempted to target and nudge users to influence their voting behavior [39]. Yet, despite public uproar over perceived violations of privacy and trust, little changed: one survey suggested that most users (58%) did not change their behavior on Facebook. Of those who did, only 9.6% deactivated their accounts, while 19% reported making fewer posts and 24.8% indicated that they were more careful [40]. Of course, nudging is a cheap trick, working only when attention is limited [41]. Users might discount these attempts as exerting little influence on their behavior. For instance, a recent study revealed that two networks of over 2,000 bots, which produced over 65,000 tweets, had little influence [42]. We must nevertheless consider sleeper effects and how they can affect us in the long run, if only subtly.


Rather than focusing only on the tail of the coin, we must also consider its head. By understanding the possible failures of attention, designers can help users invest their attention wisely [43]. Outside of virtual environments, collision and proximity warning systems can help direct a driver’s attention at critical decision points [44], [45]. Inside digital environments, behavioral nudging can be used to encourage citizens to vote [46], [47] and to warn users before they share sensitive content or share articles they have not read [48], [49]. Twitter has suggested that these practices increase article reading by 40% [50]. Large-scale efforts are also necessary. Apple’s recent App Tracking Transparency policy was directed toward making users more aware of when their information is being shared, a move that proved costly for social media platforms [51].

Changes in policies and design nudges are only the first step. Users must be empowered through education and knowledge translation. They must understand the economics of attention to halt the inflationary processes that define big data.

Author Information

Jordan Schoenherr is an Assistant Professor with the Department of Psychology, Concordia University, Montreal, QC, Canada, and an Adjunct Research Professor with the Department of Psychology and a member of the Institute for Data Science at Carleton University, Ottawa, ON, Canada. Email: jordan.schoenherr@carleton.ca.
