The Digital Transformation and Modern Indentured Servitude

It has become increasingly conventional to open a workshop or conference with an Acknowledgment of Country, an appropriate gesture of recognition for the traditional owners of the land hosting the event. During and after the pandemic, with the prevalence of virtual events, it is quite likely that someone, somewhere, will be connecting from land that no longer “belongs” (in a purely base, legalistic sense) to its traditional keepers.

The Acknowledgment of Country therefore serves a significant purpose: it raises awareness of, and draws attention to, a deep injustice that persists to this day, in the hope that it will one day be redressed [1]. An honest appraisal of history, and an understanding of where wealth was extracted and privilege established, is an important prerequisite to building a more equitable world, as opposed to fabricating a “war on woke” against those who have the temerity to challenge an orthodox, white-washed account of colonial exploitation, oppression, or atrocities (see [2], [3]).

On a different scale, and if it were at all possible, it would be good if, whenever a client connected to an HTTP server, or indeed whenever any app connected to a central server, the server responded with a corresponding acknowledgment of data, along the lines of: “Before we begin our session this morning, I would like to acknowledge the traditional owner of the data being transferred, and to respect rights to privacy, identity, location, attention, and personhood.”
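As a purely illustrative sketch of this idea, the short Python server below attaches such an acknowledgment to every response. The header name X-Data-Acknowledgment and the wording are invented for this example; no such standard header or protocol exists.

```python
# A minimal, hypothetical sketch: an HTTP server that prefixes every response
# with an "acknowledgment of data". The X-Data-Acknowledgment header is an
# invented name used only for illustration, not a real or standard header.
from http.server import BaseHTTPRequestHandler, HTTPServer

ACKNOWLEDGMENT = (
    "Before we begin our session, I would like to acknowledge the traditional "
    "owner of the data being transferred, and to respect rights to privacy, "
    "identity, location, attention, and personhood."
)

class AcknowledgingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        # Carry the acknowledgment as a (hypothetical) response header.
        self.send_header("X-Data-Acknowledgment", ACKNOWLEDGMENT)
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.end_headers()
        # Echo the acknowledgment in the response body as well.
        self.wfile.write(ACKNOWLEDGMENT.encode("utf-8"))

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), AcknowledgingHandler).serve_forever()
```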

The analogy being pursued here is that to some indigenous peoples, the idea of “someone owning land” is akin to asking “what is north of the north pole,” or “what is outside the universe”: conceptually, it just does not make sense. In a less possessive frame of reference, rather than land belonging to people, the people belonged to the land. The idea that someone could erect a fence and declare “this is mine,” or even worse “this is the Queen’s” (“royalty” being an even more citizenship-demeaning social construct than land ownership), that everything which grew or lived within that fence was also theirs and that someone else could go as far as the fence and no further, was essentially nonsensical.

This inversion of a natural relationship is informative if, instead of “land,” one thinks of “data,” especially in the context of human beings making the transition from data processor (through sensory perception) to both data processor and data generator. And “data generator” does not just mean a work of the mind, a product of conscious creativity that actually confers rights in the form of intellectual property. It also means: every action that someone takes; where a user’s attention is given (e.g., through eye-gaze tracking); where someone is going and with whom they have been in contact (Covid-19 tracing apps); their every interaction on social media (e.g., sentiment analysis of Twitter feeds); their observed presence (from CCTV to online monitoring software) or inferred identity (e.g., just from their open windows or tabs); and almost every bodily function, awake or asleep, for which a sensor has been developed to measure and track it, voluntarily or otherwise [4], [5].

The consequences of this inversion, as we experience the transformative process of the digital society, are twofold. First, there is the well-established digital divide between those who are connected and those who are not. This inequity was already manifest according to age, employment, geography, and educational attainment, but also according to assets, where the Matthew effect (the rich get richer) is very much in evidence. Those with assets, such as roof-mounted domestic wind turbines and electric vehicles, could be paid to charge their vehicles during storms, while those without, suffering from energy poverty, have to choose between heating their homes and heating their food. This divide was also made starkly apparent among U.K. children during pandemic lockdowns: those who had access to personal laptops and unlimited broadband connectivity could maintain a level of education; those who had to share equipment and/or were on “pay as you go” data contracts were significantly disadvantaged.

The second consequence is that even overcoming the digital divide exposes what might be called the “digitally divided”: the extraction and appropriation of data, as with land before it, creates a two-tier system, between those who have power in the Digital Society, and those who do not [6]. One example has already been given here: some have the power to put a digital fence around data and say that this is theirs, despite this data being generated by someone else, or rather by many someone elses. It may be that the data is exchanged for a location-based service of some kind, but the exchange is hardly symmetric: the value of the aggregated data is far greater than the sum of the value of services provided [7].

This is not an equitable exchange; it is simply digital extraction, and there is no social justice or civic dignity in it. But we can also point to seven more examples of the asymmetry of power in the Digital Society, relating to platformization, community marginalization, addiction by design, gatekeeping of knowledge, the iron triangle of BigTech–parliament–academia, differentiated pricing, and under-employment. Each of these examples will now be briefly discussed in turn.

Platformization describes how the network effect at the application layer of the Internet has resulted in what is called the platform economy [8]. In this economic model, work is done at the edge but revenue accrues in the center. This effectively gives power to the platform owner, in particular power over the means of social coordination. The problem is exacerbated if the platform owner is a transnational organization not subject to national taxation; it means that some percentage (perhaps as much as 20%) of every transaction is extracted from a local economy and aggregated elsewhere. The neo-colonialism [9] afforded by digital platforms can result in the long-term suppression of economic activity, and the displacement of that activity as a platform for qualitative values. (A valid response to neo-colonialism might be called neo-Marxism: if Marx rejected the private ownership of the means of production, then neo-Marxism rejects the private ownership of the means of social coordination. Public ownership of services considered essential to collective well-being or functioning is one way of restoring an element of social justice in the Digital Society.)

Marginalization of communities is as possible in the digital world (or worlds) as it is in the analog world, including marginalization of the very idea of community and collective action itself [10]. Marginalization comes in many forms. One form is people not being empowered to take control of local situations. All digital systems are eventually embedded in the real world, yet all too often smart city projects are not designed around lived experience but driven by a technological and financial imperative in which inhabitants are fragmented and reduced to locked-in potential revenue streams. A parallel has been drawn between Weil’s [11] identification of how people’s roots in community, school, work, and family withered in the interwar years 1919–1939, and how digital technology and political malfeasance have served to de-root and marginalize people in the 21st century [12].

Another form of marginalization is that of the disadvantaged, and one such group is the young [13]. From the diminution of their education in critical thinking, through the archiving of their youthful indiscretions, to digital dependence induced by persuasive and “pre-suasive” technology, they are rendered powerless on two fronts: privacy and attention. As a data generator, a person should have a right not to be observed; a person should also have the right not to be remembered. Social media archives have become the equivalent of dustbins for muck-raking journalists looking for compromising material that can threaten anyone’s career prospects.

As a data processor, a person should have a right not to be interrupted, and the right to be able to focus their energies on socially productive purposes [14]. However, as the third example of asymmetric power in the Digital Society, we observe an increasing use of psychology, neuroscience, and neurobiology in addiction by design. This occurs in many ways: online gambling designed to keep people “in the zone” [15]; computer gaming that adopts many of the techniques common to gambling, for example through the use of loot boxes; and interface design based on behavioral models intended to capture and hold attention [14], [16].

The fourth example of power asymmetry is the gatekeeping of knowledge, which has shifted away from traditional gatekeepers such as the universities and the mainstream press. While this has had some beneficial, generative effects, such as the creation of Wikipedia, there may be some degenerative consequences too. One of these is the dual ownership of information sources and information reporting. It has been shown how vested interests have turned the scientific method, founded on doubt, against itself [17]. The basic process goes as follows. The first step is to set up an Astroturf Institute, called something like the Foundation for Global Public Health, say. The second step is, under its auspices, to publish a white paper claiming, to take some examples completely at random, that hydroxychloroquine or ivermectin are treatments for some virus or other. This is never published in a peer-reviewed journal, but it is reported in the traditional press, its online presence, and social media (with a deferential genuflection to the institute’s grandiose title, a courtesy not extended to “real” academics of a contrary opinion). Then a proper scientist has to debunk the white paper, recommending vaccination instead. This work is published in a proper journal but, unlike the original paper, it is not reported in online media. It turns out that the owner of the Astroturf Institute and the owner of the social media platform are one and the same.

The fifth example is the iron triangle of mutual support between BigTech, congress/parliament, and browbeaten academia [18]. Congress passes the tax laws and underfunds academia; BigTech therefore has the money, so it funds the political campaigns of those who make such laws and the research programs of the less stroppy academics. These academics are then successful, but they will not hold the politicians to account, and they might exercise undue influence on appointment and promotion panels, all the while churning out equally unstroppy PhD students who still have not been taught critical thinking or sufficient ethics: smart enough to operate the equipment, but not so smart that they will start asking any difficult questions.

The sixth example could be the misuse of differentiated pricing, if it is used to exclude people, or social segments, from certain services. For example, in the U.K., the “public” schools are indeed open to anyone: anyone who can afford the exorbitant fees. Cash is a great equalizer; cryptocurrencies might not be. Differentiated pricing might therefore simply provide another form of selectivity: for example, there is evidence of the U.K. government using nudge techniques to bring about behavioral change through covert, targeted advertising campaigns on social media [19]. There are serious ethical issues with such an approach to “behaviorist governance,” not least that by avoiding all forms of public scrutiny and debate about its policies, a government is less likely to be held accountable. Behaviorist governance, in conjunction with performative governance [10], opens up a disturbing new relationship and dynamic between citizens and the state.

The seventh example is not unemployment, the scourge of the 1980s, but under-employment in the gig economy, which aligns conveniently (for some) with the platform economy. The precarious nature of self-employment in this economic model, combined with the desolation of the community (in this case, the workplace community otherwise known as a “union”), results in a fearful, compliant, exploitable, and expendable workforce. Workers on the margins like this may find themselves alternating between “gigs” or piecemeal work and benefits. If benefits start to be paid in a fiat cryptocurrency, then there is the potential to control where and on what the money is spent: this form of control has been called techno-feudalism [18]. But it does not end with the “precariat.” If Graeber’s [20] analysis is right and many “white collar” jobs are also essentially pointless, then techno-feudalism underpinned by universal basic income, in a world of ever-increasing automation and artificial intelligence [21], simply turns the state into the overseer of wealth transfer, channeling money that it “prints” (or, in the case of a fiat currency, issues through a central bank) from the people, through these pointless activities, and into the accounts of the platform owners.

These seven examples all have one common feature: an asymmetry of power in the Digital Society that replicates, in digital rights, the same asymmetries experienced in the analog world with respect to the land rights of indigenous peoples. Indeed, the asymmetry is already so deep, entrenched, and nearly irrevocable that it might be called out for what it really is: modern indentured servitude. But once we know something for what it is, perhaps something can be done about it.
