We can perhaps accept Weil’s starting premise that obligations are fundamental concepts, on which basis we can also reasonably accept her assertion that “obligations … all stem, without exception, from the vital needs of the human being.”
With techno-feudalism, what is paid and permitted in a digital space is decided by asymmetric power, not mutual consent. Political approval of funding priorities, education programs, and regulation consistently favors Big Tech.
Collective behavior that supports sustainability often entails individual inconvenience: many small acts of environmental kindness require thought, effort, or consideration.
Mega-platforms have, with the addition of one extra ingredient, combined lock-in and loyalty to create a grave, and perhaps unexpected, consequence. The extra ingredient is psychology; and the unexpected consequence is what might be called digital dependence.
Democracy itself is under (yet another) threat from deepfake videos, which could be used to create compromising material of politicians. For example, the digitally altered video of U.S. House of Representatives Speaker Nancy Pelosi appearing to slur drunkenly was viewed millions of times and tweeted by the U.S. President; although the video is demonstrably a hoax, the tweet remains undeleted.
Technology for Big Data, and its brother-in-arms Machine Learning, is at the root of, and the facilitator of, deliberate string-pulling design choices. These design choices are made by people, and so the question actually becomes: do the design choices enabled by Big Data and Machine Learning have the capacity to alter, diminish, and perhaps actually “destroy” what it means to be fundamentally human?
Why would anyone own, or even need to own, a driverless car if they do not get to drive it? This in turn raises the question: if the central tenet of the personal car ownership model (i.e., ownership) no longer holds, then what is the replacement business model?
Politics has always required dialogue, deliberation, negotiation, and compromise. But now the facts themselves are in dispute.
The level of state surveillance practiced in the supposedly illiberal regimes prior to the fall of the Berlin Wall is now routinely accepted, from the widespread use of CCTV to online tracking and data recording. Therefore, instead of labeling a display of genuine concern as “paranoia,” perhaps a lack of genuine concern should instead be stigmatized as a “disease” or a “disorder”: complacentosis, complyaphilia, complicivitis, ignorrhea.
If digital technologies can be designed to maintain or sustain values, then the same technologies can be designed to manipulate or undermine those same values.
The aim of this special issue is to evaluate the social impact and implications of new and emerging technologies on governance, politics, public administration, and policy-making, and to assess the future prospects of digital democracy and its transformative potential for increasing public engagement, community empowerment, and social entrepreneurship.
It can be glibly asserted that technology makes accomplishing various activities easier. But it is not always obvious for whom it makes which activities easier. For example, the Internet has had a profound impact on academic publishing: the transition from printed paper to digital format has ostensibly made it “easier” for academics to put their work in the public domain and, if they can actually get attention in a sound-bite-distracted social-media world, reach a wider audience than ever before.
It is necessary to start somehow, even if you’ve got no map, no knowledge of the destination, and no milometer to measure the distance that has been covered. This can sometimes be the essence of collective action for addressing wicked problems. Sometimes human behavior defies top-down direction and even nudge, and begins instead with a single initiating event and snowballs from there.