How can local (grassroots) contributive justice be used as a driving force for the common good?

All the deep philosophical questions, the joke begins, were asked by the classical Greeks; everything since then, the punchline finishes, has been footnotes and comments in the margins.
The term “modern indentured servitude” did not originate with this workshop, but we hope that this special issue has highlighted many of the different shapes and processes it can take, some more insidious than others. We would like to think that, if these papers could talk, they would get up one after the other and say, “No, I’m Spartacus.” In these dark times, each of us needs the courage to be Spartacus.
It would be good if, whenever a client connected to an HTTP server, or indeed any app connected to a central server, the server responded with a corresponding acknowledgment of data, along the lines of: “Before we begin our session this morning, I would like to acknowledge the traditional owner of the data which is being transferred, and to respect rights to privacy, identity, location, attention, and personhood.”
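By way of illustration only, here is a minimal sketch of how such an acknowledgment might be carried in practice: a hypothetical X-Data-Acknowledgment response header (not part of any HTTP standard, and named here purely for illustration) attached to every response by a small Python server.

```python
# Minimal sketch only: a hypothetical "X-Data-Acknowledgment" header,
# not part of any HTTP standard, attached to every response.
from http.server import BaseHTTPRequestHandler, HTTPServer

ACKNOWLEDGMENT = (
    "Before we begin our session, I would like to acknowledge the traditional "
    "owner of the data being transferred, and to respect rights to privacy, "
    "identity, location, attention, and personhood."
)

class AcknowledgingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"Session begins.\n"
        self.send_response(200)
        # The acknowledgment travels with the response, alongside the data itself.
        self.send_header("X-Data-Acknowledgment", ACKNOWLEDGMENT)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), AcknowledgingHandler).serve_forever()
```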
Functional democratic governance has five fundamental preconditions: civic dignity, confluent values, epistemic diversity, accessible education, and legitimate consent.
If it were possible to formulate laws involving vague predicates and adjudicate them, what would be the implications of such minimalist formulations for soft laws and even for “hard” laws? Three questions arise: 1) does possibility imply desirability? 2) does possibility imply infallibility? and 3) does possibility imply accountability? The answer advanced here, to all three, is “no.”
Systems can be designed using methodologies like value-sensitive design, and operationalized, to produce socio-technical solutions that support or complement policies addressing environmental sustainability, social justice, or public health. Such systems are then deployed to promote the public interest, or to enable users to act, individually and at scale, in the public interest, toward individual and communal empowerment.
Just as the “autonomous” in lethal autonomous weapons allows the military to dissemble over responsibility for their effects, so civilian companies leverage “AI” to exert control without responsibility.
And so we arrive at “trustworthy AI” because, of course, we are building systems that people should trust and if they don’t it’s their fault, so how can we make them do that, right? Or, we’ve built this amazing “AI” system that can drive your car for you but don’t blame us when it crashes because you should have been paying attention. Or, we built it, sure, but then it learned stuff and it’s not under our control anymore—the world is a complex place.
Understanding the societal trajectory induced by AI, and anticipating its directions so that we might apply it to achieve equity, is a sociological, ethical, legal, cultural, generational, educational, and political problem.
We can perhaps accept Weil’s starting premise that obligations are fundamental concepts, on which basis we can also reasonably accept her assertion that “obligations … all stem, without exception, from the vital needs of the human being.”
With techno-feudalism, what is paid and permitted in a digital space is decided by asymmetric power, not mutual consent. Political approval of funding priorities, education programs, and regulation consistently favors Big Tech.
Collective behavior that supports sustainability often entails individual inconvenience: many small acts of environmental kindness require thought, effort, or consideration.
Mega-platforms have, with the addition of one extra ingredient, combined lock-in and loyalty to create a grave, and perhaps unexpected, consequence. The extra ingredient is psychology; and the unexpected consequence is what might be called digital dependence.
Democracy itself is under (yet another) threat from deepfake videos … deepfake videos could be used to create compromising material about politicians: for example, the digitally altered video of U.S. House of Representatives Speaker Nancy Pelosi appearing to slur drunkenly was viewed millions of times and tweeted by the U.S. President, and although the video is demonstrably a hoax, the tweet remains undeleted.
Technology for Big Data, and its brother-in-arms Machine Learning, is at the root of, and the facilitator of, deliberate string-pulling design choices. These design choices are made by people, and so the question actually becomes: do the design choices enabled by Big Data and Machine Learning have the capacity to alter, diminish, and perhaps actually “destroy” what it means to be fundamentally human?
Why would anyone own, or even need to own, a driverless car if they do not get to drive it? This in turn raises the question: if the central tenet of the personal car-ownership model (i.e., ownership) no longer holds, then what is the replacement business model?
Politics has always required dialogue, deliberation, negotiation, and compromise. But now the facts themselves are in dispute.
The level of state surveillance practiced in the supposedly illiberal regimes prior to the fall of the Berlin Wall is now routinely accepted, from the widespread use of CCTV to online tracking and data recording. Therefore, instead of labeling a display of genuine concern as “paranoia,” perhaps a lack of genuine concern should instead be stigmatized as a “disease” or a “disorder”: complacentosis, complyaphilia, complicivitis, ignorrhea.
If digital technologies can be designed to maintain or sustain values, then the same technologies can be designed to manipulate or undermine those same values.
The aim of this special issue is to evaluate the social impact and implications of new and emerging technologies for governance, politics, public administration, and policy-making, and to assess the future prospects of digital democracy and its transformative potential for increasing public engagement, community empowerment, and social entrepreneurship.