With more than 50% of the global population living in non-democratic states, and given the disturbing trend toward authoritarianism among populist leaders in supposedly democratic countries, it is easy to imagine dystopian scenarios about the destructive potential of digitalization and AI for the future of freedom, privacy, and human rights. But AI and digital innovations could also be enablers of a Renewed Humanism in the Digital Age.
While many of us hear about the latest and greatest breakthrough in AI technology, what we hear less about is its environmental impact. In fact, much of AI’s recent progress has required ever-increasing amounts of data and computing power. We believe that tracking and communicating the environmental impact of ML should be a key part of the research and development process.
In 2019, millions of young people took to the streets demanding “systems change not climate change.” Their call echoes the words of the Intergovernmental Panel on Climate Change (IPCC) Special Report, which stated that “Limiting global warming to 1.5 °C would require rapid, far-reaching and unprecedented changes in all aspects of society.”
Disruptions can have positive as well as negative impacts on natural and human systems. Among the most fundamental disruptions to global society over the last century is the rise of big data, artificial intelligence (AI), and other digital technologies. These digital technologies have created new opportunities to understand and manage global systemic risks.
Collective behavior that supports sustainability often entails individual inconvenience: many small acts of environmental kindness require thought, effort, or consideration.
Two major forces are shaping the future of human civilization: anthropogenic climate change and the digital revolution. The changing climate is driving systemic shifts that threaten to destabilize the health and wellbeing of humankind and the natural systems on which it depends.
In this issue we exposed modes of technowashing, a convoluted and less perceptible form of glossing over reality in the digital realm. We addressed the way marketers, while feigning constructs such as trust and loyalty, conceal processes that create digital dependence. We tackled the airbrushed realities of technosocial inequalities.
What was the place of acoustics within the discipline of physics in the period, and how should the transformations of the field of acoustics be located within the transformations of the field of physics more generally?
If only we knew more about the world that we live in. If only we understood that all things are interconnected. If only we could learn to value ethics above rank profiteering. We would make better decisions for ourselves and for our society. We would make good moral decisions. But we now know that access to factual knowledge does not necessarily improve the world. We are living that reality today.
Security threats to smart devices come not only from hacking but also from a lack of control over data access. The separation of security from convenience makes it difficult for the average user to determine how secure a smart device is.
It is important to discuss both the potential and risks of machine learning (ML) and to inspire practitioners to use ML for beneficial objectives.
Julie Wosk’s My Fair Ladies is an engaging historical account of female automata, with sidelights on dolls, disembodied electronic female voices, masks, make-up, and the sexual and gender implications of efforts to create artificial humans.
Playing a gender role in society means engaging with a complex system, and the list of necessary conditions for success in STEM is arguably longer for girls than for boys.
Mega-platforms have, with the addition of one extra ingredient, combined lock-in and loyalty to create a grave, and perhaps unexpected, consequence. The extra ingredient is psychology; and the unexpected consequence is what might be called digital dependence.
Social trust is violated when surveillance technologies fuel clandestine campaigns to construct oppressive systems used to control (and punish) members of society.
The submission deadline for the 3rd IEEE Norbert Wiener in the 21st Century conference has been extended to March 20, 2020.
The primary driver for agetech investment appears to be growing fears around caring for aging populations. But initiatives tend to skate over some of the inherent challenges.
Democracy itself is under (yet another) threat from deepfake videos … deepfake videos could be used to create compromising material of politicians: for example, the digitally altered video of U.S. House of Representatives Speaker Nancy Pelosi appearing to slur drunkenly was viewed millions of times and tweeted by the U.S. President, and although the video is demonstrably a hoax, the tweet remains undeleted.
Contemporary and emerging digital technologies are leading us to question the ways in which humans interact with machines and with complex socio-technical systems. The new dynamics of technology and human interaction will inevitably exert pressure on existing ethical frameworks and regulatory bodies.
At the heart of sci-fi, somehow, is the return of power to the people, who almost always regain control before things get completely out of hand. But we learn that our freedom comes at a cost. The reassuring aspect of Maynard's work is that justice prevails, despite the ominous lurking of some technological beast waiting to be unleashed.