Disruptions can have positive as well as negative impacts on natural and human systems. Among the most fundamental disruptions to global society over the last century is the rise of big data, artificial intelligence (AI), and other digital technologies. These digital technologies have created new opportunities to understand and manage global systemic risks.
Collective behavior that supports sustainability often entails individual inconvenience: many small acts of environmental kindness require thought, effort, or consideration.
Two major forces are shaping the future of human civilization: anthropogenic climate change and the digital revolution. The changing climate is driving systemic shifts that threaten to destabilize the health and wellbeing of humankind and the natural systems on which it depends.
What was the place of acoustics within the discipline of physics in the period, and how should the transformations of the field of acoustics be located within the transformations of the field of physics more generally?
Security threats to smart devices stem not only from hacking but also from a lack of user control over data access. The separation of security from convenience makes it difficult for the average user to determine how secure a smart device is.
It is important to discuss both the potential and risks of machine learning (ML) and to inspire practitioners to use ML for beneficial objectives.
Julie Wosk’s My Fair Ladies is an engaging historical account of female automata, with sidelights on dolls, disembodied electronic female voices, masks, make-up, and the sexual and gender implications of efforts to create artificial humans.
Playing a gender role in a society means engaging with a complex system, and the list of necessary conditions for success in STEM is arguably longer for girls than for boys.
Mega-platforms have, with the addition of one extra ingredient, combined lock-in and loyalty to create a grave, and perhaps unexpected, consequence. The extra ingredient is psychology, and the unexpected consequence is what might be called digital dependence.
Social trust is violated when surveillance technologies fuel clandestine campaigns to construct oppressive systems used to control (and punish) members of society.
The submission deadline for the 3rd IEEE Norbert Wiener in the 21st Century conference has been extended to March 20, 2020.
Democracy itself is under (yet another) threat from deepfake videos … deepfake videos could be used to create compromising material of politicians: for example, the digitally altered video of U.S. House of Representatives Speaker Nancy Pelosi appearing to slur drunkenly was viewed millions of times and tweeted by the U.S. President; although the video is demonstrably a hoax, the tweet remains undeleted.
Contemporary and emerging digital technologies are leading us to question the ways in which humans interact with machines and with complex socio-technical systems. The new dynamics of technology and human interaction will inevitably exert pressure on existing ethical frameworks and regulatory bodies.
At the heart of sci-fi is returning power to the people, who almost always regain control before things get completely out of hand. But we learn that this freedom comes at a cost. The reassuring aspect of Maynard’s work is that justice prevails, despite the ominous lurking of some technological beast waiting to be unleashed.
Social media have been seen to accelerate the spread of negative content such as disinformation and hate speech, often unleashing a reckless herd mentality within networks, further aggravated by malicious entities using bots for amplification. So far, the response to this emerging global crisis has centered on social media platform companies making reactive moves.
As technology pervades all aspects of our existence, and Artificial Intelligence and machine learning systems become commonplace, a new era of human-computer interaction is emerging, one that will direct our focus beyond traditional approaches toward more intricate interactions with computer-based systems.
It is important to define autonomy in technology, which is not the same as automation. Automated systems operate by clear repeatable rules based on unambiguous sensed data. Autonomous systems take in data about the unstructured world around them, process that data to generate information, and generate alternatives and make decisions in the face of uncertainty.
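The distinction above can be sketched in code. This is a minimal, hypothetical thermostat example (the scenario and function names are illustrative assumptions, not from the source): the automated system applies one fixed rule to unambiguous data, while the autonomous one turns ambiguous readings into information, generates alternatives, and decides under uncertainty.

```python
# Automated system: a fixed, repeatable rule over unambiguous sensed data.
def automated_thermostat(temp_c: float) -> str:
    # The rule never varies: same input, same output.
    return "heat_on" if temp_c < 20.0 else "heat_off"

# Autonomous system: processes ambiguous data into information,
# generates alternatives, and decides in the face of uncertainty.
def autonomous_thermostat(noisy_readings: list) -> str:
    # Turn raw, ambiguous data into information: an estimate and its spread.
    estimate = sum(noisy_readings) / len(noisy_readings)
    spread = max(noisy_readings) - min(noisy_readings)
    # Generate alternatives and score each against the uncertain estimate
    # (the payoff values here are illustrative assumptions).
    alternatives = {
        "heat_on": 20.0 - estimate,         # attractive when likely cold
        "heat_off": estimate - 20.0,        # attractive when likely warm
        "request_more_data": spread - 1.0,  # attractive when sensors disagree
    }
    # Decide: choose the alternative with the highest score.
    return max(alternatives, key=alternatives.get)

print(automated_thermostat(18.0))                 # heat_on
print(autonomous_thermostat([18.5, 21.0, 19.2]))  # sensors disagree widely
```

With disagreeing sensors, the autonomous sketch can elect to gather more data rather than act, a choice no fixed rule set would generate on its own.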
Mann and Toles crystallize for us climate change denialism, principally in the United States, over the last generation. The core of this denial results from the confluence of several trends deeply embedded in American culture.
Technology for Big Data, and its brother-in-arms Machine Learning, is at the root of, and is the facilitator of, deliberate string-pulling design choices. These design choices are made by people, and so the question becomes: do the design choices enabled by Big Data and Machine Learning have the capacity to alter, diminish, and perhaps actually “destroy” what it means to be fundamentally human?
Our authors identified risks that can result in diminished humanity, if technology is designed or delivered irresponsibly. Our community addressed much of what it means to be human, in the context of complex and converging processes.