Public Interest Technology (PIT) is defined as “technology practitioners who focus on social justice, the common good, and/or the public…

Albright’s book focuses on a group of Americans who live a life of digital hyper-connectivity. Mostly under age 50, this group includes what are called Generation X (born between 1965 and 1979), Millennials (born between 1980 and 1999), and their offspring, some of whom, as we have seen, are still infants.
Contemporary circumstances in the United States, in broader politics, in recent protest movements against police brutality, and in the demographics of engineering education, have prompted us to look for new ways to bring theory on gender, race, and class to audiences who would not normally consider it their usual reading.
Technological determinism is a myth; there are always underlying economic motivations for the emergence of new technologies. The idea that technology leads development is not necessarily true; consider AI, for example. It has been a topic of interest to researchers for decades, but only recently has the funding caught up, matching the motivation and enabling the development of AI-oriented technologies to really take off.
Why are all of these nations and their assorted consortia heading to Mars? Are they truly exploring to improve the human condition, to expand and share scientific knowledge?
Ethical diversity refers to “diverse beliefs … as to what are the most ethically appropriate or inappropriate courses of actions,” and takes into account the different values and beliefs people hold [2]. This diversity is, and has always been, a source of confusion and conflict, from the personal to the international. The answer, however, is to have forums to debate and discuss the ethical choices embedded in everyday life, not algorithms that render invisible the choices being made.
With techno-feudalism, what is paid and what is permitted in a digital space are decided by asymmetric power, not mutual consent. Political approval for funding priorities, education programs, and regulation alike favors Big Tech.
Will We Make Our Numbers? The year 2020 has a majority of the planet asking the simple question: “How do we stay alive?” Competition is not working for the long-term sustainability of human and environmental well-being.
With more than 50% of the global population living in non-democratic states, and keeping in mind the disturbing trend toward authoritarianism among populist leaders in supposedly democratic countries, it is easy to imagine dystopian scenarios about the destructive potential of digitalization and AI for the future of freedom, privacy, and human rights. But AI and digital innovations could also be enablers of a Renewed Humanism in the Digital Age.
In 2019, millions of young people took to the streets demanding “systems change not climate change.” Their call echoes the words of the Intergovernmental Panel on Climate Change (IPCC) Special Report, which stated that “Limiting global warming to 1.5 °C would require rapid, far-reaching and unprecedented changes in all aspects of society.”
Some collective behavior that supports sustainability entails individual inconvenience: many small acts of environmental kindness require thought, effort, or consideration.
It is important to discuss both the potential and the risks of machine learning (ML), and to inspire practitioners to use ML for beneficial objectives.
Julie Wosk’s My Fair Ladies is an engaging historical account of female automata, with sidelights on dolls, disembodied electronic female voices, masks, make-up, and the sexual and gender implications of efforts to create artificial humans.
Playing a gender role in society is engagement in a complex system, and the list of necessary conditions for success in STEM is arguably longer for girls than for boys.
Mega-platforms have, with the addition of one extra ingredient, combined lock-in and loyalty to create a grave, and perhaps unexpected, consequence. The extra ingredient is psychology, and the unexpected consequence is what might be called digital dependence.
The primary driver for agetech investment appears to be growing fears around caring for aging populations. But initiatives tend to skate over some of the inherent challenges.
Democracy itself is under (yet another) threat from deepfake videos … deepfake videos could be used to create compromising material about politicians: for example, the digitally altered video of U.S. House of Representatives Speaker Nancy Pelosi appearing to slur drunkenly was viewed millions of times and tweeted by the U.S. President, and although the video is demonstrably a hoax, the tweet remains undeleted.
Somehow, at the heart of sci-fi is the return of power to the people, who almost always regain control before things get completely out of hand. But we learn that our freedom comes at a cost. The reassuring aspect of Maynard’s work is that justice prevails, despite the ominous lurking of some technological beast waiting to be unleashed.
Social media have been seen to accelerate the spread of negative content such as disinformation and hate speech, often unleashing a reckless herd mentality within networks, further aggravated by malicious entities using bots for amplification. So far, the response to this emerging global crisis has centered on social media platform companies making reactive moves.
Technology for Big Data, and its brother-in-arms Machine Learning, is at the root of, and is the facilitator of, deliberate string-pulling design choices. These design choices are made by people, and so the question actually becomes: do the design choices enabled by Big Data and Machine Learning have the capacity to alter, diminish, and perhaps actually “destroy” what it means to be fundamentally human?