Second International Workshop on Artificial Intelligence for Equity (AI4Eq) Against Modern Indentured Servitude
PIT acknowledges that technological potential can be harnessed to satisfy the needs of civil society. In other words, technology can be seen as a public good that benefits all, through an open democratic system of governance, with open data initiatives, open technologies, and open systems/ecosystems designed for the collective good, as defined by the communities that will use them.
As the COVID-19 pandemic shows, crises can catalyze socio-technical changes at a speed and scale otherwise thought impossible. Crises expose the fragility and resilience of our socio-technical systems – from healthcare to financial markets, internet connectivity, and local communities.
Critical thinking is a mainstream part of some educational traditions, but is it universally valued? Only some truths have an objective basis and many others depend on the eye of the beholder. No real society values everyone equally.
IEEE SSIT: Who we are, what we care about, and our history within the IEEE organization.
Systems can be designed using methodologies like value-sensitive design, and operationalized to produce socio-technical solutions that support or complement policies addressing environmental sustainability, social justice, or public health. Such systems are then deployed to promote the public interest, or to enable users to act in the public interest, individually and at scale, toward individual and communal empowerment.
The nuclear anxiety of the Cold War now seems quaint. While speculative writers of the late 20th and early 21st centuries have largely relegated nukes to the past, the situation at San Onofre reminds us of our sin of assuming the future would take care of itself. The U.S. Nuclear Regulatory Commission enabled this consensual hallucination. Did it take climate change into consideration?
If you are an undergraduate student interested in examining the social implications of technology, submit your work for a chance to be published and to win cash prizes!
Reflective thinking allows humans to examine the past with intentionality, learn from what happened, and adapt accordingly. We explore thoughts, feelings, and actions; draw out insights; and enhance our awareness.
The fiercest public health crisis in a century has elicited cooperative courage and sacrifice across the globe. At the same time, the COVID-19 pandemic is producing severe social, economic, political, and ethical divides, within and between nations. It is reshaping how we engage with each other and how we see the world around us. It urges us to think more deeply on many challenging issues—some of which can perhaps offer opportunities if we handle them well. The transcripts that follow speak to the potency and promise of dialogue. They record two in a continuing series of “COVID-19 In Conversations” hosted by Oxford Prospects and Global Development Institute.
Digital discrimination is becoming a serious problem, as more and more decisions are delegated to systems increasingly based on artificial intelligence techniques such as machine learning. Although a significant amount of research has been undertaken from different disciplinary angles to understand this challenge—from computer science to law to sociology— none of these fields have been able to resolve the problem on their own terms. We propose a synergistic approach that allows us to explore bias and discrimination in AI by supplementing technical literature with social, legal, and ethical perspectives.
When we see a built world, we tend to take its permanence and stability for granted. For those who have chosen coastal homes, that built world goes back at least 50 years, with few residents ever realizing that oceans, lakes, and rivers are living entities constantly in motion. The average person relies upon experts such as architects and civil engineers, and supposed guardrails such as state building codes and homeowner associations, to assess safety when purchasing property. But the 21st-century assumption that the built world is stable is a risky bet. Especially in “business-friendly” states.
Arizona continues to build, build, build, and instead of requiring new residents to adapt to the climate, city governments and developers market the very bad idea that the desert can be made green, and thus more desirable.
Unintended consequences of technological development matter in practice and thus are not just of academic interest. SSIT would do well to spark constructive and practical discussion about managing unintended consequences.
The 21st Century Norbert Wiener Conference with the theme: “Being Human in a Global Village” is the third in a series of conferences initiated by the IEEE Society on Social Implications of Technology (SSIT), following events in Boston (2014) and Melbourne (2016).
Just as the “autonomous” in lethal autonomous weapons allows the military to dissemble over responsibility for their effects, there are civilian companies leveraging “AI” to exert control without responsibility.
And so we arrive at “trustworthy AI” because, of course, we are building systems that people should trust and if they don’t it’s their fault, so how can we make them do that, right? Or, we’ve built this amazing “AI” system that can drive your car for you but don’t blame us when it crashes because you should have been paying attention. Or, we built it, sure, but then it learned stuff and it’s not under our control anymore—the world is a complex place.
ISTAS 2021 will be jointly hosted by the University of Waterloo and the University of Guelph (Ontario, Canada) on October 28–31, 2021. Submission deadline: July 13, 2021.
The COVID-19 pandemic has exposed and exacerbated existing global inequalities. Whether at the local, national, or international scale, the gap between the privileged and the vulnerable is growing wider, resulting in a broad increase in inequality across all dimensions of society. The disease has strained health systems, social support programs, and the economy as a whole, drawing an ever-widening distinction between those with access to treatment, services, and job opportunities and those without.
We celebrated AI for mental health equity when it augmented access for marginalized populations. We applauded AI as a complement to current services: practitioners would be less overtaxed and more productive, thereby serving vulnerable populations better.
The public’s faith in science and technology has never been higher. Computer “apps” that track the frequency and points of origin of COVID-related Google search terms and Twitter posts are being used to trace the progress of the virus and to predict the sites of further outbreaks. The United States has been roiled by the death of George Floyd at the hands of the police. Floyd’s killing was captured in a video that has circulated throughout the globe and acquired the near-iconic power of the crucifixion. With the majority of the American people equipped to make audio–visual recordings of police brutality and post them on social media, we might expect crimes such as this to diminish.