Apocalypse Deterrence

Posted on June 1st, 2015 in Societal Impact

The Center for the Study of Existential Risk (CSER.org) at Cambridge (U.K.) focuses on how to protect humanity from the downsides of technology. By “existential” they are not referring to Camus, but to the elimination of Homo sapiens, i.e. the end of our existence, aka “apocalypse”.

Their concerns include AIs that might have both sufficient power and motivation to disrupt humanity, and genetic engineering that could either make us obsolete or get out of hand and make us extinct.

Who cares? Well, some fairly knowledgeable folks are involved, including:

  • Stephen Hawking
  • Jaan Tallinn
  • Elon Musk
  • George Church

I suspect that some SSIT folks may find it useful to monitor CSER’s newsletter and consider how its concerns and issues relate to SSIT’s activities: grist for the mill, as it were.

Also see the AI apocalypse. The list of related publications and pundits keeps growing: Al Gore, Yuval Harari, and more recently an initiative focused on AI ethics by the IEEE Standards Association.

Image: Wikimedia Foundation, CC BY-SA 3.0