AI takeover  

From The Art and Popular Culture Encyclopedia


A global catastrophic risk is a hypothetical future event with the potential to seriously damage human well-being on a global scale. Some such events could destroy or cripple modern civilization. An event severe enough to cause human extinction is known as an existential risk.

Potential global catastrophic risks include but are not limited to hostile artificial intelligence, nanotechnology weapons, climate change, nuclear warfare, and pandemics.

Human extinction is difficult for researchers to study directly, since humanity has never been destroyed before. While this does not mean it cannot happen in the future, it does make modelling existential risks difficult, due in part to survivorship bias.

The concept is expressed in various phrases such as "End of the World", "Doomsday", "Ragnarök", "Judgment Day", "Armageddon", "the Apocalypse", "Yawm al-Qiyāmah" and others.


Unless indicated otherwise, the text in this article is either based on Wikipedia article "AI takeover" or another language Wikipedia page thereof used under the terms of the GNU Free Documentation License; or on research by Jahsonic and friends. See Art and Popular Culture's copyright notice.
