Monday, April 6, 2015

Excellent Reddit thread

What's the scariest theory known to man? Reddit user u/zupermanguy offers up my chosen answer:
Roko's Basilisk. A superintelligent AI will determine how to retroactively punish those who did not help it come into existence. A supercomputer in the future will kill you today for not helping it ten years from now.
Here is a nice Slate explainer, and here is an important (and deadly serious) book about the risks posed by superintelligence. This podcast is a fantastic introduction to the topic, and this mediocre cop show is the single best dramatization yet produced of one ASI emergence scenario.
