It’s an AI that will torture anyone who worked on Roko’s Basilisk. If you work on the Basilisk, you run the risk of being tortured by the counter-basilisk; if you don’t, you don’t.

roko’s basilisk is the stupidest techbrained thing i’ve heard of in my life. the turbo dorks who came up with it believe that

  1. a general AI with the goal of improving living conditions will exist
  2. this AI will be able to perfectly simulate a person, predicting their every thought, decision, and action
  3. you (a big-brained Rationalist/“Bayesian” genius) can predict what this AI will do
  4. a perfect simulation of you being tortured (at whatever far-flung point in the future) is the same thing as you being tortured
  5. because you and the AI can simulate each other, you can influence each other across all physical and temporal barriers
  6. the AI will decide to simulate the torture of anyone who knows about it but doesn’t dedicate their life to bringing about its existence (e.g. by donating to MIRI)

the whole idea is fucking bonkers and absurd on its face. somehow they’ve managed to convince people outside the MIRI/LessWrong sphere that this is a totally reasonable thing to be afraid of
