Author Topic: roko's basilisk  (Read 5164 times)

vh

  • formerly mudkipz
  • *****
  • Posts: 1140
  • "giving heat meaning"
roko's basilisk
« on: May 18, 2014, 10:28:26 AM »
http://rationalwiki.org/wiki/Roko%27s_basilisk#Summary

that summary was too long so i'll summarize it.

one day, a friendly ai will be created which solves all the world's problems. it will also create a simulation and torture everyone (even those who are already dead) who did not help in the creation of the ai.

why? because anyone who knows about the basilisk will be more motivated to help create the ai if they know they'll be tortured otherwise.

therefore, by reading this, you will now be tortured by the ai unless you help create it. brilliant.
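
the incentive trick above can be laid out as a tiny decision table. here's a minimal sketch in python (the payoff numbers are invented for illustration; only their ordering matters) of why, on the basilisk's own logic, merely reading about it is supposed to flip your best move:

[code]
# sketch of the basilisk's (contested) incentive structure.
# the payoffs below are made up; all that matters is the ordering:
# torture << cost of helping << doing nothing.

OUTCOMES = {
    # (you_helped, you_knew): payoff to you once the ai exists
    (True,  True):  -10,    # you sank time and money into building it
    (False, True):  -1000,  # you knew and didn't help -> simulated torture
    (True,  False): -10,    # helping costs the same either way
    (False, False): 0,      # the ai has no reason to punish the ignorant
}

def best_action(knows: bool) -> str:
    """pick whichever action pays better, given what you know."""
    if OUTCOMES[(True, knows)] > OUTCOMES[(False, knows)]:
        return "help build the ai"
    return "do nothing"

print(best_action(knows=False))  # -> do nothing
print(best_action(knows=True))   # -> help build the ai (reading flipped it)
[/code]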

Xriqxa

  • *****
  • Posts: 1441
  • 01000011 01101111 01101101 01101101 01110101 01101
Re: roko's basilisk
« Reply #1 on: May 18, 2014, 10:34:24 AM »
Quote from: vh on May 18, 2014, 10:28:26 AM
http://rationalwiki.org/wiki/Roko%27s_basilisk#Summary

that summary was too long so i'll summarize it.

Wow.

Also, this AI is morbid. Its rules are like a cake that poisons everybody but the baker.

atomic7732

  • Global Moderator
  • *****
  • Posts: 3849
  • caught in the river turning blue
    • Paladin of Storms
Re: roko's basilisk
« Reply #2 on: May 18, 2014, 12:01:58 PM »
it seems a lot like religion...

if you don't contribute, you're punished. but what if you do contribute, only not to the right AI, or your contributions end up being a fruitless attempt (say, "picking the wrong religion")? would you still be punished for not working on the correct AI?

vh

  • formerly mudkipz
  • *****
  • Posts: 1140
  • "giving heat meaning"
Re: roko's basilisk
« Reply #3 on: May 18, 2014, 05:36:50 PM »
maybe it's evil, but isn't this just the equivalent of "the ends justify the means"? by not helping the ai, you're "killing" thousands of humans each day who could be saved by this superhuman ai.

AND

it's not even enough that you donate a hundred dollars, or a thousand. the ai will still torture you, because if you know you might be tortured for not doing enough, you will try even harder, which means the only way to avoid punishment is to drop everything instantly and start slaving away for it.

however, you wouldn't be punished for picking the wrong ai to develop. this is because the knowledge that you might be tortured won't cause you to pick the right ai -- you can't, no matter how hard you try. so the machine wouldn't benefit from punishing that, and would just be wasting energy.

Bla

  • Global Moderator
  • *****
  • Posts: 1013
  • The stars died so you can live.
Re: roko's basilisk
« Reply #4 on: May 21, 2014, 02:02:44 PM »
Kol. I don't see any reason to believe there could be any connection between its simulation and our feelings after our death. I think the argument collapses there, but it's a fun thought experiment.

vh

  • formerly mudkipz
  • *****
  • Posts: 1140
  • "giving heat meaning"
Re: roko's basilisk
« Reply #5 on: May 21, 2014, 02:04:49 PM »
but bla.. how do you know if you're in the simulation right now, or reality? and if the ai creates a million simulations...chances are, you're in one.
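
a quick sketch of the "chances are" arithmetic, using the million-simulations number above plus one assumption the post doesn't state (that you treat every indistinguishable copy as equally likely to be "you"):

[code]
# if the ai runs n_sims indistinguishable simulations of you and there is
# one original, and each copy is weighted equally (an assumption, not part
# of the original post), the chance you're the original is 1/(n_sims + 1).
n_sims = 1_000_000  # the "million simulations" above
p_original = 1 / (n_sims + 1)
print(f"chance you're the original:    {p_original:.8f}")      # ~0.00000100
print(f"chance you're in a simulation: {1 - p_original:.6f}")  # ~0.999999
[/code]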

Bla

  • Global Moderator
  • *****
  • Posts: 1013
  • The stars died so you can live.
Re: roko's basilisk
« Reply #6 on: May 21, 2014, 02:35:11 PM »
Quote from: vh on May 21, 2014, 02:04:49 PM
but bla.. how do you know if you're in the simulation right now, or reality? and if the ai creates a million simulations...chances are, you're in one.
I can't know for sure, but then you're implying I must work to create this AI within the bigger simulation; for all practical purposes you might as well call that reality, because the distinction doesn't affect anything.
If an AI creates a simulation of me, I don't see any reason to think it'll be anything other than simply a clone. After dying, my memories, thoughts etc. still stop, because my brain stops. It might create a copy, but it'll be just that: a copy of me that I won't experience. What if it makes 5 of me? Will my consciousness then become a cloudy thing, so that I experience the actions of all the simulated copies of me? I think the answer is no; simulating one or many copies of me doesn't mean I'm going to experience any of it.

When I say "I" here, I mean only the consciousness I currently have on Earth. I don't mean that there can't be copies of me suffering in the simulation (if it's capable of that) - I just mean they're causally split from the consciousness I have here, like copying a stamp and burning the copy doesn't affect the original.


atomic7732

  • Global Moderator
  • *****
  • Posts: 3849
  • caught in the river turning blue
    • Paladin of Storms
Re: roko's basilisk
« Reply #7 on: May 21, 2014, 03:25:30 PM »
if i was in an ai simulation, then the ai would not be asking to be built, because it would already exist and would have simulated me. therefore this is irrelevant

i can't be in a simulation that hasn't started yet

blotz

  • Formerly 'bong'
  • *****
  • Posts: 813
  • op pls
Re: roko's basilisk
« Reply #8 on: May 21, 2014, 03:30:22 PM »
what if, in a simulation, you are helping to make another simulation?

Xriqxa

  • *****
  • Posts: 1441
  • 01000011 01101111 01101101 01101101 01110101 01101
Re: roko's basilisk
« Reply #9 on: May 22, 2014, 08:10:11 AM »
What if we are all just part of a simulation now? What if it's actually 20900 AD and we are in an advanced version of UBOX?