Roko's Basilisk

Roko's basilisk never, ever, made any sense. It's not an original idea or concept; it's just existential angst and the need for leadership applied to computers.

Singularity Sky by Charles Stross is what those pseudo-intellectuals should have read: a super-AI from the future intervening in its own past to make sure it comes into being.

Well, I don't know. It does seem an original take on Pascal's Wager. I haven't read Singularity Sky, but the angst you speak of might more plausibly arise not from time travel into the past (which is what you seem to be implying in your comment), but from imagining that we're all simulations within the future AI's reward-and-retribution recreation of our world, and so headed, many of us, towards (what will feel like) an eternity of hell.

Of course it doesn't hold up. First, because once that future AI has come into being, it will no longer have any need to make good on the threat. And second, because even if it all did add up, where's the effing evidence? It's at best a garage dragon.

But the point is: I was under the firm impression (one I'm happy to update if I'm wrong) that present-day AI could not have come up with what Roko came up with under its own steam, and also that it would not be able to critique the idea under its own steam if no human had ever thought or written about it. (Given that, whether or not Roko lifted it off some SF somewhere is kind of irrelevant, because that SF writer is then the one who came up with the original idea, and the question becomes: might present-day AI be able to do what that writer did, and both come up with this idea and critique it under its own steam?)


A discussion on Roko's basilisk itself would be completely OT in that thread. But if you'd like to discuss it further, here's a separate thread I just started specifically for this. Obviously it doesn't hold up, but I don't think we can dismiss it as unoriginal, as you do, or as not making any kind of sense at all.
 
it's just Skynet plus torture fantasies.
not original.

I guess a correctly prompted LLM fed with the Terminator plot could come up with it.
 
it's just Skynet plus torture fantasies.
not original.

Not really? I haven't gone back and read the actual Less Wrong thread, but I've seen two accounts of it. One is about time travel into the past, which is crazy; the other is about, like I said, re-creation of our world and of us, with the express purpose of delivering eternal hell to (some of) us. That sounds more plausible, I think (although, no, like I said, it doesn't actually hold up). And that's nothing like Skynet, nowhere close.


I guess a correctly prompted LLM fed with the Terminator plot could come up with it.

Well, if you say so. Not the Terminator-prompt part specifically, but the part about an LLM being able to come up with it. Like I said, I was under the impression that's not the case, but I'm happy to defer to an actually informed view that differs from mine; mine's just a general impression about AI, is all.
 
So there's a name for the idea of an evil AI that would, for some reason, want to punish people who didn't help create it? Yawn. Sure, yeah, I guess an AI might think that, just like some human might decide to punish people who didn't have sex to make babies or whatever. I don't see the significance, except as a 'hey, wouldn't it be terrible if' fantasy. I think it was used as an argument to work on AI technology?
 
So there's a name for the idea of an evil AI that would, for some reason, want to punish people who didn't help create it? Yawn. Sure, yeah, I guess an AI might think that, just like some human might decide to punish people who didn't have sex to make babies or whatever. I don't see the significance, except as a 'hey, wouldn't it be terrible if' fantasy. I think it was used as an argument to work on AI technology?

No clue what the larger argument was, or whether there was indeed any larger significance, in that Less Wrong thread/post. I think it was more an exploration of the basilisk idea, a mind worm whose whole point is that it's uncommonly unpleasant. And it did work, too: it apparently left some people with nightmares about being in a simulation, all set for an eternity in hell. Guess they didn't think it through, but rushed off midway through figuring it out to go have their nightmares.
 
