rational enlightened beings that think the terminator from the movies is real
Roko’s Basilisk is only scary if you subscribe to their version of utilitarianism, which is purist but also a weird zero-sum version. Like, one of them wrote an essay arguing that if you could torture someone for 50 years and that would mean nobody ever got dust in their eyes again, you should torture the guy, because if you quantify his suffering, it’s still less than the combined suffering every subsequent person would feel from having dust in their eyes.
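The move being described is just aggregation arithmetic. A sketch with made-up numbers (mine, not the essay’s; both the “suffering units” and the population count are illustrative assumptions):

```python
# Toy version of the "torture vs. dust specks" comparison. All numbers
# here are invented for illustration; the point is the shape of the
# argument, not the magnitudes.
torture_units_per_second = 1_000_000
seconds_in_50_years = 50 * 365 * 24 * 60 * 60
torture_total = torture_units_per_second * seconds_in_50_years

dust_units_per_person = 1   # one barely-noticed speck
future_people = 10**30      # "everyone who will ever live", hypothetically
dust_total = dust_units_per_person * future_people

# Under this purist aggregation the specks "win", so the essay's logic
# says you should torture the one person.
print(dust_total > torture_total)
```

The whole trick is that once you let yourself multiply a trivial harm by an arbitrarily large population, it outweighs any finite concentrated harm.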
But also, even if you do subscribe to that, it doesn’t make sense, because in this hypothetical, the Basilisk has already been created, so torturing everyone would serve no utilitarian purpose whatsoever.
AI is being trained with the data of everyone on the internet, so we are all helping it to be created. Socko’s floppy disk debunked!!!
I’ve thought a lot about these self-declared rationalists, and at the end of the day I think the most important thing that explains them - the thing people should understand about them - is that they don’t include confidence intervals in their analyses, which means they don’t account for compounded uncertainty. Their extrapolations are therefore pure noise. It’s all nonsense.
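The compounding point is easy to demonstrate. A minimal simulation (my own toy model, not anyone’s actual forecasting method, with an assumed ±20% per-step error):

```python
import random

# Toy model: each prediction step has a true multiplier of 1.0 but is
# estimated with +/-20% noise. Chaining steps compounds the error
# multiplicatively.
def chained_estimate(steps, noise=0.2, trials=10_000):
    """Return the (min, max) spread of compounded estimates."""
    results = []
    for _ in range(trials):
        estimate = 1.0
        for _ in range(steps):
            estimate *= 1.0 + random.uniform(-noise, noise)
        results.append(estimate)
    return min(results), max(results)

random.seed(0)
low1, high1 = chained_estimate(steps=1)
low20, high20 = chained_estimate(steps=20)

# One step stays near 1.0; twenty chained steps spread out wildly.
# Without carrying the interval through every step, the final number
# is meaningless.
print(f"1 step:   {low1:.2f} .. {high1:.2f}")
print(f"20 steps: {low20:.2f} .. {high20:.2f}")
```

One uncertain inference is a guess; twenty uncertain inferences stacked on each other is noise, which is why multi-step extrapolations about hypothetical future AIs carry essentially no information.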
In some ways I think this is obvious: these post-hoc rationalists are the only ones who think emotions can be ignored. To most people that seems like a clearly outrageous proposition. How many steps ahead can one reliably predict if they cannot even properly characterize the relevant actors? Not very far. It doesn’t matter whether you think emotions are causative or reactive; you can’t simply ignore them and think you’re going to see the full picture.
It’s also worth noting that their process is the antithesis of science. The modern philosophy of science views science as a relativist construct. These SV modernists do not treat their axioms as relative, they believe that each observation is an immutable truth. In science you have to build a bridge if you want to connect two ideas, while in SV rationalism you are basically allowed to transpose any idea into any other field or context without any connection or support except your own biases.
People equate Roko’s Basilisk to Pascal’s Wager, but afaik Pascal’s game involved the acceptance or denial of a single omnipotent deity. If we accept the premise of Roko’s Basilisk then we are not considering a monotheistic state, we are considering a potentially polytheistic reality with an indeterminate/variable number of godlike entities, and we should not fear Roko’s Basilisk because there are potentially near-infinite “deities” who should be feared even more than the Basilisk. In this context, reacting to the known terror is just as likely to be minimally optimal as maximally optimal. To use their own jargon a little bit, opportunity is therefore maximally utilized by not reacting to any deities that don’t exist today and won’t exist by the end of next week. To do otherwise is to court sunk costs.
As opposed to things like climate change that exist today and will be worse by next week, but which are almost entirely ignored by this group.
I’ve been trying to understand the diversity among SV rationalist groups - the basilisk was originally banned, the ziz crew earnestly believed in principles that were just virtue signaling for everyone else, etc. I’ve always seen them as right-wingers, but could some segment be primed for the left?

To consider this I had to focus on their flavor of accelerationism: these people practice an accelerationism that weighs future (possible) populations as more important than the (presumably smaller) present population. On this basis they practice EA and the like, believing that if they gain power they will be able to steer society in a direction that benefits the future at the expense of the present. (This being the opposite approach of other accelerationists, who want to tear power down instead of capturing it.)

Distilling all of the varieties of SV rationalism I’ve encountered: in essence they believe they must out-right the right so that one day they can do things that aren’t necessarily associated with the right. In my opinion, one cannot create change by playing a captured game. The only way I see to make useful allies out of any of these groups is to convince them that their flavor of accelerationism is self-defeating, and that progress must always be pursued directly and immediately. Which is not easy, because these are notoriously stubborn individuals. They built a whole religion around post-hoc rationalization, after all.
I always said Roko’s Basilisk would be ACTUALLY cool if it really was just a giant snake.
Giant snakes are awesome.
“Oh you’re a Rationalist? Name 3 different Kant books”
it’s Pascal’s Wager for rationalists
Pascal’s Wager crossed with The Game, thought out by someone who only heard the synopsis of I Have No Mouth, and I Must Scream.
It’s worse
Literally what I said when I first heard of it.
“Um guys is Jeff the Killer real?” for people who use the term “age of consent tyranny”
The only thing scary about this shit is that the Rationalists™ have convinced people that the Rational™ thing to do with a seemingly unstoppable megatrocity torture machine is to help and appease it
they turned the end of history into a cringe warhammer 40k god because it’s the only two things they know: liberalism and gaming
Rationalists understand that just because it first appeared in fiction doesn’t mean it can’t happen for real.
As for Roko’s Basilisk, it’s a thought experiment and hardly anyone takes it seriously in its literal form. But y’know Nazism is also a Roko’s Basilisk – they punish you for opposing it in its infancy and reward you if you help it.
Not true, plenty of people who supported the Nazis early on got shanked. Plenty of SPD members and Prussian aristocrats who jumped ship in 1933 prospered and achieved high rank. Until the Red Army rolled into town, anyway.
In Terminator, Skynet sends back an Austrian robot through time to shoot you in the face
Roko’s Basilisk creates a virtual version of you at some point in the future after your death that it then tortures for eternity… and you’re supposed to piss your pants over the fate of this Metaverse NPC for some reason
damn that sucks for the virtual version of me but idgaf lmao
woooOOooo no but you see YOU are the metaverse NPC inside of the internet RIGHT NOW and the AI is simulating your entire life from the beginning to see if you want to bone an LLM and if you dont it puts you in the noneuclidean volcel waterboarding dungeon woooOOooo
So wait, it’s just I Have No Mouth, and I Must Scream? Did they bother to understand why AM was angry, as spelled out in the story?
Some people need to read more SCPs about cognitohazards
as Brace put it - and I had to cut it from my post because he said it after I thought it - this is people who believe they are the modern equivalent of Socrates falling for chain emails
Roko’s “I pulled this out of my ass”-alisk
It’s literally just Pascal’s Wager but with an “AI” paint job and some racing stripes.
Pascal’s wager is about a plane of existence that is entirely unknowable, at least.
I mean, I’d say a hypothetical AI occurring at some unknown time in the future, which then remakes your exact brain architecture after you’ve long since died just to stuff you in a locker and pants you for all eternity, is pretty unknowable as well. ;p
Pascal’s Wager, but if you do not develop skynet robo Jesus sends Arnie back in time to terminate you
I wonder if these dorks now have sleepless nights wondering if robo Jesus will actually develop out of DeepSeek and send them to robot superhell for being OpenAI heretics.
That’s the richest irony here: the rationalists are almost all atheists or agnostics who scoff at arguments like Pascal’s Wager, but then have backed themselves into that exact logic.
Wow, subtweeting about Trueanon Episode 434? I listened to it and damn was it a banger in its 2 hours… yeah, the ridiculousness of the Rationality cult rlly deserves its mockery.
Imagine a powerful omniscient future AI that reincarnates your consciousness if you believe in Roko’s Basilisk and gives you swirlies forever.