rational enlightened beings who think the Terminator from the movies is real i-cant

  • CthulhusIntern [he/him]@hexbear.net · 2 hours ago

    Roko’s Basilisk is only scary if you subscribe to their version of utilitarianism, which is purist but also a weird zero-sum version. Like, one of them wrote an essay arguing that if you could torture someone for 50 years and thereby guarantee that nobody ever got dust in their eyes again, you should torture the guy, because the quantified suffering of one person is still less than the summed suffering of every subsequent person who would otherwise get dust in their eyes.
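
    A toy sketch of the aggregation arithmetic that essay leans on (every number here is invented purely for illustration):

    ```python
    # Toy version of the "torture vs. dust specks" aggregation argument.
    # Every number is invented; only the shape of the math matters.
    torture_total = 50 * 1_000_000       # one person, 50 years, arbitrary "suffering units"/year
    speck_suffering = 0.001              # one dust speck in one person's eye
    future_people = 10**12               # everyone spared a speck, forever

    specks_total = speck_suffering * future_people

    # Pure aggregation just compares the two sums, ignoring how the
    # suffering is distributed across people.
    print(torture_total < specks_total)  # True -> "torture the guy"
    ```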

    But also, even if you do subscribe to that, it doesn’t make sense, because in this hypothetical, the Basilisk has already been created, so torturing everyone would serve no utilitarian purpose whatsoever.

  • capitanazo [ey/em]@hexbear.net · 2 hours ago

    AI is being trained with the data of everyone on the internet, so we are all helping it to be created. Socko’s floppy disk debunked!!!

  • turtlegreen [none/use name]@hexbear.net · 2 hours ago

    I’ve thought a lot about these self-declared rationalists, and at the end of the day the most important thing to understand about them is that they don’t include confidence intervals in their analyses, which means they don’t account for compounded uncertainty. Chain enough uncertain inferences together and the accumulated error swamps whatever signal was there, so their long-range extrapolations are pure noise. It’s all nonsense.
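
    A quick made-up illustration of that compounding (assuming, generously, that each step in a chain of inferences is independently right with probability 0.9):

    ```python
    # If each step of an extrapolation holds with probability p (assumed
    # independent), the whole chain holds with probability p ** steps.
    p = 0.9  # an invented, generous per-step confidence

    for steps in (1, 5, 10, 20, 50):
        print(steps, round(p ** steps, 4))
    # 1 0.9 | 5 0.5905 | 10 0.3487 | 20 0.1216 | 50 0.0052
    # Twenty "pretty confident" steps already leave you worse than a coin flip.
    ```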

    In some ways I think this is obvious: these post-hoc rationalists are the only ones who think emotions can be ignored. To most people that is clearly an outrageous proposition. How many steps ahead can you reliably predict if you cannot even properly characterize the relevant actors? Not very far. It doesn’t matter whether you think emotions are causative or reactive; you can’t simply ignore them and still expect to see the full picture.

    It’s also worth noting that their process is the antithesis of science. The modern philosophy of science views science as a relativist construct, but these SV modernists do not treat their axioms as relative; they believe each observation is an immutable truth. In science you have to build a bridge if you want to connect two ideas, while in SV rationalism you are basically allowed to transpose any idea into any other field or context with no connection or support except your own biases.

    People equate Roko’s Basilisk to Pascal’s Wager, but afaik Pascal’s game involved the acceptance or denial of a single omnipotent deity. If we accept the premise of Roko’s Basilisk, we are not considering a monotheistic state; we are considering a potentially polytheistic reality with an indeterminate, variable number of godlike entities, and we should not fear Roko’s Basilisk because there are potentially near-infinite “deities” who should be feared even more than the Basilisk. In this context, reacting to the one known terror is just as likely to be minimally optimal as maximally optimal (see the sketch below). To use their own jargon a little bit, opportunity is therefore maximally utilized by not reacting to any deities that don’t exist today and won’t exist by the end of next week. To do otherwise is to court sunk costs.

    As opposed to things like climate change, which exists today, will be worse by next week, and is almost entirely ignored by this group.
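
    A crude expected-value sketch of the “many deities” point above (every quantity is invented):

    ```python
    # Crude expected-value version of the "many deities" objection.
    # All numbers are invented; only the shape of the argument matters.
    N = 1_000_000        # mutually incompatible candidate "deities"
    reward = 10**6       # payoff if the basilisk you appeased actually shows up
    punishment = -10**9  # what a rival deity does to you for backing the wrong god

    # Appeasing one specific basilisk is a 1/N shot at the reward and an
    # (N - 1)/N chance of offending whichever rival actually arrives.
    ev_appease = (1 / N) * reward + ((N - 1) / N) * punishment
    ev_ignore = 0.0      # appease nobody; no such deity exists yet to offend

    print(f"{ev_appease:.3e} vs {ev_ignore}")  # appeasement loses badly in expectation
    ```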

    I’ve been trying to understand the diversity among SV rationalist groups: the Basilisk was originally banned from discussion, the Ziz crew earnestly believed in principles that were just virtue signaling for everyone else, etc. I’ve always seen them as right-wingers, but could some segment be primed for the left? To consider this I had to focus on their flavor of accelerationism: these people practice an accelerationism that weighs future (possible) populations as more important than the present, presumably smaller, population. On this basis they practice EA and the like, believing that if they gain power they will be able to steer society in a direction that benefits the future at the expense of the present. (This is the opposite approach of some other accelerationists, who want to tear power down instead of capturing it.)

    Distilling all the varieties of SV rationalism I’ve encountered, in essence they believe they must out-right the right so that they can one day do things that aren’t necessarily associated with the right. In my opinion, one cannot create change by playing a captured game. The only way I see to make useful allies out of any of these groups is to convince them that their flavor of accelerationism is self-defeating, and that progress must always be pursued directly and immediately. That won’t be easy, because these are notoriously stubborn individuals. They built a whole religion around post-hoc rationalization, after all.

  • FnordPrefect [comrade/them, he/him]@hexbear.net · 6 hours ago (edited)

    The only thing scary about this shit is that the Rationalists™ have convinced people that the Rational™ thing to do with a seemingly unstoppable megatrocity torture machine is to help and appease it stalin-stressed

    • Esoteir [he/him]@hexbear.net · 5 hours ago (edited)

      they turned the end of history into a cringe Warhammer 40K god because those are the only two things they know: liberalism and gaming

  • jsomae · 4 hours ago (edited)

    Rationalists understand that just because something first appeared in fiction doesn’t mean it can’t happen for real.

    As for Roko’s Basilisk, it’s a thought experiment, and hardly anyone takes it seriously in its literal form. But y’know, Nazism is also a Roko’s Basilisk: it punishes you for opposing it in its infancy and rewards you if you help it.

    • Mardoniush [she/her]@hexbear.net · 3 hours ago (edited)

      Not true; plenty of people who supported the Nazis early on got shanked, while plenty of SPD members and Prussian aristocrats who jumped ship in 1933 prospered and achieved high rank. Until the Red Army rolled into town, anyway.

  • doublepepperoni [none/use name]@hexbear.net · 6 hours ago (edited)

    In Terminator, Skynet sends back an Austrian robot through time to shoot you in the face

    Roko’s Basilisk creates a virtual version of you at some point in the future after your death that it then tortures for eternity… and you’re supposed to piss your pants over the fate of this Metaverse NPC for some reason

  • WhyEssEff [she/her]@hexbear.net (OP) · 6 hours ago (edited)

    as Brace put it (and as I had to cut from my post, because he said it after I’d already thought it): this is people who believe they are the modern equivalent of Socrates falling for chain emails yud-rational

  • AcidSmiley [she/her]@hexbear.net · 6 hours ago

    Pascal’s Wager, but if you don’t develop Skynet, robo Jesus sends Arnie back in time to terminate you

    I wonder if these dorks now have sleepless nights wondering if robo Jesus will actually develop out of DeepSeek and send them to robot superhell for being OpenAI heretics.

    • PKMKII [none/use name]@hexbear.net · 6 hours ago

      That’s the richest irony here: the rationalists are almost all atheists or agnostics who scoff at arguments like Pascal’s Wager, but they’ve backed themselves into that exact logic.

  • dannoffs [he/him]@hexbear.net · 6 hours ago

    Imagine a powerful, omniscient future AI that reincarnates your consciousness if you believe in Roko’s Basilisk and gives you swirlies forever.