Look, I know you’re arguing in good faith here, but… quite a few of us are very familiar with Big Yud’s arguments, and heck, I’m even partially sympathetic to a couple. There is some motivated suspicion here, since rationalists and their singularitarian forebears have a habit of going reactionary (Anissimov, Bostrom, SSC, countless others) or having close ties to reactionaries (Thiel, most notably). I see no reason to believe you are like them, but do realise that a lot of transhumanist arguments are weaponised as right-wing stalking horses.
But as someone who has been following and occasionally interacting with him since our halcyon youth on the Extropian mailing list, I’ve learned that:
a) Yud can be spectacularly convincing and also spectacularly wrong by his own admission (see early work like “Staring into the Singularity” for this), so you should cognitively guard against his rhetorical tools.
b) He has a habit of eliding and abstracting concepts so that they appear as a single consistent thing when they are actually several things, while appearing to do the exact opposite. This is most apparent in subjects he is not an expert in, most notably quantum mechanics (I think many worlds is correct, but I think his argument for many worlds is bad; I am semi-competent at the math of quantum mechanics).
I have specific, nuanced critiques of his main (not entirely invalid) argument on AI doom, but they’re generally hard to get across because most of his own supporters don’t fully understand the argument.
I’m more convinced by SSC/ACX’s arguments about AI doom. I agree that Yudkowsky is weird, and I don’t like his political trend in recent years either, though I wouldn’t consider ACX reactionary just because he tolerates fascists leaving comments; that’s really not germane anyway. It sounds like you think Yudkowsky’s perspective on AI doom is not entirely invalid, so do you also believe there is a chance that AI could be an x-risk?
I’m not talking about his open comments, and I know the FAQ. I’m talking about a couple of letters that came up between him and alt-right/neoreactionary figures that displayed a lot more sympathy for their views than you might credit. I can’t find them in a quick search, and I’m pressed for time right now, so I’ll link them later.
My p(doom) for AI is maybe 5%; my p(dystopia significantly worse than today) is substantially higher. We should definitely be looking into it, and almost none of the people who should be looking into it are in Silicon Valley.
Sounds like we basically agree. I’d be interested to see what alt-right views Scott Alexander agrees with. I’ve perused r/sneerclub but they seem kinda kooky when it comes to SSC. Like, for instance, I know he is in favour of """eugenics""" but his version of that is totally different from the Nazi version, to the point where I just roll my eyes when people make that comparison. It’s hard for me to separate the actually valid criticism of him from the people who can’t get past stuff like that.
(Edit: just to be 100% clear to people: it’s not master-race stuff, it’s more like optimizing alleles in vitro, and at the most extreme end he might be in favour of consensually persuading meth addicts not to have children unintentionally where there’s a risk of certain serious drug-induced diseases. I’m on the fence about it. It’s very Peter Singer.)
I think most people here would consider Singer to be, if not openly right wing, at least worryingly adjacent. Rather too close to the beige technocratic fascism of a Salazar.