• 14 Posts
  • 58 Comments
Joined 11 months ago
Cake day: July 6th, 2023

  • She really is insufferable. If you’ve ever listened to her Pivot podcast (not advised), you’ll be confronted with the superficiality and banality of her hot takes. Of course, this assumes you’re able to penetrate the word salad she regularly uses to convey whatever point she’s trying to make. She is not a good verbal communicator.

    Her co-host, “Professor” [*] Scott Galloway, isn’t much better. While more verbally articulate, his dick joke-laden takes are often even more insufferable than Swisher’s. I’m pretty sure Kara got her “use AI or be run over by progress” opinion from him; it’s one of his most frequent hot takes. He’s also one of the biggest tech hype maniacs, so of course he’s bought a ticket on the AI hype express. Before the latest AI boom, he was a crypto booster, though he’s totally memory-holed that phase of his life now that the crypto hype train has run off a cliff.

    [*] I put professor in quotes, because he’s one of those people who insist on using a title that is equal parts misleading and pretentious. He doesn’t have a doctorate in anything, and while he’s technically employed by NYU’s business school, he’s a non-tenured “clinical professor”, which is pretty much the same as an adjunct. Nothing against adjunct professors, but most adjuncts I’ve known don’t go around insisting that you call them “professor” in every social interaction. It’s kind of like when Ph.D.s insist you call them “doctor”.



  • TinyTimmyTokyo@awful.systems (OP) to SneerClub@awful.systems · “OK doomer”
    3 months ago

    I’m probably not saying anything you didn’t already know, but Vox’s “Future Perfect” section, of which this article is a part, was explicitly founded as a booster for effective altruism. They’ve also memory-holed the fact that it was funded in large part by FTX. Anything by one of its regular writers (particularly Dylan Matthews or Kelsey Piper) should be mentally filed into the rationalist propaganda folder. I mean, this article throws in an off-hand remark by Scott Alexander as if it’s just taken for granted that he’s some kind of visionary genius.





  • You know the doom cult is having an effect when it starts popping up in previously unlikely places. Last month the socialist magazine Jacobin ran an extremely long cover feature on AI doom, which it bought into completely. The author is an effective altruist who interviewed, and took seriously, people like Katja Grace, Dan Hendrycks and Eliezer Yudkowsky.

    I used to be more sanguine about people’s ability to see through this bullshit, but eschatological nonsense seems to tickle something fundamentally flawed in the human psyche. That LessWrong post is a perfect example.





  • I haven’t read Scott’s comment sections in a long time, so I don’t know if they’re all this bad, but that one is a total dumpster fire: a hive of Trump stans, anti-woke circle-jerkers, scientific racists, and self-proclaimed Motte posters. It certainly reveals the current demographic and political profile of his audience.

    Scott has always tried to hide his reactionary beliefs, but I’ve noticed he’s letting the mask slip a bit more lately.