The New Yorker has a piece on the Bay Area AI doomer and e/acc scenes.

Excerpts:

[Katja] Grace used to work for Eliezer Yudkowsky, a bearded guy with a fedora, a petulant demeanor, and a p(doom) of ninety-nine per cent. Raised in Chicago as an Orthodox Jew, he dropped out of school after eighth grade, taught himself calculus and atheism, started blogging, and, in the early two-thousands, made his way to the Bay Area. His best-known works include “Harry Potter and the Methods of Rationality,” a piece of fan fiction running to more than six hundred thousand words, and “The Sequences,” a gargantuan series of essays about how to sharpen one’s thinking.

[…]

A guest brought up Scott Alexander, one of the scene’s microcelebrities, who is often invoked mononymically. “I assume you read Scott’s post yesterday?” the guest asked [Katja] Grace, referring to an essay about “major AI safety advances,” among other things. “He was truly in top form.”

Grace looked sheepish. “Scott and I are dating,” she said—intermittently, nonexclusively—“but that doesn’t mean I always remember to read his stuff.”

[…]

“The same people cycle between selling AGI utopia and doom,” Timnit Gebru, a former Google computer scientist and now a critic of the industry, told me. “They are all endowed and funded by the tech billionaires who build all the systems we’re supposed to be worried about making us extinct.”

  • gerikson@awful.systems · 9 months ago

    I dunno. At least in the US, these people are decidedly outside the mainstream. Their views on religion and sexual mores preclude any popular appeal, and they would be similarly handicapped if they tried to infiltrate existing power structures.

    Basically their only hope is that an AI under their control takes over the world.

    • Architeuthis@awful.systems · 9 months ago

      Basically their only hope is that an AI under their control takes over the world.

      They are pretty dominant in the LLM space and are already having their people fast tracked into positions of influence, while sinking tons of cash into normalizing their views and enforcing their terminology.

      Even though they aren’t explicitly pandering to religious Americans, their worldview, which is millennialism with the serial numbers filed off, will probably feel familiar and cozy to them.

      • gerikson@awful.systems · 8 months ago

        You still need to lever that money by “buying” the people in power.

        Right now there are really no mainstream politicians 100% on board with the weirdness of TESCREALs:

        • mainstream Democrats - too wary of corporations, too eager to regulate
        • pre-Trump GOP - maybe, but they’re losing influence fast
        • current Trump GOP - literally crazy, way too easy for TESCREALs to be painted as a satanic cult

        • lobotomy42@awful.systems · 8 months ago

          Maybe. The current EA strategy is to take over all the technocratic positions in government and business one level down from the ostensible policy-makers. The idea is that if they are the only ones qualified to actually write the reports on “alignment” for DoD/NIST/etc., then they ultimately get to squeeze in some final control over the language, regardless of what Joe Senator wants. Similarly, by monopolizing and brainwashing all the think-tank positions, even the Joe Senators out there end up leaning on them to write the bills and executive orders.

        • YouKnowWhoTheFuckIAM@awful.systems · 8 months ago

          I’ve finally got around to replying to this, but it’s been burning a hole in my subconscious.

          I think that’s a naive interpretation of the interests in play here.

          Altman aptly demonstrated that a yes/no on regulations isn’t the money’s goal here, the goal is to control how things get regulated. But at the same time Democrats are hardly “eager to regulate” simpliciter, and the TESCREALs/Silicon Valley can hardly be said to have felt the hammer come down in the past. It may be part of some players’ rhetoric (e.g. Peter Thiel) that the Republicans (both pre- and post-Trump) are their real friends insofar as the Republicans are eager to just throw out corporate regulations entirely, but that’s a different issue: it’s no longer one of whether you can buy influence, it’s a matter of who you choose to buy influence with in the government, or better yet which government you try to put in power.

          It should be noted at this point that mentioning Thiel is hardly out of court, even if he’s not in the LessWrong stream: he shares goals and spaces with big elements of the general TESCREAL stream. He’s put money into Moldbug’s neo-reaction, which is ultimately what puts Nick Land sufficiently on the radar to find his way into Marc Andreessen’s ludicrous manifesto.

          And why should the TESCREALs fear being painted as a satanic cult in the first place? Has that been a problem for anybody but queer people and schoolteachers up to this point? It seems unlikely to me that anyone involved in OpenAI or Anthropic is going to just stop spending their absolute oceans of capital for fear that LibsOfTikTok is going to throw the spotlight on them. And why would Raichik do that in the first place? The witch hunters aren’t looking for actual witches, they’re looking for political targets, and I don’t see what’s in it for them in going after some of the wealthiest people on the West Coast except in the most abstract “West Coast elites” fashion, which as we all know is just another way of targeting liberals and queers.