• Album@lemmy.ca
    3 months ago

    Babe it’s not what you think, she’s just statistically correlated to my dick.

    • WhatAmLemmy@lemmy.world
      3 months ago

      Everything in the known universe is statistically correlated, if your computer is complex enough.

If this argument succeeds, then all patent and IP laws are invalid.

  • kbal@fedia.io
    3 months ago

    Nvidia… sounds familiar. Maybe I’ve heard of that name. I associate “Nvidia” with thoughts of expensive hardware, stock market bubbles, and annoying driver software. But that’s just a statistical correlation, it’s not like I actually know anything.

  • EleventhHour@lemmy.world
    3 months ago

    And the media on my Plex server is just statistically correlated to popular movies and TV shows.

    🙄

  • FaceDeer@fedia.io
    3 months ago

    Well, they are. Is the term “stochastic parrot” no longer popular? That’s what the “stochastic” part means.

  • Just_Pizza_Crust@lemmy.world
    3 months ago

    “But I NEEEEEEEEED to play my games on 164x AA, ultra textures, unlimited render distance, no optimization, 900 fps (on a 60 fps 720p monitor) MOOOOOMMMM!!!”

    That’s how I hear every excuse for a recent Nvidia purchase.

    • Prunebutt@slrpnk.netOP
      3 months ago

      I’m hearing more stuff like “but I nEeEeEeEd the latest Nvidia card to create AI ‘art’ and soulless synthetic text locally! It’s for business!”

      • LainTrain@lemmy.dbzer0.com
        3 months ago

        Yeah, I got a 3090 for Stable Diffusion and LocalLLaMA. Money well spent tbh: it brought many hours of fun, joy, and learning, and let me explore and study the tech and ML more generally without performance constraints. Much better value proposition than extra reflections in some video game, honestly.

    • corroded@lemmy.world
      3 months ago

      I have no doubt that the 4090 is a fantastic piece of hardware, but I just don’t see a justification for upgrading.

      I play games on a 4k/60 monitor, generally with close to max graphics settings (obviously within reason). My 2080TI handles that just fine. I also couldn’t care less about framerate unless the game is noticeably stuttering, so that might help.

      • dinckel@lemmy.world
        3 months ago

        I know a couple of people who bought a 4090 expecting to only play League and Minecraft on it. I genuinely just do not understand the reasoning behind that.

      • ripcord@lemmy.world
        3 months ago

        I’m still using my same 7700k and 1060 and for 1080p stuff it’s just fine, dagnabbit.

        • Plastic_Ramses@lemmy.world
          3 months ago

          I mean, I know for a fact that Cyberpunk 2077 barely runs on low with that setup.

          It’s totally fine if you just play games like Slay the Spire or Enter the Gungeon. But that rig, from personal experience, runs high-end graphics like shit. Hell, even Elden Ring runs like shit on that setup.

  • Lemminary@lemmy.world
    3 months ago

    What? It’s not a blunt, officer. It’s a statistical correlation mirage, go get those eyes checked.