• Krotiuz@kbin.social · 9 months ago

    I’m one of those people that uses DLSS, because I’ve got a large fancy 4K monitor that’s big enough that it looks like shit at lower resolutions.

    DLSS is better than nothing, but it’s no replacement for native rendering: it introduces a heap of visual anomalies and inconsistencies, especially in games with constant motion (racing games look like shit with DLSS), so my framerate has to be hitting lows of 50fps on medium before I’ll even think about DLSS.
    I’m also pretty sure Nvidia is paying devs to have it on by default, because every time it’s patched into a game the update clears all your current graphics settings and turns on DLSS, at least in my experience.

    • Nefyedardu@kbin.social · 9 months ago

      I hate how AI upscaling looks, and I really don’t get why everyone seems to be gaga over it. In addition to the artifacts and other weirdness it can introduce, it just generally looks to me like someone smeared Vaseline over the picture.

      • FooBarrington@lemmy.world · 9 months ago

        That’s not inherent to “AI upscaling” as a process. ESRGAN for example is pretty good at upscaling pictures while keeping the quality.

        • Nefyedardu@kbin.social · 9 months ago

          I’ve tried upscaling with ESRGAN as well and it has similar problems. It messes with the original textures too much. For example, it made carpet look like a solid surface. Skin looks too smooth and shiny. That kind of thing.

          • FooBarrington@lemmy.world · 9 months ago

            It depends a lot on the source picture, but it’s definitely not a general problem inherent to AI upscaling. Otherwise there wouldn’t be so many positive examples of ESRGAN.

      • Whirlybird · 9 months ago

        DLSS isn’t like all the other upscalers, it’s on a whole different level. FSR is a blur filter. FSR2 is better, but still noticeably upscaled with tonnes of artifacting. Same with XeSS, because that and FSR are just software upscaling.

        DLSS, on the other hand, has actual hardware dedicated to it. It actually gives better-than-native results quite often. It doesn’t look at all like someone smeared Vaseline on the screen.

        • Nefyedardu@kbin.social · 9 months ago

          FSR/XeSS are basically sharpening tools, and yeah, they’re inherently limited, because reconstructing missing detail with 100% accuracy is just impossible. DLSS is the same thing, except NVIDIA tries to circumvent this limitation through some kind of proprietary AI magic accelerated by their hardware. It’s impossible for it to be “better than native”, since it’s using AI to approximate what “native” is. And in doing so, it makes the original image look too different for my liking. In motion the textures definitely look a little muddied to me as things blend into each other, since the AI cannot accurately predict how things should look in real time. At that point I’d rather just use FSR/XeSS, since they at least preserve the original art style.
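          The information-loss point above can be illustrated without any AI at all. This is a toy sketch (not any real upscaler’s code): treat “native” rendering as a one-pixel checkerboard, render it at half resolution by averaging 2×2 blocks, then upscale it back. The fine detail averages to flat grey, and no filter can recover it — an upscaler can only guess.

          ```python
          # Toy illustration: spatial upscaling cannot recover detail that was
          # never rendered. Not real FSR/DLSS code, just the underlying idea.

          def downscale_2x(img):
              """Average each 2x2 block (a box filter, like rendering at half res)."""
              n = len(img)
              return [[sum(img[2*y + dy][2*x + dx] for dy in (0, 1) for dx in (0, 1)) // 4
                       for x in range(n // 2)]
                      for y in range(n // 2)]

          def upscale_2x(img):
              """Nearest-neighbour upscale: each low-res pixel becomes a 2x2 block."""
              return [[img[y // 2][x // 2] for x in range(2 * len(img))]
                      for y in range(2 * len(img))]

          # "Native" frame: an 8x8 one-pixel checkerboard (maximum fine detail).
          native = [[255 if (x + y) % 2 else 0 for x in range(8)] for y in range(8)]
          restored = upscale_2x(downscale_2x(native))

          print(native[0][:4])    # [0, 255, 0, 255] -- fine detail present
          print(restored[0][:4])  # [127, 127, 127, 127] -- detail averaged away
          ```

          An AI upscaler differs only in that it guesses the missing pixels from learned patterns instead of repeating the average, which is exactly why its output can diverge from the native image.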

        • Zombiepirate@lemmy.world · 9 months ago

          Look at fancy-pants here rendering four colors at a time!

          In my day we had green and black. And we were grateful for it!

      • TheYang@lemmy.world · 9 months ago

        Or 1600x1200 when most LCDs were 1024x768.

        CRTs really have gotten a bad rap, even though they stayed great for a while after LCDs came on the market.

        • timo_timboo@lemmy.world · 9 months ago

          “they were great for a while still, after LCDs came on the market”

          and they are still great, if not better. I’d take a high-end CRT over a modern LCD any day.

          • MrScottyTay@sh.itjust.works · 9 months ago

            I really wish there were still a market for new, modern CRTs. I’d have loved to see how that technology would’ve matured further.

    • Heavybell@lemmy.world · 9 months ago

      This is a big part of why I’m sticking to 1440p for as long as it’s a viable option. Not like my imperfect vision with glasses on would benefit from more PPI anyway.