• PenguinTD@lemmy.ca · 1 year ago

    I don’t get where this “raw pixels are the best pixels” sentiment comes from. Judging from the thread, everyone has their own opinion but hasn’t actually looked into why people build upscalers in the first place. Well, bad news for you: games have been using virtual pixels for all kinds of effects for ages, and your TV upscales its broadcast signal too (4K broadcasts still aren’t that common).

    I play Rocket League with FSR upscaling from 1440p to 2160p, and it looks practically the same as native 2160p AND it feels more visually pleasing, since the upscale also acts as an extra AA filter that smooths and sharpens “at the same time”. Frame rate is pretty important for this kind of upscaler tech (and features like distance field AO), since many of these techniques rely on information from previous frames as well.
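
    Quick back-of-the-envelope math (just pixel counts, nothing engine-specific) on how much shading work that 1440p → 2160p upscale saves per frame:

    ```python
    # Pixels shaded per frame: 1440p internal render upscaled to 4K vs. native 4K.
    native = 3840 * 2160      # native 4K output
    internal = 2560 * 1440    # resolution actually rendered before the upscale

    print(f"native 4K:    {native:,} px/frame")
    print(f"1440p + FSR:  {internal:,} px/frame")
    print(f"shading cost: {internal / native:.0%} of native")  # ~44%
    ```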

    Traditionally, when the GPU had more power than the engine demanded, engines did it the dumb way: let you render at something like 4x resolution and then downscale for AA. Sure, it looks nice and sharp, BUT it’s a brute-force approach, and plenty of follow-up AA techniques proved more useful for gamedev; upscaler tech is the same. It’s not meant for you to render at 320x240 and upscale all the way to 4K or 8K; it paves the way for better post-processing and lighting tech like Lumen or raytracing/pathtracing to actually become usable in games with a decent “final output”. (Remember PS4 Pro’s checkerboard 4K? That was a genuinely good technique for getting around the PS4 Pro’s hardware limits in more demanding games.)
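
    To put rough numbers on the “brute force” point, here’s the shaded-pixel budget per 4K output frame for each approach (ballpark factors only: 4x supersampling renders twice the pixels per axis, checkerboard shades roughly half per frame; this isn’t profiler data):

    ```python
    # Shaded pixels per 4K output frame, using ballpark scale factors.
    output = 3840 * 2160

    approaches = {
        "4x supersample then downscale": output * 4,   # 2x per axis
        "native 4K":                     output,
        "checkerboard 4K (PS4 Pro)":     output // 2,  # ~half shaded per frame
    }

    for name, pixels in approaches.items():
        print(f"{name:30s} {pixels:>12,} px ({pixels / output:.0%} of native)")
    ```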

    In the end, consumers vote with their wallets for nicer-looking games all the time, and that’s what drives developers toward photoreal/feature-film-quality rendering. Plenty of studios still go for stylized or pixel-art looks, and everyone flips their shit and praises them, but that tech mostly rides on the hardware advances pushed by the photoreal approach; they use the same pipeline, just tuned to reach their desired look. Octopath Traveler II was built on Unreal Engine.

    Game rendering is always about trade-offs. We’ve come a LONG way and will keep pushing boundaries. Will upscaler tech become obsolete somewhere down the road? I have no idea; maybe AI will just generate everything at native pixels, right?

    • miss_brainfart@lemmy.ml · 1 year ago

      I don’t have anything against upscaling per se, in fact I am surprised at how good FSR 2 can look even at 1080p. (And FSR is open source, at least. I can happily try it on my GTX 970)

      What I hate about it is how Nvidia uses it as a tool to price gouge harder than they ever have.

      • NineSwords@lemmy.ml · 1 year ago

        To me, FSR2 always looks like shit. I use it when playing on my SD or Ally and the results always look horrible.

        • miss_brainfart@lemmy.ml · 1 year ago

          I mean, I didn’t say it looked great or anything. Just better than I expected.
          But of course my expectations were extremely low when I saw so many comments like yours, so I was actually pleasantly surprised with what it can do for what it is.

          Though to be fair to the Deck, the native resolution is already so low that there isn’t a whole lot FSR can work with.

      • PenguinTD@lemmy.ca · 1 year ago

        Well, don’t buy NV cards then. I switched and actually feel I got my money’s worth (6800 XT).

        • miss_brainfart@lemmy.ml · 1 year ago

          My next card will be AMD, but that doesn’t change the fact that Nvidia is the biggest authority in this market. They do whatever they want, and AMD doing their best to only be slightly worse isn’t helping.

            • PenguinTD@lemmy.ca · 1 year ago

            Nvidia uses their investors’ dollars really efficiently, which is what led them to today’s dominance, but it also makes them act like a bully toward their business partners (like EVGA; who knows how other vendors are being treated).

            Some of the early investments that pushed their dominance in CUDA:

            • NV directly funds research and provides equipment for accelerated computing (both graphics and non-graphics). In return, researchers become really familiar with CUDA and their results feed back into CUDA’s design/driver/compiler. The AI-training side eventually led to tensor cores.
            • NV then uses that to help software developers integrate CUDA-accelerated applications: GPU renderers, GPU simulation, GPU deep learning, GPU denoisers, GPU video encoding.
            • NV also helps game developers implement or integrate tech like RTX and DLSS, or earlier ones like HairWorks/PhysX, plus those notorious game-specific driver enhancements, i.e. they basically work with the game and have ways to set driver-side parameters per title. That collaboration is also what leads to GeForce Experience’s “auto best quality settings for your PC” feature.
            • They also make CUDA-only cards for number crunching in data centers.
            • All of the above means that when you’re making a purchase and not just playing games, if your work software uses those CUDA features, the most cost-efficient choice is to buy NV.

            The business plan and its results form a positive feedback cycle. The crypto surge in sales and investment money was a bonus, but Nvidia did put it to good use, and the plan above makes more investors willing to pump money into NV. There’s no better business than a monopoly business.

            Then something happened on the consumer end; I don’t know exactly when or why, but they started cranking up the prices of their flagship GPUs. People would say, “Dude, their used GPUs were selling for 3x~5x over MSRP during crypto, why wouldn’t they just raise prices and take all that revenue themselves?” That may be “part” of the reason, but I think they were probably testing the waters on both fronts (their data-center number-crunching cards were way, way more expensive than even the top-tier consumer cards). They took the chance, used the global chip shortage and other “valid reasons” to raise prices, and then watched how the market responded; now they have about two generations’ worth of “price gouging” market data to set their prices properly (plus the door-in-the-face effect). Note that big manufacturers sign component deals in years, not quarters; the chip shortage might hit other sectors hard, say laundry machines, but for NV you can bet your ass their supply is top priority.

            They did lose out on the console front, and as many have already mentioned, NV’s CEO no longer has much passion for pushing game tech; he’s all in on AI now. Depending on where they aim the business, their gaming GPU side may not do anything really worth mentioning until AMD can put up a serious threat.

      • Whirlybird · 1 year ago

        Nvidia can get away with it because DLSS is much better than FSR, since it has dedicated hardware behind it.