PowerColor has come up with an interesting use for neural processing units (NPUs) in modern CPUs. At Computex 2024, it displayed so-called “Edge AI” technology that pairs a graphics card with an NPU to lower power consumption in games.

Edge AI works by linking an AMD graphics card to the neural processing unit via the “PowerColor GUI,” resulting in rather impressive efficiency gains. The manufacturer claims Edge AI lowered power consumption in Cyberpunk 2077 from 263 W to 205 W, a 22% reduction. In Final Fantasy XV, the result was a similarly impressive 18%.
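As a quick sanity check on the headline figure, the claimed drop from 263 W to 205 W does work out to roughly 22%:

```python
# Sanity check on PowerColor's claimed savings: 263 W down to 205 W.
def power_reduction_pct(before_w: float, after_w: float) -> float:
    """Percentage reduction in power draw."""
    return (before_w - after_w) / before_w * 100

print(round(power_reduction_pct(263, 205)))  # → 22
```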

And it is not just about energy efficiency, which hardcore PC gamers often dismiss as irrelevant when pushing frame rates to the limit. Visitors reported that PowerColor’s NPU-assisted software also increased frame rates by around 10% compared with a “dumb” system without a neural processing unit.

  • lemmylommy@lemmy.world · 6 months ago

    The thing works by linking an AMD graphics card with a neural processing unit via “PowerColor GUI,” resulting in rather impressive efficiency gains.

    So, how does it actually work? “Linking” is too vague to explain anything.

    The only thing I can imagine is some sort of upscaling from a lower resolution, which is hardly revolutionary.

    • tristan · 6 months ago

      My guess is it’s similar to Intel XeSS, since that’s pretty much what XeSS does: runs the game at a lower resolution and uses the NPU to upscale it in real time.

      https://game.intel.com/us/xess-enabled-games/

      The biggest difference that this might bring is IF it can work with any game rather than just specific ones

      • Whirlybird · 6 months ago

        That’s what DLSS and AMD super resolution are too, btw. DLSS does it with the Tensor cores on the Nvidia GPU; AMD and Intel do it through software.

  • mrfriki@lemmy.world · 6 months ago

    If this holds true then this is the first actually useful application of AI I’ve ever seen.

    • KairuByte@lemmy.dbzer0.com · 5 months ago

      Only if you’ve not paid attention… There have been AI models for identifying objects in images/video for use in home automation/security for quite a while, just to name one. There are also AI models that “learn” habits to curb power usage (though admittedly most implementations of this are dogshit).

      There are plenty of legitimately useful applications for AI models, shoehorning an LLM into everything and anything is just the most visible because it’s what every tech company and their brother is doing.

      • ashok36@lemmy.world · 5 months ago

        Yeah, I’ve been using a Google coral to identify people, cars, animals, etc in my security camera feeds for years now.

  • mindbleach@sh.itjust.works · 5 months ago

    In other words, interpolation. Guesswork. Hallucinated data, statistically correct-ish.

    And they’re offering this as some tertiary hardware add-on when both Nvidia and AMD have their own “AI” frame-fudging nonsense.
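If the upscaling speculation in the thread is right, the pipeline would look roughly like the sketch below. All names here are hypothetical — PowerColor has not published how Edge AI works, and this uses a trivial nearest-neighbour fill purely as a stand-in for whatever learned model the NPU would run:

```python
# Hypothetical sketch of low-res rendering + NPU upscaling.
# None of these names come from PowerColor's or Intel's actual APIs.

RENDER_SCALE = 0.5  # render at half resolution in each dimension

def render_frame(width: int, height: int) -> list:
    """Stand-in for the game's renderer: returns a width x height 'image'."""
    return [[0] * width for _ in range(height)]

def npu_upscale(frame: list, target_w: int, target_h: int) -> list:
    """Stand-in for the NPU model: nearest-neighbour upscaling as a
    placeholder for a learned super-resolution network."""
    src_h, src_w = len(frame), len(frame[0])
    return [[frame[int(y * src_h / target_h)][int(x * src_w / target_w)]
             for x in range(target_w)] for y in range(target_h)]

# The GPU renders only RENDER_SCALE**2 = 25% of the pixels (hence the power
# saving); the NPU reconstructs the rest.
low = render_frame(int(1920 * RENDER_SCALE), int(1080 * RENDER_SCALE))
full = npu_upscale(low, 1920, 1080)
print(len(full[0]), len(full))  # → 1920 1080
```

The efficiency claim follows directly from this split: rendering a quarter of the pixels costs the GPU far less power, and the fixed-function NPU fills in the rest more cheaply than the GPU could.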