PowerColor has come up with an interesting use for neural processing units (NPUs) in modern CPUs. At Computex 2024, it displayed so-called “Edge AI” technology that pairs a graphics card with an NPU to lower power consumption in games.
The technology works by linking an AMD graphics card with a neural processing unit through the “PowerColor GUI” software, and the claimed efficiency gains are impressive. The manufacturer says Edge AI lowered power consumption in Cyberpunk 2077 from 263W to 205W, a 22% reduction. In Final Fantasy XV, the saving was a similarly notable 18%.
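The quoted percentages check out against the wattage figures. A quick sanity check (figures taken from the article):

```python
# Power figures PowerColor quoted for Cyberpunk 2077.
watts_before = 263
watts_after = 205

# Relative saving: (before - after) / before.
saving = (watts_before - watts_after) / watts_before
print(f"Cyberpunk 2077 power reduction: {saving:.0%}")  # → 22%
```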
But the benefit is not just energy efficiency, something hardcore PC gamers often dismiss as irrelevant when pushing frame rates to the limit. Visitors to the booth reported that PowerColor’s NPU-assisted software also increased frame rates by 10% compared with an otherwise identical system without a neural processing unit.
So, how does it actually work? “Linking” is too vague to explain anything.
The only thing I can imagine is some sort of upscaling from a lower resolution, which is hardly revolutionary.
My guess is that it’s similar to Intel XeSS, which is pretty much what that does: run the game at a lower resolution and use dedicated hardware to upscale it in real time.
https://game.intel.com/us/xess-enabled-games/
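If that guess is right, the pipeline would look roughly like this: render at a low internal resolution, then hand each frame to an upscaler before presenting it at native resolution. A minimal illustrative sketch, using nearest-neighbour scaling as a stand-in for whatever model the NPU would actually run (nothing here is PowerColor’s real code):

```python
def upscale_nearest(frame, scale):
    """Stand-in for an NPU upscaler: nearest-neighbour integer scaling.

    `frame` is a list of rows of pixel values; each pixel is repeated
    `scale` times horizontally and each row `scale` times vertically.
    """
    out = []
    for row in frame:
        wide = [px for px in row for _ in range(scale)]
        for _ in range(scale):
            out.append(list(wide))
    return out

# Render "at 2x2", present "at 4x4".
low_res = [[1, 2],
           [3, 4]]
for row in upscale_nearest(low_res, 2):
    print(row)
```

A real AI upscaler would replace `upscale_nearest` with a learned model that reconstructs detail instead of just duplicating pixels, but the power-saving logic is the same: the GPU renders fewer pixels per frame.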
The biggest difference this might bring is IF it can work with any game rather than just specific ones.
That’s what DLSS and AMD FidelityFX Super Resolution are too, by the way. DLSS does it with the Tensor cores on the Nvidia GPU; AMD and Intel do it through software.
So like FSR 1 but with AI upscaling. Does sound somewhat exciting.