Or we could just return to GPU series where consumer models don’t even go anywhere near that 600W figure and simply use one or two 8-pin PCIe power connectors and call it a day.
Consuming this much power for playing a game (and yes, that’s what most people use these cards for) is just silly.
We went through a phase of high-power processors years ago with the Pentium 4 series and its offshoots, then things started getting more efficient instead. I wonder if we will soon see the same for GPUs or if we’ll be stuck on a high-power plateau for a long time.
I don’t see any route to more efficiency. The Pentium 4 was architecturally “broken”, and Intel’s step back from that design is what made the Core models more efficient. There isn’t really an equivalent move available to Nvidia.
People keep buying the things, so there is no incentive to go smaller.