Crypto, followed by NFTs, followed by LLMs… The GPU market has been fucked for years now. Whenever something starts to drop off, another tech bro idea that requires 10,000 GPUs to process takes its place.
Truly just the brute force solution. Need a shitload of compute? GPUs can do it! No one stops to think if we really need it. It’s all about coulda, not shoulda. Yeah, ML and AI have a place, but big tech just thinks “slap an LLM everywhere”. Just such horseshit
I just kept an eye on Micro Center’s refurbished cards for a few weeks and was able to snag a 3090Ti late last year with a 3-yr warranty for the same price I paid for a 980Ti in 2015.
I think that might be my plan too, but I’m still waiting on a paycheck or two before I even monitor the situation. My 2070 is fine, and ultimately I just want to pass it down to a spare PC for the kids to mess around on as my oldest hits 3. I know by the time I hit 5 I was playing shit like Dune 2, admittedly with hacked save files my dad set up.
The 9070s on eBay are getting cheaper and cheaper the further we get from the launch. I think scalpers underestimated AMD’s stock and they are slowly discovering that.
Immediately after the launch the XT seemed to be starting at $1,200. Now they are down to $800. The non-XT is down to $650.
Depends on how much stock AMD can provide in the coming weeks and months, but I’m still thinking I’ll be able to get one at MSRP this year.
I’m not sure where you are seeing this, but on eBay they are still showing a ton of cards from $1,200–1,800.
This shit is a problem and only the apathetic retailers can fix it.
Being bought, or being sold? There’s a difference. If they’re not being bought at those numbers, they’ll still show up the most.
They also said “starting”, which implies that’s what it’s being sold at, not what you see the most listings for.
I still see them more around €800–900, which is more than what I paid for a 7900 XT, wtf
I’m having a good time on a laptop with no fancy graphics card and have no desire to buy one.
I also do not look for super high graphical fidelity, play mostly indies instead of AAA, and am like 5 years behind the industry, mostly buying old gems on sale, so my tastes probably enable this strategy as much as anything else.
Modern high end iGPUs (e.g. AMD Strix Halo) are going to start replacing dGPUs in the entry and mid-range segments.
I’ll be honest, I have never paid attention to GPUs and I don’t understand what your comment is trying to say or (this feels selfish to say) how it applies to me and my comment. Is this intended to mostly be a reply to me, or something to help others reading the thread?
Laptops (and desktops) with no dedicated GPUs will become increasingly viable, and not just for older games. This was a general comment. :)
Thank you for explaining!
Your laptop uses an iGPU. The “i” stands for integrated, as it’s built into the same package as the CPU.
The alternative, a dGPU, is a discrete part, separate from other components.
They’re saying that your situation is becoming increasingly common. People can do the gaming they want to without a dGPU more easily as time goes by.
Thank you for explaining! I am not sure why people are reacting badly to my statement. Is knowledge of GPUs something every gamer is expected to have, and am I violating the social contract by being clueless?
Well at one point to be a computer gamer you basically needed to put together your own desktop PC.
Integrated GPUs basically were only capable of displaying a desktop, not doing anything a game would need, and desktop CPUs didn’t integrate graphics at all, generally.
So computer-building knowledge was a given. If you were a PC gamer, you had a custom computer for the purpose.
As a result, even as integrated GPUs became better and more capable, the general crowd of gamers didn’t trust them, because it was common knowledge they sucked.
It’s a lot like how older people go “They didn’t teach you CURSIVE?” about schools nowadays. Being a gamer and being a PC builder are fully separable now, but they learned PC building back when they weren’t, and therefore think you should have that knowledge too.
It’s fine, don’t sweat it. You’re not missing out on anything, really, anyway. Especially given the current GPU situation, it’s never been a worse time to be a PC builder or enthusiast.
Oh boy. Thanks for the context, by the way! I did not know that about the history of PC gaming.
I did learn cursive, but I have been playing games on laptops since I was little too and was never told I had to learn PC building. And to be completely honest, although knowledge is good, I am very uninterested in doing that especially since I have an object that serves my needs.
I have the perspective to realize that I have been on the “other side” of this, the WHAT DO YOU MEAN YOU’RE SATISFIED, LEARN MORE AND CHANGE TO BE LIKE US side, although I’m exaggerating because I don’t actually push others to take on my decisions. I don’t spam the uninterested to come to Linux, but I do want people who get their needs adequately served by Windows to jump to Linux anyway, because I want to see Windows 11, with even more forced telemetry and shoved-in AI and things just made worse, fail. Even though that would actually be more work for satisfied Windows users.
But I would not downvote a happy Windows user for not wanting to switch, and that kind of behavior is frowned upon. Is it just more acceptable to be outwardly disapproving toward those who do not know about GPUs and are satisfied with what they have, with zero desire to upgrade? Do I not have Sufficient Gamer Cred and am being shown the “not a Real Gamer” door? I think my comment was civil and polite, so I really don’t understand the disapproval. If it is just “not a Real Gamer” I’ll let it roll off my back, though I did think the Gaming community on Lemmy was better than that… I would understand the reaction if I rolled up to c/GPUs with “I don’t care about this :)” and got downvoted. Is Gaming secretly kind of also c/GPUs and I just did not know that?
Okay I literally just realized it is probably because I hopped on a thread about GPUs and do not know about the topic being posted about. Whoops. Sorry.
Same here, have never owned a graphics card in my life. When I occasionally do want a modern game it doesn’t need to be 200FPS RTX.
I’ve been using mini PCs with integrated graphics (and one with a laptop-class GPU) instead of desktops and see no reason to stop.
Funny thing about AMD is the MI300X is supposedly not selling well, largely because they price gouge everything as badly as Nvidia, even where they aren’t competitive. Other than the Framework desktop, they are desperate to stay as uncompetitive in the GPU space as they possibly can, and not because the hardware is bad.
Wasn’t the Intel B580 a good launch, though? It seems to have gotten rave reviews, and it’s in stock, yet has exited the hype cycle.
I looked for months for a B580 for my wife’s PC. Couldn’t get one in stock for MSRP during that time. Finally caved and grabbed a 6900 XT for $400 used. The Intel cards are awesome, if you can get one. I do hope Intel keeps up the mid-range offerings at sane prices.
Intel B580 still showing out of stock everywhere I normally look.
Well the B580 is a budget / low-power GPU. All the discussions going around are for flagship and high-end GPUs. Intel isn’t in that space yet, but we can hope they have a B7xx or B9xx lined up which makes some waves.
Probs the only reason for many to buy a console these days. For the cost of a high end GPU you can get an entire system and some games.
It’s pretty okay if you’re like me, i.e. have no needs above full HD res and can either take or leave RTX.
It should still be better. Or at least cheaper.