Thanks, I hate it
I love my 1080ti (╥﹏╥) I really do…
lol i’m running the same card!
I know it’s old and there are newer and better ones out now, but it really is a fantastic card. It’s served me well for a long time, and I guess it will have to continue doing so. Keep chugging little guy ❤︎
The only reason I upgraded my 1080 Ti was because I’m a huge dummy and got a really expensive monitor that only had HDMI 2.1 and no DisplayPort, so my 1080 Ti couldn’t use G-Sync on it. So after overpaying for a monitor, I overpaid for a graphics card as well. Yay, sunk cost fallacy.
My system kicks ass now, though. Still can barely play Jedi Survivor.
That’s a great card though and if it does what you want/need then you are doing just fine.
Increased sales volume should be bringing prices down, not up. Perhaps the real problem is that we need more production capacity?
In the meantime, I’ll happily use an AMD card for gaming.
Limiting supply tends to increase prices. There’s no incentive to significantly increase supply when they can milk it for all it’s worth and push prices up between generations faster than inflation. Just look at the price increases after covid: prices have come down slightly, but there has been a permanent upward shift in GPU prices.
Just look at the price increases after covid.
It’s not really an apples-to-apples comparison. Most of the things people continue buying at prices inflated by covid issues and corporate greed are things they need, like food. Graphics cards are more of a luxury.
If GPU prices stay high, I simply won’t buy them. Even if I were to finally relent when my current one eventually dies, I would be buying them far less often than I otherwise would, meaning less profit for the sellers in the long term.
This sounds more like the prices would be going up because a new emerging hobby has a need for multiple extremely powerful GPUs the same way crypto mining did. That hobby just happens to be running AI at home instead of using cloud services. So there would be a drop in supply as a few hoarders snap up everything for that or to scalp at higher prices.
They just literally can’t make them fast enough to keep prices stable against the shitty kinds of consumers.
Yes, the problem is production capacity, but it’s very difficult to get that capacity up and running. For example, Intel started building 2 factories in Ohio last year. They won’t be up and running until at least 2025.
This stuff is complicated, and nobody predicted the rise of covid, cryptocurrencies, or AI. Or if they did, nobody was convinced enough to dedicate potentially billions of dollars to building capacity to capitalize on it.
Guess I’ll be hanging on to my 1060 for a bit longer… Been looking to upgrade for a while now but I just can’t justify the prices, especially in Canadian dollars.
GPUs tend to be cheaper/more affordable on AMD’s side. My local PC shop had decent deals on the RX6600XT and RX6750XT lately. For PC gaming only, they do the job.
If you don’t mind switching from Nvidia to AMD, I’d check them.
Yeah I don’t really do much with my desktop anymore outside of gaming so AMD would definitely be good enough. I just haven’t gotten lucky when I remember to check prices lol, I seem to miss a lot of the deals when they come up.
Would that be decent for 1440p at 100-120Hz?
I do run 1440p/120Hz on most of my games and I don’t even have the 6650XT (I own an ASRock PG 6600XT).
It would be more than enough for most titles at medium-to-high settings at 1440p/120Hz. Some recent games (yes, you, Diablo 4) will max out the 8GB of VRAM, so you’ll have to tinker with settings until you hit a sweet spot, but it manages 1440p.
Edit: as I wrote this reply, the card’s price increased to 400 CAD. Keep looking for deals, add the cards to your saved/watch list, and order when a deal shows up.
Awesome!! Really appreciate the recommendation!! I’ll definitely be keeping an eye out for it. I’d love to be able to finally move my 1060 to my unraid server for transcoding.
Got me an RX 6650 XT Mech from Amazon for just under $400 after tax and other bs (Canadian $$). Keep an eye on the card(s) you want and hold out till a sale. Also, I only looked at the MSI store, not the randos selling cards. I know lots can be trusted, but so many are just scammers, so I didn’t want to chance a problem.
I’m surprised we haven’t seen TPU cards (think Coral AI but at a larger scale) being made and sold for this purpose, especially if they’re faster and more energy efficient at AI-oriented tasks than GPUs.
Well, there is the Asus AI Accelerator card, but that’s just 8 Coral TPUs. I think the real reason we don’t see large TPUs is because Nvidia cards have had tensor cores built into the architecture since Volta, and with a GPU you don’t have to worry about system memory speed since you have 80GB of HBM.
Glad I snatched a used 3090 for like USD 600
I feel like this could be completely avoided if Nvidia would just make a reasonably priced, widely available specialized AI card with no video outputs or game features.
Currently their cards geared towards AI are insanely priced and not very attractive with such a poor price/performance ratio vs relatively cheap gaming cards.
And all of their insanely overpriced AI-dedicated cards are selling out at max production, leading to record earnings for Nvidia.
I wouldn’t expect them to upset that apple cart until AMD/Intel force their hand.
And prices haven’t even dropped to a level where I could afford a new GPU.
I guess I’ll keep sticking to what I have.
1050ti gang rise up
You poor bastard. My GF has that GPU. The only game she can max out is The Sims (but that’s all she plays, so it’s good enough for her). The 1050 Ti can’t even handle Wallpaper Engine beyond 15 FPS.
Max settings are pretty meaningless tbh, I don’t notice much difference in a lot of cases anyway. FPS is where it’s at.
And before that it was 750ti soooo
I guess that should also translate into possibly higher prices for the Mac Pro/Studio line? Or not? I don’t really know, but not worrying about GPU prices feels nice.
Apple’s neural engine cores aren’t as good as Nvidia’s tensor cores, which are considered “the best” hardware for AI (I don’t know what AMD does). Even the Mac Pro with PCIe slots can’t run GPUs, so that shouldn’t be an issue. Nobody bought Macs for mining crypto in 2021, so it’s less likely people will do it with AI.
Hmm, archive.is is blocking me with a constant captcha; passing it just reloads the block page again. It seems like their captcha system is broken or something.
Well, that’s counter-intuitive, since archive.org is all about preservation and availability of data. Here’s the full link: https://metro.co.uk/2023/08/04/pc-graphics-cards-to-get-more-expensive-again-thanks-to-ai-boom-19269496/
archive.is is privately funded by its owners, while archive.org is a registered non-profit. They are not the same.
How big do they think the AI market is going to be? It’s not going to compete with consumer demand for very long. Chips last for years, so once an AI chip is purchased it will be in use until the next-generation GPU arrives. So yes, the initial purchases may be predominantly expensive AI chips that will make AMD and Nvidia a boatload of cash. But that’s a finite market, and TSMC makes a lot of chips. Chips for industry have always been at the forefront of the cash cow that is GPU and CPU sales. Intel is also making an entrance at the low end.
I think I will be waiting a while. I have little interest in being gouged for no other reason than greed.
Yeah, they’re nuts if they think consumer-grade graphics cards are of any use to anyone seriously dealing with AI.
The biggest thing holding back cards right now, even the 4090, is VRAM. AI needs a ton of it, especially if you’re trying to train an AI.
More than likely, there will be more demand for those 48GB+ enterprise cards.
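To put rough numbers on that VRAM point: weights alone in fp16 take 2 bytes per parameter, while training with an Adam-style optimizer is often ballparked at around 16 bytes per parameter once gradients and optimizer states are counted. A quick back-of-the-envelope sketch (the bytes-per-parameter multipliers are common rules of thumb, not exact figures):

```python
def vram_gib(params_billions: float, bytes_per_param: float) -> float:
    """Rough GiB needed just to hold per-parameter values in GPU memory."""
    return params_billions * 1e9 * bytes_per_param / 2**30

# Inference: fp16 weights only (~2 bytes/param) for a 7B-parameter model.
print(round(vram_gib(7, 2), 1))   # ~13 GiB: tight even on a 16GB card

# Training: a common rule of thumb is ~16 bytes/param for mixed-precision
# training with Adam (weights + gradients + optimizer states).
print(round(vram_gib(7, 16), 1))  # ~104 GiB: beyond even a 48GB enterprise card
```

Which is roughly why the 48GB+ (and 80GB) cards are the ones in demand for training, while consumer cards mostly matter for inference on smaller models.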
But AI is becoming a consumer product too. It’s no longer just bots helping tech support, but products used by regular people (search in Bing with the help of ChatGPT, Copilot from GitHub helping developers write code, and so on). With increased usage it’ll need more GPUs to compute more answers. And I’m not sure which market is bigger.
I can wait a couple of generations till the next AI winter. Or until we have widespread dedicated AI chips; this graphics-card training needs to stop.