Well I am shocked, SHOCKED I say! Well, not that shocked.
The good games don’t need a high-end GPU.
Problem is preordering has been normalized, as has releasing games in pre-alpha state.
It doesn’t help that the gains have been smaller, and the prices higher.
I’ve got an RX 6800 I bought in 2020, and nothing but the 5090 is a significant upgrade, and I’m sure as fuck not paying that kind of money for a video card.
I don’t think they’re actually expecting anyone to upgrade annually. But there’s always someone due for an upgrade, however long it’s been for them. You can compare what percentage of users upgraded this year to previous years.
I just finally upgraded from a 1080 Ti to a 5070 Ti. At high refresh-rate 1440p the 1080 Ti was definitely showing its age and certain games would crash (even with no GPU overclock). Fortunately I was able to get a PNY 5070 Ti for only ~$60 over MSRP at the local Microcenter.
5000 series is a pretty shitty value across the board, but I got a new job (and pay increase) and so it was the right time for me to upgrade after 8 years.
“When did it just become expected that everybody would upgrade GPUs every year and that’s supposed to be normal?” - that’s a really good question, because I don’t think normal PC gamers have ever been like that, and they still aren’t. It’s basically part of the culture to stretch your GPU as long as it’ll go, so idk who you’re complaining about. Yeah, GPU prices are bullshit rn, but let’s not make up stuff.
Nah, there was a time when you’d get a new card every two years and it’d be twice as fast for the same price.
Nowadays the new cards are 10% faster for 15% more money.
I bought a new card last year after running a Vega 64 for ages and I honestly think it might last me ten years because things are only getting worse.
When did it just become expected that everybody would upgrade GPUs every year and that’s supposed to be normal?
Somewhere around 1996 when the 3dfx Voodoo came out. Once a year was a relatively conservative upgrade schedule in the late 90s.
It’s never been normal to upgrade every year, and it still isn’t. Every three years is probably still more frequent than normal. The issue is there haven’t been reasonable prices for cards for like 8 years, and it’s only gotten worse recently. People who are “due” for an upgrade aren’t upgrading because it’s unaffordable.
If consoles can last 6-8 years per gen so can my PC.
Your PC can run 796 of the top 1000 most popular games listed on PCGameBenchmark - at a recommended system level.
That’s more than good enough for me.
I don’t remember exactly when I built this PC but I want to say right before covid, and I haven’t felt any need for an upgrade yet.
For the price of one 5090 you could build 2-3 midrange gaming PCs lol. It’s crazy that anyone would even consider buying it unless they’re rich or actually need it for something important.
Somehow 4k resolution got a bad rep in the computing world, with people opposing it for both play and productivity.
“You can’t see the difference at 50cm away!” or something like that. Must be bad eyesight I guess.
4K is an outrageously high resolution.
If I were conspiratorial, I’d say 4K was normalized as the next step above 1440p in order to create demand for many generations of new graphics cards, because it was introduced long before there was hardware able to drive it without serious compromises. (I don’t actually think it’s a conspiracy though.)
For comparison, 1440p has 78% more pixels than 1080p. That’s quite a jump in pixel density and required performance.
4K has 125% more pixels than 1440p (300% more than 1080p). The step up is massive, and the additional performance required is as well.
Now there is a resolution that we are missing in between them. 3200x1800 is the natural next step above 1440p*. At 56% more pixels it would be a nice improvement, without an outrageous jump in performance. But it doesn’t exist outside of a few laptops for some reason.
*All these resolutions are multiples of 640x360. 720p is 2x, 1080p is 3x, 1440p is 4x, and 4K is 6x. 1800p is the missing 5x.
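If anyone wants to double-check the math, here’s a throwaway Python sketch (the names and dimensions are just the standard 16:9 resolutions, nothing official):

```python
# Pixel counts and relative jumps for common 16:9 resolutions.
RES = {
    "720p":  (1280, 720),   # 2x 640x360
    "1080p": (1920, 1080),  # 3x
    "1440p": (2560, 1440),  # 4x
    "1800p": (3200, 1800),  # 5x -- the missing step
    "4K":    (3840, 2160),  # 6x
}

def pixels(name):
    w, h = RES[name]
    return w * h

def pct_more(a, b):
    """Percent more pixels resolution a has than resolution b."""
    return 100 * (pixels(a) / pixels(b) - 1)

print(f"1440p vs 1080p: +{pct_more('1440p', '1080p'):.0f}%")  # +78%
print(f"4K vs 1440p:    +{pct_more('4K', '1440p'):.0f}%")     # +125%
print(f"4K vs 1080p:    +{pct_more('4K', '1080p'):.0f}%")     # +300%
print(f"1800p vs 1440p: +{pct_more('1800p', '1440p'):.0f}%")  # +56%
```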
unless they’re rich or actually need it for something important
Fucking youtubers and crypto miners.
Crypto mining with GPUs is dead; the only relevant mining uses ASICs now, so it would be more accurate to say:
Fucking youtubers and AI.
I bought a secondhand 3090 for £750 when the 40 series came out. I really don’t need to upgrade. I can even run the bigger AI models locally since I have a huge amount of VRAM.
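For anyone curious how far 24GB actually goes, here’s the rough back-of-envelope sketch I use (my own approximation: weights × bytes per weight plus ~20% overhead; it ignores KV cache and context length, so treat the numbers as ballpark, not benchmarks):

```python
# Ballpark VRAM needed just to hold a model's weights.
# Assumption: bytes ~= params * (bits / 8) * overhead. Ignores KV cache,
# activations, and context length -- real usage will be somewhat higher.

def vram_gb(params_billions: float, bits_per_weight: int, overhead: float = 1.2) -> float:
    bytes_total = params_billions * 1e9 * (bits_per_weight / 8) * overhead
    return bytes_total / 1024**3

CARD_GB = 24  # e.g. a 3090

for params, bits in [(13, 16), (13, 4), (30, 4), (70, 4)]:
    need = vram_gb(params, bits)
    verdict = "fits" if need <= CARD_GB else "doesn't fit"
    print(f"{params}B @ {bits}-bit: ~{need:.1f} GB -> {verdict} in {CARD_GB} GB")
```

So a ~30B model quantized to 4-bit sits comfortably on a 3090, while 70B is out of reach without offloading.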
Games run great and look great. Why would I upgrade?
I’m waiting to see if Intel or AMD come out with something awesome over the next few years. I’m in no rush.
Paying Bills Takes Priority Over Chasing NVIDIA’s RTX 5090
Yeah no shit, what a weird fucking take
Unfortunately gamers aren’t the real target audience for new GPUs, it’s AI bros. Even if nobody buys a 4090/5090 for gaming, they’re always out of stock as LLM enthusiasts and small companies use them for AI.
Ex-fucking-actly!
Hahaha, gamers are skipping. Yeah, they are. And yet the 5090 is still somehow out of stock, no matter the price or the state of gaming. We all know big tech went all-in on AI, regardless of whether the average Joe wants it or not. The prices are not for gamers. The prices are for whales, AI companies, and enthusiasts.
The 5090 is kinda terrible for AI actually. It’s too expensive. It only just got support in PyTorch, and if you look at ‘normie’ AI bros trying to use them online, shit doesn’t work.
4090 is… mediocre because it’s expensive for 24GB. The 3090 is basically the best AI card Nvidia ever made, and tinkerers just opt for banks of them.
Businesses tend to buy RTX Pro cards, rent cloud A100s/H100s or just use APIs.
The server cards DO eat up TSMC capacity, but the insane 4090/5090 prices are mostly Nvidia’s (and AMD’s) fault for literally being anticompetitive.