The performance improvement claims are a bit shady, as they compare the old FG technique, which only creates one frame for every legit frame, with the next-gen FG, which can generate up to 3.
All the Nvidia performance plots I've seen only mention this at the bottom, making the comparison look very favorable to the 5000 series GPUs.
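Rough back-of-the-envelope math on why that framing flatters the new cards (these fps numbers are made up purely for illustration, not real benchmark results):

```python
# Toy comparison: displayed fps = rendered frames + AI-generated frames.
# All numbers here are hypothetical, just to show how the multiplier skews things.

def displayed_fps(rendered_fps, generated_per_rendered):
    """Frames shown on screen per second, real plus generated."""
    return rendered_fps * (1 + generated_per_rendered)

old_card_render = 60   # assumed raw render rate on the old card
new_card_render = 70   # assumed raw render rate on the new card (~17% faster)

old_fg = displayed_fps(old_card_render, 1)  # old FG: 1 generated per real frame -> 120 "fps"
new_fg = displayed_fps(new_card_render, 3)  # new FG: up to 3 generated per real frame -> 280 "fps"

print(f"Raw render speedup:      {new_card_render / old_card_render:.2f}x")
print(f"Marketing slide speedup: {new_fg / old_fg:.2f}x")
```

A modest raw uplift turns into a huge-looking bar on the chart once each card runs its own maximum frame-gen multiplier.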
Edit:
Thanks for the heads up.
I really don't like that new frame interpolation tech and think it's almost only useful to marketers, not for actual gaming.
At the very least, I wouldn't touch it in any competitive game.
Hopefully we'll get third-party benchmarks soon, without the bullshit performance numbers from Nvidia.
From personal experience, I'd say the end result with framegen is hit or miss. In some cases you get a much smoother framerate without any noticeable downsides, and in others your frame times are all over the place and it makes the game look choppy. For example, I couldn't play CP2077 with framegen at all. I had more frames, but it felt like I actually had fewer. With Ark Survival Ascended, I'm not seeing any downside and it basically doubled my framerate.
Upscaling, I’m generally sold on. If you try to upscale from 1080p to 4K, it’s usually pretty obvious, but you can render at 80% of the resolution and upscale the last 20% and get a pretty big framerate bump while getting better visuals than rendering at 100% with reduced settings.
That said, I would rather have better actual performance than just perceived performance.
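The reason that 80% trick pays off so well is that the scale applies per axis, so the GPU shades far fewer pixels than it sounds like. A quick sketch of the pixel math (4K output assumed; the framerate comment is only a rough rule of thumb):

```python
# Pixel-count math behind "render at 80% and upscale the rest".
# Output resolution and scale factor are assumptions for illustration.

native_w, native_h = 3840, 2160   # 4K output
scale = 0.80                      # internal render scale per axis

render_w, render_h = int(native_w * scale), int(native_h * scale)
pixel_ratio = (render_w * render_h) / (native_w * native_h)

print(f"Internal render: {render_w}x{render_h}")
print(f"Pixels actually shaded: {pixel_ratio:.0%} of native")
# ~64% of the pixels, so in GPU-bound scenes you can expect a noticeable
# framerate bump while the upscaler reconstructs the remaining detail.
```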
Eh, I'm pretty happy with the upscaling. I did several tests, and for me personally upscaling won out as a happy middle ground: rendering Hunt Showdown at 4K with upscaling, versus 2K with great FPS and no upscaling, or 4K with no upscaling but bad FPS.
I wouldn't say fuck upscaling entirely; especially at 4K it can be useful on older cards. FSR made it possible to play Hitman on my 1070. But yeah, if I'm going for 4K I probably want very good graphics too, e.g. in RDR2, and I don't want any upscaling there. I'm so used to native 4K that I immediately spot if it's anything else - even in Minecraft.
And frame generation is only useful in non-competitive games where you already have over 60 FPS; otherwise it will still feel extremely sluggish, in which case it's not really useful anyway.
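The latency side is roughly why that 60 FPS floor matters: generated frames raise the framerate you see, but your input is only reflected in the frames that are actually rendered. A back-of-the-envelope sketch (assumes a 4x frame-gen multiplier and ignores the extra queuing latency frame generation adds on top):

```python
# Illustrative only: displayed smoothness vs. input responsiveness with framegen.

def frame_times_ms(base_fps, fg_multiplier):
    real = 1000 / base_fps                      # interval between frames that reflect your input
    shown = 1000 / (base_fps * fg_multiplier)   # interval between frames you actually see
    return real, shown

for base in (30, 60, 120):
    real, shown = frame_times_ms(base, 4)       # assumed 4x multi-frame generation
    print(f"{base:>3} fps base -> looks like {base * 4} fps, "
          f"but input still only updates every ~{real:.0f} ms")
```

At a 30 fps base it can look like 120 fps while still feeling like 30, which is the sluggishness people notice in competitive games.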
Maybe I'm stuck in the last decade, but these prices seem insane. I know we've yet to see what a 5050 (lol) or 5060 would be capable of, or at what price point. But launching with $549 as your cheapest card means a significant chunk of the consumer base won't be able to buy any of these.
Sadly I think this is the new normal. You could buy a decent GPU, or you could buy an entire game console. Unless you have some other reason to need a strong PC, it just doesn’t seem worth the investment.
At least Intel is trying to keep their prices low - until they either catch on, in which case they'll raise prices to match, or they fade out and leave everyone with unsupported hardware.
Actually, AMD has said they're ditching their high-end options and will focus on budget and midrange cards instead. AMD has also promised better raytracing performance (compared to their older cards), so I don't think this will be the new norm if AMD prices their cards competitively with Intel. The high-end cards will be overpriced, since the target audience apparently doesn't care that they're paying a shitton of money. But the budget and midrange segments might slip away from Nvidia and get cheaper, especially if the upscaler crutch breaks and devs have to start actually optimizing their games.
Actually AMD has said they’re ditching their high end options
Which means there’s no more competition in the high-end range. AMD was lagging behind Nvidia in terms of pure performance, but the price/performance ratio was better. Now they’ve given up a segment of the market, and consumers lose out in the process.
They'll sell out anyways due to the lack of good competition. Intel is getting there but still has driver issues, and AMD hasn't announced their GPU prices yet, but their entire strategy is following Nvidia and lowering the price by 10% or something.
Weird, completely unrelated question: do you have any idea why you write "anyway" as "anyways"?
It's not just you, it's a lot of people, but unlike most grammar/word modifications it doesn't really make sense to me. Most of the time a modification shortens the word in some way rather than lengthening it. I could be wrong, but I don't remember people writing or saying "anyway" with an added "s" in any way other than ironically 10-15 years ago, and I'm curious where it may be coming from.
I guess I'm just used to saying it, since I spent a long time not knowing it was the wrong pronunciation.
"By rendering only 25% of the frames, we made DLSS4 100% faster than DLSS3, which only renders 50% of the frames!" - NVIDIA, unironically
You're living in the past; rendering 100% of the frames is called Brute Force Rendering, and that's for losers.
With only 2k trump coins, our new graphics card can run Cyberpunk 2077, a game from 4 years ago, at 30 fps with RTX ON, but you see, with DLSS and all the other crap magic we can run it at 280 FPS!!! Everything is blurry and ugly as fuck, but look at the numbers!!!
LOL, their demo shows Cyberpunk running at a mere 27fps on the 5090 with DLSS off. Is that supposed to sell me on this product?
The 4090 gets like sub-20 fps without DLSS and all that. Seems like a good improvement.
This is absolutely 3dfx levels of screwing over consumers: it's all about faking frames to get their "performance" numbers.
They aren’t making graphics cards anymore, they’re making AI processors that happen to do graphics using AI.