Do PC gamers feel 12GB of VRAM is simply not enough for the money in 2024?

21 points

I haven’t paid attention to GPUs since I got my 3080 on release day back in Covid.

Why has the acceptable level of VRAM suddenly doubled vs. four years ago? I don’t struggle to run a single game on max settings at high frame rates at 1440p. What’s the benefit that justifies the cost of 20GB of VRAM outside of AI workloads?

28 points

An actual technical answer: the PS5 and Xbox Series X are technically regular x86-64 machines, but they’re built around a single 16 GB pool of memory that the CPU and GPU share with no performance penalty. That makes it trivial to hand the GPU a shitload of RAM for textures. As the industry shifts from targeting the PS4/Xbox One first (which only had 8 GB of memory in total) to the PS5/XSX first, PC VRAM requirements are spiking, because it’s a lot easier to port to PC if you keep the assumption that the GPU can hold 10-15 GB of texture data at once than it is to refactor your streaming code to reduce VRAM usage.
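As a rough back-of-the-envelope check on that 10-15 GB figure (illustrative numbers only, not taken from any real game): a 4K texture block-compressed with BC7 (~1 byte per texel) plus its full mip chain comes out to roughly 21 MiB, so a few hundred of them resident at once lands right in that range.

```python
# Rough texture-memory estimate. The texture count and formats below are
# assumptions for illustration, not figures from an actual title.

def texture_bytes(side, bytes_per_texel, mips=True):
    """Bytes for a square texture; a full mip chain adds ~1/3 on top."""
    base = side * side * bytes_per_texel
    return base * 4 // 3 if mips else base

one_tex = texture_bytes(4096, 1)   # 4K texture, BC7 (~1 B/texel): ~21.3 MiB
budget = 500 * one_tex             # 500 such textures resident: ~10.4 GiB

print(f"{one_tex / 2**20:.1f} MiB per texture, {budget / 2**30:.1f} GiB total")
```

Uncompressed RGBA8 would be 4 bytes per texel (~85 MiB per 4K texture), which is why block compression and aggressive streaming matter so much on cards with less VRAM.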

3 points

Perfect answer thank you!

24 points

Lmao

We have your comment: “What am I doing with 20GB of VRAM?”

And one comment down: “It’s actually criminal there’s only 20GB of VRAM.”

4 points

Lol

10 points

Current gen consoles becoming the baseline is probably it.

As games running on last-gen hardware drop away, and expectations for games rise above 1080p, those Recommended specs quickly become an absolute minimum. Plus I think RAM prices have tumbled as well, meaning it’s almost Scrooge-like not to offer 16GB on a £579 GPU.

That said, I think the pricing is still much more of an issue than the RAM. People just don’t want to pay these ludicrous prices for a GPU.

9 points

I’m maxed on VRAM in VR for the most part with a 3080. It’s my main bottleneck.

5 points

If only game developers optimized their games…

The newest hardware is getting powerful enough that devs are banking on people just buying better cards to play their games.

4 points

GPU rendering and AI.

3 points

Perhaps not the biggest market, but consumer cards (especially Nvidia’s) have been the preferred hardware in the offline rendering space (i.e. animation and VFX) for a good few years now. They’re the most logical investment for freelancers and small-to-mid studios thanks to hardware ray tracing. CUDA and later OptiX may be a footnote on the gaming front, but they completely changed the game over here.

2 points

Personally I need it for video editing & 3D work but I get that’s a niche case compared to the gaming market.

