There might be a good reason for this. Raster effects were already really good in newer games, and ray tracing could only improve on that high bar. It fills in details that are barely noticeable, nudging the image ever so slightly closer to photorealism.
Old games start from a low bar, so ray tracing delivers a dramatic improvement.
It’s just because newer games have too much going on to ray trace effectively, so they have to use it in a very limited, hybrid manner. Very few games are fully ray traced.
Ray-traced Quake looks more like real video than a lot of those modern games do. They often look like some kind of theme park/old theater costume type of deal, with a lot of rubber, because the materials aren’t as good.
Yeah, for sure. Raytracing is very computationally intensive. It doesn’t make sense to do full-scene raytracing unless you have hardware that’s specifically designed for it. It works for something like Quake since none of the scenes are particularly complex, but obviously you don’t hit anything close to the same framerates as you would with raster rendering.
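To give a rough sense of why full-scene raytracing is so heavy, here’s some back-of-the-envelope arithmetic. The resolution, sample count, and bounce depth below are illustrative assumptions, not benchmarks from any particular game:

```python
# Rough, illustrative arithmetic: how many rays does full-scene
# path tracing ask for at 1080p / 60 fps?
width, height = 1920, 1080     # render resolution (assumption)
samples_per_pixel = 1          # even 1 sample/pixel is noisy without denoising
bounces = 2                    # e.g. one shadow ray + one reflection ray (assumption)

# Each pixel fires a primary ray plus `bounces` secondary rays.
rays_per_frame = width * height * samples_per_pixel * (1 + bounces)
rays_per_second = rays_per_frame * 60

print(rays_per_frame)   # 6220800 rays per frame
print(rays_per_second)  # 373248000 -- hundreds of millions of rays/sec
```

And every one of those rays has to be intersected against the scene, which is why scene complexity (not just resolution) is what kills you; a Quake-era level keeps that intersection step cheap.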
Imo it has less to do with photorealism vs non-photorealism and more to do with PBR (physically based rendering) vs non-PBR. PBR attempts to recreate photorealistic graphics by adding additional texture maps (typically metallic/roughness or specular/glossiness) to allow for everything from glossiness and reflectivity to refraction and sub-surface scattering. The result is that PBR materials tend to have little to no noticeable difference between PBR-enabled renderers, so long as they share the same maps.
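As a sketch of what the metallic/roughness workflow actually encodes, here’s the common convention for deriving a surface’s base specular reflectance (F0) from its albedo and metallic maps. The function names are mine; the 4% dielectric constant is the standard convention used by metallic/roughness PBR pipelines:

```python
def lerp(a, b, t):
    """Linear interpolation between a and b."""
    return a + (b - a) * t

def base_reflectivity(albedo, metallic):
    """Per-channel specular reflectance at normal incidence (F0).

    The usual metallic/roughness convention: dielectrics reflect
    roughly 4% white light regardless of albedo, while metals
    tint their reflections by their albedo color.
    """
    dielectric_f0 = 0.04
    return tuple(lerp(dielectric_f0, c, metallic) for c in albedo)

# Red plastic (metallic = 0) keeps the fixed 4% dielectric specular:
print(base_reflectivity((0.8, 0.1, 0.1), 0.0))  # (0.04, 0.04, 0.04)
# A gold-ish metal (metallic = 1) reflects its own albedo:
print(base_reflectivity((1.0, 0.77, 0.34), 1.0))  # ~ (1.0, 0.77, 0.34)
```

This is why materials travel well between PBR renderers: the maps describe physical quantities, so any engine that follows the convention gets roughly the same answer.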
Non-PBR renderers, however, tend to be more inaccurate and to have visual quirks or “signatures”. For example, to me everything made in UE3 tends to have a weird plasticky look to it, while metals in Skyrim tend to look like foam cosplay weapons. These games can significantly benefit from raytracing because it’d involve replacing the non-PBR renderer with a PBR renderer, which is a significant upgrade in visual quality by itself. Throw in raytracing and you get beautiful shadows, specular highlights, reflections, and so on in a game previously incapable of them.
Cyberpunk is a good example of gorgeous raytracing: https://www.youtube.com/watch?v=3pkuU0cGQu8
The problem is that proper raytracing is way too heavy for most machines, so game devs don’t bother. The Cyberpunk example on max graphics would need an RTX 4090 just to run at over 60 fps. There’s no point in pushing tech that nobody can run yet.
Raytracing on older games looks great because those games weren’t intensive to run in the first place, so developers can get away with maximizing raytracing while still hitting good framerates.
Control also did a fantastic job. At one point the reflections on the glass are almost too good: there’s a puzzle where you need to look through some windows to solve it, and I couldn’t see what the hell was going on because of the reflections, so I had to turn RTX off. It was otherwise great, and I think the difference is dramatic.
I’ve been playing Cyberpunk on an RX 7900 XT at basically maxed-out graphics, with only some of the raytracing reduced to get a consistent 60 fps. The game looks stupid good. But the raytracing is only for shadows and reflections, and it still has a massive impact on performance, though I know my GPU isn’t as effective at raytracing as an Nvidia card would be.
Like the other reply mentions, Control also looks great with raytracing on, but the scale is not the same as Cyberpunk, so the framerates don’t suffer as much.
I think the tech industry misses the good old days when the upgrade was noticeable and exciting. Now it’s just a big media blitz and the upgrades are not that noticeable. Lol, nothing like watching a 30-minute video where they flip back and forth. “Oh, I can see the difference.”
I totally agree. GLQuake improved Quake a hundredfold. RT Quake did the same all over again.