I thought that 60 Hz was enough for most games, and that 120 or 144 Hz was better for shooters and other real-time games. However, it reaches a point where the human eye can’t tell the difference even if it tries.
Honestly, pushing the framerate too high is just a waste of GPU power and electricity.
A better way to look at this is frametime.
At 60 FPS/Hz, a single frame is displayed for 16.67ms. At 120 Hz, a single frame is displayed for 8.33ms. At 240 Hz, a single frame is displayed for 4.17ms. A difference of over 8ms per frame (60 vs 120) is quite noticeable for many people, and over 4ms (120 vs 240) is as well, but the impact is only half as much. So you hit diminishing returns pretty quickly.
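To make the diminishing returns concrete, here’s a quick sketch that computes the frame time for each refresh rate (frame time in ms is just 1000 divided by the rate) and how many milliseconds each doubling actually saves:

```python
# Frame time (ms) at common refresh rates, and the savings from each step up.
rates = [60, 120, 240, 480, 1000]
frame_ms = [1000 / r for r in rates]

for (r1, t1), (r2, t2) in zip(zip(rates, frame_ms), zip(rates[1:], frame_ms[1:])):
    print(f"{r1} -> {r2} Hz: {t1:.2f} ms -> {t2:.2f} ms (saves {t1 - t2:.2f} ms)")
```

Each doubling of the refresh rate saves only half as many milliseconds as the previous one, which is exactly why 60→120 is dramatic but 480→1000 barely moves the needle.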
Now I’m not sure how noticeable 1000 Hz would be to pretty much anyone as I haven’t seen a 1000 Hz display in action yet, but you can definitely make a case for 240 Hz and beyond.
It’s pretty easy to discern refresh rate with the human eye if one tries. Just move your cursor back and forth really quickly. The spacing between the ghost cursors in the trail it leaves behind (which btw only exist in perception by the human eye) is inversely proportional to the refresh rate: the higher the rate, the closer together the ghosts sit and the smoother the trail looks.
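A rough back-of-the-envelope version of the cursor test: at a fixed cursor speed, the on-screen gap between successive cursor images is speed divided by refresh rate. The speed used here (2000 px/s, a quick flick) is just an assumed figure for illustration:

```python
# Assumed cursor speed for a quick flick across the screen (hypothetical value).
cursor_speed_px_per_s = 2000

for hz in (60, 120, 240):
    # Gap between successive cursor positions = distance moved in one frame.
    gap_px = cursor_speed_px_per_s / hz
    print(f"{hz} Hz: ~{gap_px:.1f} px between ghost images")
```

At 60 Hz the ghosts sit roughly 33 px apart (clearly separate images), while at 240 Hz they're about 8 px apart, which reads as a near-continuous trail.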
Sure, but wasting double or triple the resources for that is not fine. There are very few games where it’s even a gain; outside of those super-competitive titles it doesn’t really matter.
Yeah I agree with you, but I was just refuting your claim that it’s not perceivable even if you try.