Have you compared NES games on a CRT with the same games on a modern screen?
CRTs just look miles better.
EDIT: OK, it’s ackchually not technically “resolution” per se, I get it. :p
That’s because the graphics were tailored to CRT resolution - which is to say, to what just so happened to be low/outright bad resolution.
CRTs have advantages over more modern displays, but those are mostly about latency.
It’s not as much about resolution as about exploiting the quirks of CRTs. Artists often “squished” sprites horizontally (because CRT screens would stretch them) and used the now-famous “half dot” technique to get subtler shading than was actually possible at the pixel level. So if you display the original sprites with no stretch and no “bleed” between pixels, they don’t look as good as they should.
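A minimal sketch of why the “half dot” trick works (the 3-tap kernel and values here are illustrative assumptions, not real CRT measurements): horizontal bleed acts like a small blur, so an alternating on/off pixel pattern averages out to an intermediate shade the palette can’t express directly.

```python
# Sketch (hypothetical values): how horizontal CRT bleed turns an
# alternating "half dot" pattern into an in-between shade.
# A simple 3-tap blur stands in for phosphor/composite bleed.

def bleed(row, kernel=(0.25, 0.5, 0.25)):
    """Approximate horizontal CRT bleed with a small blur kernel."""
    out = []
    for i in range(len(row)):
        left = row[i - 1] if i > 0 else row[i]
        right = row[i + 1] if i < len(row) - 1 else row[i]
        out.append(kernel[0] * left + kernel[1] * row[i] + kernel[2] * right)
    return out

# Alternating full-intensity and zero pixels ("half dots"):
pattern = [1.0, 0.0, 1.0, 0.0, 1.0, 0.0]
print(bleed(pattern))  # interior values land at 0.5 - a mid shade
```

On a sharp modern panel the same pattern stays a harsh checkerboard instead of blending.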
That’s because the graphics were tailored to CRT resolution - which is to say, low/outright bad resolution.
No, it’s because the graphics were tailored to the analog characteristics of CRTs: things like having scanlines instead of pixels and bleed between phosphors. If they were only tailored to low resolution they’d look good on a low resolution LCD, but they don’t.
I admit I’m quibbling, but the whole thread is that, so…
Maybe ones before the ’80s didn’t, but almost all of them after that do. Exactly how this worked varied between manufacturers and display types, but they tended to have some kind of mask that pushed the image toward a preferred resolution.
http://filthypants.blogspot.com/2020/02/crt-shader-masks.html
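A back-of-envelope illustration of the mask argument (the tube width and dot pitch here are assumed, typical-looking numbers, not specs for any particular model): the shadow mask’s dot pitch physically limits how many phosphor triads fit across the screen, which is effectively a resolution cap.

```python
# Rough estimate (hypothetical numbers): how a shadow mask caps
# effective resolution. Assume a 17" 4:3 tube with roughly 320 mm
# of viewable width and a 0.28 mm dot pitch.

screen_width_mm = 320.0   # assumed viewable width
dot_pitch_mm = 0.28       # assumed dot pitch

max_triads = screen_width_mm / dot_pitch_mm
print(f"~{max_triads:.0f} phosphor triads across")  # ~1143
```

Feeding such a tube a signal much denser than that just smears detail across neighboring triads.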
CRTs don’t have pixels, so the resolution of the signal isn’t that important. It’s about the inherent softness you get from the technology, which is better than any anti-aliasing we have today.
CRTs were jagged and blurry. A misconverged pixel isn’t good anti-aliasing.
https://en.m.wikipedia.org/wiki/Missile_Command#/media/File%3AA5200_Missile_Command.png
CRTs do have pixels. If they didn’t, you could run an SVGA signal (800x600 at 60 Hz) directly into any CRT. If you tried this, it would likely damage the tube beyond repair.
The exact mechanism varied between manufacturers and types: http://filthypants.blogspot.com/2020/02/crt-shader-masks.html
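A quick sanity check on the SVGA point (using standard NTSC figures and an approximate VESA-style total line count for 800x600@60, so treat the exact numbers as assumptions): horizontal scan rate is just lines per frame times frames per second, and SVGA asks a fixed-frequency standard-def tube to sweep well over twice as fast as it was built for.

```python
# Rough check (approximate timings): why an SVGA signal is out of
# range for a fixed-frequency standard-definition TV tube.

def hsync_khz(total_lines, refresh_hz):
    """Horizontal scan rate = lines per frame * frames per second."""
    return total_lines * refresh_hz / 1000.0

ntsc = hsync_khz(525, 29.97)  # interlaced SD TV: ~15.7 kHz
svga = hsync_khz(628, 60.0)   # 800x600@60 incl. blanking: ~37.7 kHz

print(f"NTSC: {ntsc:.1f} kHz, SVGA: {svga:.1f} kHz")
# A deflection circuit built around ~15.7 kHz can't sweep ~2.4x faster.
```

Multisync computer monitors dodge this by design, which is why the damage claim applies mainly to fixed-frequency TV tubes.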
I certainly saw aliasing problems on CRTs, though usually on computer monitors that had higher resolution and better connection standards. The image being inherently “soft” is related to limited resolution and shitty connections. SCART with RGB connections will bring out all the jagginess. The exact same display running on composite will soften it and make it go away, but at the cost of a lot of other things looking like shit.
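The composite-vs-RGB difference above can be put in numbers (standard NTSC figures; the “2 × bandwidth × active time” estimate is a common rule of thumb, not a measurement): composite band-limits luma to about 4.2 MHz, which caps how many light/dark transitions fit on a scanline, while RGB over SCART skips that filter and keeps every jagged edge.

```python
# Rough estimate (standard NTSC figures): why composite looks softer
# than RGB. Band-limited luma caps horizontal detail per scanline.

luma_bandwidth_hz = 4.2e6   # NTSC composite luma bandwidth
active_line_s = 52.6e-6     # visible portion of one scanline

lines_across = 2 * luma_bandwidth_hz * active_line_s  # ~442
tvl = lines_across * 3 / 4  # per picture height on a 4:3 screen
print(f"~{lines_across:.0f} lines across, ~{tvl:.0f} TVL")
```

That ~330 TVL figure lines up with the commonly quoted limit for NTSC composite, and it’s well below what an RGB connection can resolve on the same tube.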
CRT filters exist now, and with HDR output (or just sending an HDR-enabled signal to get TVs to use the full brightness range) and 4K displays, it’s honestly just as good at this point. Or better, because the only good CRTs you can get now are pretty small PVMs/BVMs, and my TV is much bigger than those.
There are plenty of upscalers with minimal latency that fix that.
There also isn’t just “CRT” in this space. Professional video monitors give a very different picture than a consumer TV whose only input is RF.
If one more under-25 retro fan tells me that RF tuners are the “true experience”, I’m going to drink myself to death with Malort.
Edit: please don’t tell me you believe CRTs have zero latency. Because that’s wrong, too.