Old consoles can be connected to an LCD monitor using a device called a RetroTINK, which can add this effect perfectly. For static images, software like Photoshop or Affinity Photo is sufficient, but the goal should always be a CRT effect rather than generic scaling or fancy blur.
The point is that OBJECTIVELY pixel art looks incomparably better on CRT monitors, which is why this effect is emulated.
"Objectively" doesn't just mean a thing is a strongly held opinion or even widely held. This seems like a perfect example of a thing that is subjective, not objective. There is no objective metric for measuring the looks of pixel art. Or really any art in general.
Probably most people who care prefer this, but that doesn't mean it's objective.
This probably doesn't contribute to the discussion. But I have a personal peeve about people using the word "objectively" (and "demonstrably") when they really mean "significantly".
Carry on.
Not true anymore for modern pixel art, which is often an art style intended for modern displays, and it is sometimes combined with high resolution images and transforms.
Example: https://x.com/kitten_beloved/status/1849159022479577277
Plenty of people debunking it including the person she took the screenshots from.
who is that in the thread please? I can't find them
The graphics cards themselves also mattered, RAMDACs aren't perfect.
What deep blacks? CRTs didn't have them; "black" was really gray. You can see it yourself: go find an old CRT monitor, make sure it's powered off and you have a reasonable amount of ambient light for normal viewing conditions, and look at the screen. It's gray, not black. That's as black as the screen gets. Now try the same with any modern OLED screen; the off state is much darker.
I have never agreed with the supposition that CRTs have deep blacks. The "black" is clearly gray and it was always very noticeable to me when watching CRTs in a lit room. This is one of the things that appealed to me about LCDs in the first place.
In a pitch black room maybe CRTs have better blacks than LCDs (but even then there's CRT glow!), but LCDs have better blacks in a lit room, which is a far more likely scenario for me. Consequently since the beginning of LCD use I've always thought of them as being more vibrant with better contrast.
One thing you can do on a CRT that is difficult or impossible to simulate on an LCD is proper vector graphics. Vectrex games have a really cool glow to the lines, as do arcade cabinets (Star Wars / Asteroids / Battle Zone). I wonder how closely OLEDs can mimic that.
Can be decently emulated with more modern shaders that rely upon HDR, provided your HDR monitor is bright enough, which most are not. My display can do a reasonable job with 1600+ nit peaks and 1200 nits sustained. OLEDs are not really capable here due to a lack of ability to push and sustain decent brightness levels. You'll also want 4K; in an ideal world 8K would be even better, but we are where we are.
> deep blacks
CRT blacks were really not that deep unless you're sitting in the dark and there is nothing else on the screen. It also depended upon model, coating etc. Even in perfect scenarios, contrast in mixed scenes was "meh" at best.
> High resolution OLED can come close
So far my experience is that it cannot, as it's simply not capable of the brightness required, but it does offer nice blacks, yes, and better-than-LCD motion (though just barely, due to sample and hold).
I'd say the biggest remaining issue honestly is the motion blur inherent to sample and hold. As close as the more advanced shaders are getting today, it all falls apart when the image starts to move. RetroArch supports BFI, but it's not as useful as it sounds, for various reasons, sadly.
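To put rough numbers on sample-and-hold blur: the usual rule of thumb is that eye-tracked smear is about motion speed times the time each frame stays lit, which is why both BFI and higher refresh rates help. A toy Python sketch (the function name and figures are mine, purely illustrative):

```python
# Rule-of-thumb sketch: perceived motion blur on a sample-and-hold display
# is roughly (eye-tracking speed) x (time each frame stays lit).
# BFI shortens the lit time, trading brightness for motion clarity.

def blur_width_px(speed_px_per_s, refresh_hz, lit_fraction=1.0):
    """Approximate blur smear in pixels for eye-tracked motion.

    lit_fraction: portion of the frame the pixel is illuminated
    (1.0 = plain sample-and-hold, 0.5 = 50% black frame insertion).
    """
    persistence_s = lit_fraction / refresh_hz
    return speed_px_per_s * persistence_s

# A sprite scrolling at 960 px/s:
print(blur_width_px(960, 60))        # ~16 px of smear at 60 Hz hold-type
print(blur_width_px(960, 60, 0.5))   # ~8 px with 50% BFI
print(blur_width_px(960, 120))       # ~8 px at 120 Hz
```

A CRT is the limiting case: the phosphor is lit for a tiny fraction of the frame, so the smear is close to zero even at 60 Hz.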
For now, I retain my broadcast CRTs, but I do hope to get to a point eventually where I could get rid of them. Though I suspect by the time such a technology arrives and is useful, I'll be old enough that I'll probably have stopped caring.
My GF would love me to give up CRTs, as I have a room full of them which she tolerates, but hardly loves :|
On an LCD, each pixel stays lit constantly, without flicker. On a CRT, an unrefreshed image immediately fades away.
For desktop computing, the minimum ergonomic refresh rate was usually 70 Hz. Even that strained the eyes over prolonged use. I remember using 85 Hz on my 1990s Eizo CRT.
Because I remember that, the bloodshot eyes after too long a night of gaming. Just hadn't realised it was because of the screen, partly at least, given that explanation.
They’re also not the only high quality scalers made for retro gamers. There’s quite a number of different options available these days.
I’m not sure these come close because there’s some sort of physical element that would be hard to replicate unless you mapped the DPI of a screen to the “DPI” of a CRT.
Otherwise you’re just creating a weird facsimile in the same way that a lot of indie artists don’t produce pixel art that is actually pixel aligned. It’s ugly.
For a truly authentic CRT experience you need a faint smell of ozone, the crackle of a static charge on the screen and a high-pitched screaming/whining noise right on the edge of perception.
The thunk of turning off my CRT+VHS combo after a late night watching reruns as a tween. Nostalgia is hell of a drug.
My parents were less impressed, when after a few years the screen was moved and the wall was scratched everywhere.
Same. Because they all try way too hard.
I have a fully working vintage arcade cab from the mid-eighties which I still play on. I know. Most of these shaders and techniques wildly exaggerate what things really looked like. There's a tiny bit of blur and there are faint scanlines (or whatever those little black lines are called), but things... mostly look pixelated.
And that's an old, used CRT I have: probably one of the blurriest ones. Back in the nineties we already had fancy Sony Trinitron CRTs, and those were flawless. Pixels just looked like pixels, not like all these blurred things nor like all these exaggerated shaders. Many CRTs were really crisp.
Do games from the eighties look better on a CRT? Definitely. But it was subtle.
Pixel art is pixel art and it's not pixel art because it was shown on a CRT and suddenly it wouldn't be pixel art anymore because it's shown on a modern monitor.
Things were really just "blocky" and pixelized. That's really how things looked.
I wonder if we will ever get a nostalgic style that emulates all those flash games. Reasonably high resolution components, but only 10 or so pieces per character. Geometric shapes with gradients.
When it comes to old pixel art games though (as opposed to the new ones), it's a matter of accuracy. There's plenty of articles and videos showing how different it is to try to use a naive emulator on a modern, upscaled OLED vs how the very same game looks in a surviving old Trinitron with a SCART cable. If you are looking at, say, old Atari 2600 games, there's no reason to try to pretend to be a Trinitron. But for SNES? Sonic in the Genesis? Reproducing the screen with square, perfect pixels often looks worse.
Still, flash games are getting emulated, and so are Quake-era FPSes. Sometimes we rediscover older gameplay, or more readable art. Other times it's only nostalgia. But pixel art in itself? It's just effective. Modern games just throw away some of the limitations that didn't make the games better: go look at Sea of Stars. We couldn't have made that game work on a SNES: too much memory, too wide a palette, more animation than we could ever fit in that hardware. And yet, it's a descendant of the old RPGs stylistically, and it looks absolutely fantastic by any standard.
https://www.shadertoy.com/view/4dlyWX
But more practical are:
https://www.shadertoy.com/view/XsjSzR
One way you can tell this nostalgic quest is a little silly is by the fact that new indie pixel art games are mostly excluded from this nitpicking.
I lived through the CRT > LCD transition and the only downside to LCDs at the time was A) resolution interpolation, and B) motion blur. (Both of these issues have since been addressed.)
When CRTs were the norm we were never satisfied with their crispness. We always yearned for more clarity and a smaller dot pitch. When you saw a game displayed on a sharp monitor the improvement was both obvious and somewhat amazing. But now we've finally got what we want in the form of high-resolution LCDs and OLEDs and we're trying hard to find new faults to be fixed, haha.
I am a bit of a hypocrite: I like a good CRT overlay on my retro games. It invokes a feeling. But I won't say it's objectively better.
"Whatever you now find weird, ugly, uncomfortable and nasty about a new medium will surely become its signature. CD distortion, the jitteriness of digital video, the crap sound of 8-bit - all of these will be cherished and emulated as soon as they can be avoided. It’s the sound of failure: so much modern art is the sound of things going out of control, of a medium pushing to its limits and breaking apart. The distorted guitar sound is the sound of something too loud for the medium supposed to carry it. The blues singer with the cracked voice is the sound of an emotional cry too powerful for the throat that releases it. The excitement of grainy film, of bleached-out black and white, is the excitement of witnessing events too momentous for the medium assigned to record them."
Summation: A pixel is a "picture element," a sampling of the intent.
On CRTs, the phosphors would sample the electron beams, which in turn sampled the memory. The phosphors, when hit, would diffuse in a perfectly round manner. As the voltage and intensity of the beam increases, the rays become more plentiful and the diffusion dilates; the output brightness becomes non-linear. In modern displays, this non-linearity is corrected for with "gamma."
So we have two changes in modern displays that affect the way the picture is presented to the eyes:
1. Square edges. These don't exist with CRTs, barring double-scan and prescaling.
2. Dilation. Pixels of higher brightness on a CRT occupy more area than that allocated for a pixel on an LCD. Brightness bleeds over into neighboring pixels, (importantly) making dark lines finer.
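A toy 1-D illustration of point 2 (the function name and leak model are entirely made up for illustration; real phosphor bloom is 2-D and depends on the tube):

```python
# Toy sketch of brightness-dependent dilation on a 1-D scanline.
# Brighter samples leak light into their neighbors, so bright areas
# widen and a dark line between them gets lighter and effectively finer.

def bloom_row(row, spread=0.2):
    """Each sample leaks spread * v^2 of its light to each neighbor
    (quadratic, so brighter pixels dilate disproportionately more)."""
    out = [0.0] * len(row)
    for i, v in enumerate(row):
        leak = spread * v * v
        out[i] += v
        if i > 0:
            out[i - 1] += leak
        if i + 1 < len(row):
            out[i + 1] += leak
    return [min(1.0, v) for v in out]  # clamp to displayable range

# A one-pixel black line between two white areas:
print(bloom_row([1.0, 1.0, 0.0, 1.0, 1.0]))  # the 0.0 gap fills toward 0.4
```

On a square-pixel LCD that black line stays perfectly black and one full pixel wide, which is exactly the change in character described above.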
So, objectively, pixel art originally displayed on CRTs needs to be altered to have the same appearance on an LCD. The worst problem I see personally is that a bilinear filter is often used, but it does the interpolation in gamma space instead of linear light. This makes dark lines and black areas more pronounced and blurry. This, coupled with the lack of dilation, completely changes the character of the image.
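To see the gamma-space vs. linear-light problem concretely, here is a minimal Python sketch assuming a pure power-law gamma of 2.2 (real sRGB has a linear toe near black, so this is an approximation, and the function names are my own):

```python
# Bilinear blending done in gamma space vs. in linear light.
# Assumes a simple power-law gamma of 2.2.

GAMMA = 2.2

def to_linear(v):
    """Gamma-encoded code value [0,1] -> linear light."""
    return v ** GAMMA

def to_gamma(v):
    """Linear light [0,1] -> gamma-encoded code value."""
    return v ** (1.0 / GAMMA)

def blend_gamma_space(a, b):
    # What a naive bilinear filter does: average the stored code values.
    return (a + b) / 2.0

def blend_linear_space(a, b):
    # Physically correct: decode, average the light, re-encode.
    return to_gamma((to_linear(a) + to_linear(b)) / 2.0)

# Blend a black pixel with a white pixel:
naive = blend_gamma_space(0.0, 1.0)    # code value 0.5 -> only ~22% of white's light
correct = blend_linear_space(0.0, 1.0) # code value ~0.73 -> exactly 50% of the light
print(naive, correct)
```

The naive blend emits far less light than a true half-way mix, which is why black outlines dominate and darken the image when a gamma-space bilinear filter is used.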
About artistic intent, I can provide an anecdote as a counterexample: Shigeru Miyamoto said early sprites were first laid out on graph paper, as square blocks. There are photos out there, and the blocks are filled in completely and are very square. This was early on, so I don't know if they went back and adjusted them, or whether later artists used the intended display to model their art or not.
img {
image-rendering: pixelated;
}
The image they used is biased toward horizontal.
The ground (and blocks beneath the ground) have strong horizontal lines. As does the fence wall behind the main character, and the main characters gun is horizontally elongated.
This is so wrong. CRTs have unique properties that cannot really be replicated on an LCD monitor. You can get something similar with a really high-refresh OLED panel, but that needs to be verified.
It's nice to have CRT filters and nonlinear scaling available as options, but I'll never use them. If the game can't scale up perfectly, then I'd rather just have black bars on the edges than make everything blurry or add extra pixels where they don't belong.
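The "perfect scale or black bars" approach is easy to sketch. A small Python helper (the function is my own, not from any particular emulator):

```python
# Integer-only upscale with letterbox/pillarbox bars: pick the largest
# whole-number scale that fits, and center the result with black bars.

def integer_fit(src_w, src_h, dst_w, dst_h):
    """Largest integer scale that fits, plus the bar sizes on each side."""
    scale = min(dst_w // src_w, dst_h // src_h)
    if scale < 1:
        raise ValueError("destination smaller than source")
    out_w, out_h = src_w * scale, src_h * scale
    bars_x = (dst_w - out_w) // 2   # left/right pillarbox
    bars_y = (dst_h - out_h) // 2   # top/bottom letterbox
    return scale, out_w, out_h, bars_x, bars_y

# SNES 256x224 on a 1920x1080 panel:
print(integer_fit(256, 224, 1920, 1080))  # (4, 1024, 896, 448, 92)
```

Every source pixel becomes an exact 4x4 block, so no interpolation is needed and nothing gets blurry; the cost is the unused border.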
Of course, working graphic designers of the time judged their effort by the final result on the intended medium,
but they were also compromising on how much information they could pack in, and there is (formal and informal) study and work on augmenting that information regardless of the medium. Or, because the original medium is largely unavailable and you want past work to look good on current media, you try to augment the information to make old works look better on LCDs and the like.
tl;dr: VGA always looked like crap on most LCDs; IMHO they were almost unusable until DVI.