The proper way to scale pixel graphics is nearest-neighbor (integer) scaling plus a CRT shader. Some games implement these filters excellently (e.g. Black Jewel, Hammerwatch (only the very first part), Animal Well), while others do it poorly (e.g. Skald).
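
For the unfamiliar: integer scaling just means picking the largest whole-number multiple of the source resolution that fits the display and letterboxing the rest; the CRT shader is then applied on top. A minimal sketch (the resolutions are only examples):

    def integer_fit(src_w, src_h, dst_w, dst_h):
        # Largest whole-number factor that still fits the target.
        k = min(dst_w // src_w, dst_h // src_h)
        return src_w * k, src_h * k

    # A 320x240 frame on a 1920x1080 panel: k = 4, giving 1280x960 with
    # black borders rather than a blurry non-integer stretch.
    print(integer_fit(320, 240, 1920, 1080))  # (1280, 960)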

Old consoles can be connected to an LCD monitor using a device called RetroTINK, which can add this effect perfectly. For static images, software like Photoshop or Affinity Photo is sufficient, but the goal should always be a CRT effect rather than generic scaling or fancy blur.

The point is that OBJECTIVELY pixel art looks incomparably better on CRT monitors, which is why this effect is emulated.

> OBJECTIVELY pixel art looks incomparably better on CRT monitors

"Objectively" doesn't just mean a thing is a strongly held opinion or even widely held. This seems like a perfect example of a thing that is subjective, not objective. There is no objective metric for measuring the looks of pixel art. Or really any art in general.

Probably most people who care prefer this, but that doesn't mean it's objective.

This probably doesn't contribute to the discussion. But I have a personal peeve about people using the word "objectively" (and "demonstrably") when they really mean "significantly".

Carry on.

I think the objectivity here is that it is what the artist intended.

Not true anymore for modern pixel art, which is often an art style intended for modern displays, and it is sometimes combined with high resolution images and transforms.

That's an often-repeated claim people make about pixel art and CRT monitors, especially on social media, but I think it's just a trite sound bite rather than something that's meaningful.

Example: https://x.com/kitten_beloved/status/1849159022479577277

Plenty of people debunking it including the person she took the screenshots from.

>including the person she took the screenshots from.

who is that in the thread please? I can't find them

That doesn't debunk it at all - but some good points were made. Note that the artists of the day were also using CRT displays, just at higher resolutions. They certainly did test their work on the target. Some people obviously did more tweaking based on what they saw on the target machine, and some less. This continues to this day.
That's a screenshot of a video made by tiktok user "mylifeisanrpg". I don't know where the image of the sprites came from but Fisch doesn't say he made it.
"Objectively the artists intended the art to be viewed on a CRT" is a very different statement than "Objectively the art looks better on a CRT"
Yeah, it's just not your grandmother's pixel art anymore
If we had an artist's statement of intent for a particular piece, that might be a reasonable argument.
We don’t need a statement of intent because we have historical records of their work process simultaneously using both TV and computer displays.
Intersubjective is also a word that nicely fills in the ground between subjective and objective.

https://en.wikipedia.org/wiki/Intersubjectivity

Thank you for pointing that out. I agree that using the word "objectively" in this case was a bit silly, and I apologize for that.
There are a couple of effects that CRTs produce that simply cannot be reproduced on an LCD, even with advanced filters. The pixel glow and deep blacks are just locked behind glowing-phosphor technology. High-resolution OLED can come close, but those displays are still pretty expensive.
Do not forget though that not all CRTs were made the same. There was a huge variation in dot pitch, or even the "subpixel layout" (think Trinitron). Also, not all CRTs had nice black levels; either the screen still reflected/scattered a lot of ambient light, or some CRTs just had a black offset level that ensured even fully black pixels still emitted some light, or a combination of both. Phosphor decay times also varied.

The graphics cards themselves also mattered; RAMDACs aren't perfect.

>The pixel glow and deep blacks are just locked behind the glowing phosphor technology.

What deep blacks? CRTs didn't have them; "black" was really gray. You can see it yourself: go find an old CRT monitor, make sure it's powered off and you have a reasonable amount of ambient light for normal viewing conditions, and look at the screen. It's gray, not black. That's as black as the screen gets. Now try the same with any modern OLED screen; the off state is much darker.

In my room I have a Trinitron and a pair of 1440p OLED monitors. You're right that in a lit room the OLEDs have deeper blacks, but in a dim room the glow of the Trinitron's lit phosphors creates a unique effect contrasted with the unlit ones. I might have to experiment more with the various visual filtering software that exists, but I think the physical properties of a CRT mean what hits your eyes is just unreplicable.
> deep blacks

I have never agreed with the supposition that CRTs have deep blacks. The "black" is clearly gray and it was always very noticeable to me when watching CRTs in a lit room. This is one of the things that appealed to me about LCDs in the first place.

In a pitch black room maybe CRTs have better blacks than LCDs (but even then there's CRT glow!), but LCDs have better blacks in a lit room, which is a far more likely scenario for me. Consequently since the beginning of LCD use I've always thought of them as being more vibrant with better contrast.

One thing you can do on a CRT that is difficult/impossible to simulate on an LCD is proper vector graphics. Vectrex games have a really cool glow to the lines, as do arcade cabinets (Star Wars / Asteroids / Battlezone). I wonder how closely OLEDs can mimic that.

> The pixel glow

Can be decently emulated with more modern shaders that rely upon HDR, provided your HDR monitor is bright enough, which most are not. My display can do a reasonable job with 1600+ nit peaks and 1200 nits sustained. OLEDs are not really capable here, due to a lack of ability to push and sustain decent brightness levels. You'll also want 4K; in an ideal world 8K would be even better, but we are where we are.

> deep blacks

CRT blacks were really not that deep unless you're sitting in the dark and there is nothing else on the screen. It also depended upon model, coating etc. Even in perfect scenarios, contrast in mixed scenes was "meh" at best.

> High resolution OLED can come close

So far my experience is that it cannot, as it's simply not capable of the brightness required, but it does offer nice blacks, yes, and better-than-LCD motion (though just barely, due to sample and hold).

I'd say the biggest remaining issue honestly is the motion blur inherent to sample and hold. As close as the more advanced shaders are getting today, it all falls apart when the image starts to move. Retroarch supports BFI (black frame insertion), but it's not as useful as it sounds, for various reasons, sadly.

For now, I retain my broadcast CRTs, but I do hope to get to a point eventually where I can get rid of them. Though I suspect that by the time such a technology arrives and is useful, I'll be old enough that I'll probably have stopped caring.

My GF would love me to give up CRTs, as I have a room full of them, which she tolerates but hardly loves :|

An OLED can't make my eyes red and burning even after 24 hours of looking at the screen. We need something better to emulate CRTs properly.
Is that true? Was there something to the technology that was harder on the eyes?
Low refresh rates. 50Hz for European TV (PAL), 60Hz for American (NTSC).

On an LCD, each pixel lights up and illuminates constantly without flicker. On a CRT, an unrefreshed image immediately fades away.

For desktop computing, the minimum ergonomic standard refresh rate was usually 70Hz. Even that was an eye strain over prolonged usage. I remember using 85Hz on my 1990s Eizo CRT.

So the flickering. Okay :)

Because I remember that, the bloodshot eyes after too long a night of gaming. Just hadn't realised it was because of the screen, partly at least, given that explanation.

The culmination of that was the Amiga's high-resolution mode (640x400/480), which was interlaced, so full frames refreshed at only 25/30 Hz!
Side note that vector displays have yet to be emulated adequately but my mind could be changed by a high quality HDR OLED maybe...
Well yeah, that's probably even harder than scaling pixel art - at least both displays work with pixels, but an electron beam drawing a line can't really be emulated well on a raster display - even with ultra-high resolution you will probably still see "jaggies"?
Retrotink isn’t a device. It’s a brand. There are several different scalers made by retrotink.

They're also not the only maker of high-quality scalers for retro gamers. There are quite a number of different options available these days.

Your dogmatism is visible from space. Kindly knock it down a few notches please
I’ve never seen good CRT physical emulation. But I also suspect it’s been long enough, that I just wouldn’t be able to tell the difference unless I had my old childhood bedroom Sanyo CRT to compare it to.

I’m not sure these come close because there’s some sort of physical element that would be hard to replicate unless you mapped the DPI of a screen to the “DPI” of a CRT.

Otherwise you’re just creating a weird facsimile in the same way that a lot of indie artists don’t produce pixel art that is actually pixel aligned. It’s ugly.

> there’s some sort of physical element that would be hard to replicate

For a truly authentic CRT experience you need a faint smell of ozone, the crackle of a static charge on the screen and a high-pitched screaming/whining noise right on the edge of perception.

Don’t forget the degauss button. TWANG
Spot on. Reading that sentence I can almost feel that static on my skin from when very young me would curiously get way too close to the TV for reasons I no longer remember.

The thunk of turning off my CRT+VHS combo after a late night watching reruns as a tween. Nostalgia is a hell of a drug.

When I was a kid, my CRT sometimes switched to a wrong resolution (it got narrower, so squares became slightly rectangular, for example). I say "my CRT" because that was a hardware issue, not a software one. I know, because kid-me's solution was to smack the CRT against the (hard, brick) wall. And it worked. I still don't know why; I was too young to investigate - and hey, it worked, so why bother.

My parents were less impressed when, after a few years, the screen was moved and the wall turned out to be scratched everywhere.

And the very physical experience of carrying it around.
> I’ve never seen good CRT physical emulation.

Same. Because they all try way too hard.

I have a fully working vintage arcade cab from the mid-eighties which I still play on, so I know: most of these shaders and techniques exaggerate way too much what things really looked like. There's a tiny blur and there are tiny scanlines (or whatever those little black lines are called), but things... mostly look pixelated.

And that's an old, used CRT I have: probably one of the blurriest ones. Back in the nineties we already had fancy Sony Trinitron CRTs, and those were flawless. Pixels just looked like pixels, not like all these blurred things, nor like all these exaggerated shaders. Many CRTs were really crisp.

Do games from the eighties look better on a CRT? Definitely. But it was subtle.

Pixel art is pixel art. It didn't become pixel art because it was shown on a CRT, and it doesn't suddenly stop being pixel art because it's shown on a modern monitor.

Things were really just "blocky" and pixelized. That's really how things looked.

Quite tangential, but it is sort of funny that we’re still doing this nostalgic pixel art thing. I mean, no complaints at all, good pixel art looks nice. But the snes came out a long time ago.

I wonder if we will ever get a nostalgic style that emulates all those flash games. Reasonably high resolution components, but only 10 or so pieces per character. Geometric shapes with gradients.

Pixel art is nostalgic for many, but a big reason it's used in indie games is that it's very easy to animate and still look passable.
Yeah, a big reason I started doing pixel art back in 2009 was that it enabled me to do lots of trial and error, changing pixels until I got it to look good. It's much harder to do that with more traditional art, because there are way more options. That's not at all to say that pixel art doesn't require skill, but the skill floor is definitely lower.
I thought it was nostalgia, but I see teenagers that love pixel art games, even though the art style is twice as old as they are. The style aged way better than, say, the PS1 era, where most games just don't hold up, and most of the ones that do happened to still use pixel art.

When it comes to old pixel art games though (as opposed to the new ones), it's a matter of accuracy. There are plenty of articles and videos showing how different it is to use a naive emulator on a modern, upscaled OLED vs. how the very same game looks on a surviving old Trinitron with a SCART cable. If you are looking at, say, old Atari 2600 games, there's no reason to try to pretend to be a Trinitron. But for the SNES? Sonic on the Genesis? Reproducing the screen with square, perfect pixels often looks worse.

Still, flash games are getting emulated, and so are Quake-era FPSes. Sometimes we rediscover older gameplay, or more readable art. Other times it's only nostalgia. But pixel art in itself? It's just effective. Modern games just throw away some of the limitations that didn't make the games better: go look at Sea of Stars. We couldn't have made that game work on a SNES: too much memory, too wide a palette, more animation than we could ever fit in that hardware. And yet it's a descendant of the old RPGs stylistically, and it looks absolutely fantastic by any standard.

Pixel art was certainly out of fashion for a while, but it came back in the 2010s because a) nostalgia, b) a counter-reaction to soul-destroying AAA game business, and c) the rise of indie games thanks to Steam.
I have zero nostalgia for pixel art. It is its own thing. If you can't recognize that, you must be blind.
Obligatory Xiao Xiao reference: https://www.newgrounds.com/series/xiao-xiao
Have you seen some of the display options offered by the RetroTink scaler? I think some of them look pretty good, but I'm not a hardcore CRT enthusiast, so maybe my standards are just lower than yours :P
Retrotink is a brand, not a device. There are about half a dozen different scalers made by Retrotink, and many of them have different options.
If you want to emulate a CRT you have to emulate a specific CRT with a specific input. You can't have a general CRT emulation because they all look a bit different.
Shader-based CRT emulation works well on 2K+ screens. It's much more convincing than the crude scanline emulation done with mask images.
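
To make that concrete: the crude end of the spectrum is roughly the toy NumPy sketch below (parameter values are invented). Proper shader packs additionally model things like beam-width modulation, convergence error, and halation, which is where the extra resolution pays off.

    import numpy as np

    def toy_crt(img, scale=4, beam_sigma=0.35, mask_strength=0.3):
        # img: float array (h, w, 3) in [0, 1]. Upscale by an integer
        # factor, darken between scanlines, apply a crude RGB mask.
        h, w, _ = img.shape
        out = np.repeat(np.repeat(img, scale, axis=0), scale, axis=1)

        # Scanline: Gaussian falloff from the center of each source row.
        y = (np.arange(h * scale) % scale + 0.5) / scale - 0.5
        out *= np.exp(-(y ** 2) / (2 * beam_sigma ** 2))[:, None, None]

        # Aperture-grille-ish mask: each output column favors R, G or B.
        mask = np.full((w * scale, 3), 1.0 - mask_strength)
        mask[np.arange(w * scale), np.arange(w * scale) % 3] = 1.0
        return np.clip(out * mask[None, :, :], 0.0, 1.0)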
I find it amusing that we now obsess over the missing flaws in our pixel images. This is exactly analogous to the vinyl/digital debate.

One way you can tell this nostalgic quest is a little silly is by the fact that new indie pixel art games are mostly excluded from this nitpicking.

I lived through the CRT > LCD transition, and the only downsides to LCDs at the time were A) resolution interpolation and B) motion blur. (Both of these issues have since been addressed.)

When CRTs were the norm we were never satisfied with their crispness. We always yearned for more clarity and a smaller dot pitch. When you saw a game displayed on a sharp monitor the improvement was both obvious and somewhat amazing. But now we've finally got what we want in the form of high-resolution LCDs and OLEDs and we're trying hard to find new faults to be fixed, haha.

I am a bit of a hypocrite: I like a good CRT overlay on my retro games. It invokes a feeling. But I won't say it's objectively better.

Brian Eno put it pretty well in 1996

"Whatever you now find weird, ugly, uncomfortable and nasty about a new medium will surely become its signature. CD distortion, the jitteriness of digital video, the crap sound of 8-bit - all of these will be cherished and emulated as soon as they can be avoided. It’s the sound of failure: so much modern art is the sound of things going out of control, of a medium pushing to its limits and breaking apart. The distorted guitar sound is the sound of something too loud for the medium supposed to carry it. The blues singer with the cracked voice is the sound of an emotional cry too powerful for the throat that releases it. The excitement of grainy film, of bleached-out black and white, is the excitement of witnessing events too momentous for the medium assigned to record them."

It also reminds me of that Arcade Fire song about how "we used to wait" for letters to arrive. The novelty isn't just in the thing, it's also in anticipation of the thing that you had to take out of the dust jacket and set up.
I think this article from 1995 sums it up: http://alvyray.com/Memos/CG/Microsoft/6_pixel.pdf

Summation: A pixel is a "picture element," a sampling of the intent.

On CRTs, the phosphors would sample the electron beams, which in turn sampled the memory. The phosphors, when hit, would diffuse in a perfectly round manner. As the voltage and intensity of the beam increases, the rays become more plentiful and the diffusion dilates; the output brightness becomes non-linear. In modern displays, this non-linearity is corrected for with "gamma."
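
Schematically (2.2 is just the conventional exponent; real tubes varied):

    gamma = 2.2                           # typical CRT transfer exponent
    signal = 0.5                          # normalized input voltage
    luminance = signal ** gamma           # phosphor output, ~0.22: non-linear
    encoded = luminance ** (1 / gamma)    # "gamma correction" recovers ~0.5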

So we have two changes in modern displays that affect the way the picture is presented to the eyes:

1. Square edges. These don't exist with CRTs, barring double-scan and prescaling.

2. Dilation. Pixels of higher brightness on a CRT occupy more area than that allocated for a pixel on an LCD. Brightness bleeds over into neighboring pixels, (importantly) making dark lines finer.

So, objectively, pixel art originally displayed on CRTs needs to be altered to have the same appearance on an LCD. The worst problem I see personally is that a bilinear filter is often used, but it does the interpolation in gamma space instead of linear. This causes dark lines and black areas to become more pronounced and blurry. This, coupled with the lack of dilation, completely changes the character of the image.
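
To put numbers on the gamma-space problem (a sketch using the standard sRGB transfer function; the 50/50 blend stands in for a bilinear sample landing halfway between a black pixel and a white pixel):

    import numpy as np

    def srgb_to_linear(c):
        return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

    def linear_to_srgb(c):
        return np.where(c <= 0.0031308, c * 12.92, 1.055 * c ** (1 / 2.4) - 0.055)

    naive = 0.5  # lerp on raw sRGB values, i.e. in gamma space
    correct = linear_to_srgb(0.5 * srgb_to_linear(0.0) + 0.5 * srgb_to_linear(1.0))
    print(naive, float(correct))  # 0.5 vs ~0.735: gamma-space blends come out too dark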

About artistic intent, I can provide an anecdote as a counterexample: Shigeru Miyamoto said early sprites were first laid out on graph paper--as square blocks. There's photos out there, and the blocks are filled in completely and are very square. This was early on, so I don't know if they went back and adjusted them, or if later artists often used the intended display to model their art or not.

On a page like this you should really use the CSS style:

    img {
        /* nearest-neighbor upscaling: crisp pixels instead of blurry smoothing */
        image-rendering: pixelated;
    }
It's not that simple, because the user's devicePixelRatio might be fractional. Say it's 1.5: then, scaling up, some low-res pixels get scaled to N device pixels and others to N+1, and you can get something really ugly, especially if the thing you're scaling is a stippled pattern.
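
A quick way to see the unevenness (1.5 is just the example ratio):

    # At devicePixelRatio 1.5, CSS-pixel boundaries fall on alternating
    # 1- and 2-device-pixel columns, so a stipple pattern turns lumpy.
    dpr = 1.5
    edges = [int(i * dpr) for i in range(9)]  # device-pixel column edges
    print([b - a for a, b in zip(edges, edges[1:])])  # [1, 2, 1, 2, 1, 2, 1, 2]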
Well, I consider that still vastly better for pixel art than it being blurry. If you really want to handle 1.5x resolutions you can scale the image to that resolution yourself in the way you think is best and provide it using srcset.
(Because otherwise it's all blurry on a high DPI monitor.)
>> Center: Horizontal linear (proposed)

The image they used is biased toward horizontal.

The ground (and the blocks beneath the ground) have strong horizontal lines, as does the fence wall behind the main character, and the main character's gun is horizontally elongated.

"Let’s do an experiment to make the VGA signal horizontal blur visible. I plugged my laptop to an LCD monitor with both HDMI and VGA cables and compared the results. This basically simulates a high quality CRT display and low quality VGA cable."

This is so wrong. CRTs have unique properties that cannot really be replicated on an LCD monitor. You can get something similar with a really high-refresh OLED panel, but that needs to be verified.

I do not like it. The best aesthetic on a modern display to me is integer scaling towards the nearest multiple. That looks nothing like a CRT and breaks with some assumptions artists made back then, but modern pixel art is designed for modern displays. If you want a CRT look there are sophisticated shaders that look decent on a high res screen, but nothing reproduces an actual CRT.
Yeah, the entire point of modern pixel art is that you want the pixels to be so sharp you could cut yourself with them.
Yeah, I've always preferred crisp pixel art, like what you would see on a Gameboy screen.

It's nice to have CRT filters and nonlinear scaling available as an option, but I'll never use them. If the game can't scale up perfectly, then I'd rather just have black bars on the edges than making everything blurry or adding extra pixels where they don't belong.

None of the blurred versions looks good to me. I've always preferred the crisp version too.
Did Game Boy really have crisp pixels? It's more of a lattice/grid with both vertical and horizontal "scan" lines. Plus ghosting and bloom.
The GameBoy Pocket had a really shockingly crisp screen with very deep blacks. The original GameBoy was quite blurry, especially when scrolling.
I don't know which Gameboy you played, but the original was blurry as hell as soon as anything moves.
Then they go on to try and compare to a composite output, but they do this: "Jazz Jackrabbit 2 through an interlaced composite video stream, grabbed with a cheap capture card." That's not really how we experienced it, though. Better to try to take a photograph of a CRT - although that's challenging.
But the author did nearest-neighbor on both axes instead of just the vertical one, as proposed in the tweet. Isn't that different?
It's a neat hack, but I'm not sure there's much use for it. Of course this is much faster than doing CRT emulation, but in what context is that performance difference relevant? And I don't know if there are any other major advantages here.
CRT emulation only looks good on 4K monitors or higher. Below that, the effects end up exaggerated because there just aren't enough pixels to implement them at the right scale. Furthermore, if you want to preserve the correct aspect ratio then you will be doing non-integer upscaling in at least one dimension anyway, so if doing integer nearest-neighbor on the vertical and interpolation on the horizontal dimension improves appearance (and it certainly looks better to me) then it is a win for anyone on 1080p.
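
For reference, my reading of that hybrid in NumPy (a rough sketch; it interpolates raw pixel values for brevity, where a linear-light blend would be more correct):

    import numpy as np

    def hybrid_scale(img, dst_w, dst_h):
        # Integer nearest-neighbor vertically: scanlines stay crisp.
        h, w, _ = img.shape
        tall = np.repeat(img, dst_h // h, axis=0)

        # Linear interpolation horizontally: absorbs the non-integer
        # factor needed to keep the aspect ratio correct.
        x = np.linspace(0, w - 1, dst_w)
        x0 = np.floor(x).astype(int)
        x1 = np.minimum(x0 + 1, w - 1)
        t = (x - x0)[None, :, None]
        return tall[:, x0] * (1 - t) + tall[:, x1] * t

    # e.g. 320x200 DOS content on a 1080p screen: 5x vertical (1000 rows),
    # ~4.17x horizontal to reach a 4:3 width of ~1333 columns.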
I wonder if this could be implemented in the original OSSC.
The "hq4x" filter remains the best.
To my eyes it looks pretty bad. Nothing like how the games are supposed to look on a TV or CRT monitor.
Because you picked that metric.

Of course the competent graphic designers of the time measured their effort by the final result on the intended medium,

but of course they were also compromising on the amount of information they could throw in, and there exist (formal and informal) studies and work toward information augmentation, regardless of the medium. Or, because the original medium is largely unavailable and you want past work to look good on current media (so you try to augment information to make old works look better on LCDs and similar).

It's unfortunate that the author is using VGA signals on LCD displays as "retro". I remember well my first experience using an LCD monitor for work. It was at my first "Silicon Valley" job in 1999, and it was a 15" 1024x768 one, perhaps a ViewSonic. The CTO of the company I was working for was pushing them as the "new thing". I requested a 19-inch Trinitron instead, as the text was blurry with the VGA input and hurt my eyes, whereas my Sony at home was noticeably sharper. I continued using CRTs up until probably 2005 (including a 21" Sony that weighed over 100 lbs); it was at that time I got a graphics card with DVI output. At that point I switched to a 20" Dell LCD and never looked back.

tl;dr: VGA always looked like crap on most LCDs; IMHO they were almost unusable until DVI.