Saturday, July 13th 2019
Intel adds Integer Scaling support to their Graphics lineup
Intel's Lisa Pearce today announced on Twitter that the company has listened to user feedback from Reddit and will add nearest-neighbor integer scaling to their future graphics chips. Integer scaling is the holy grail for gamers using console emulators, because it gives them the ability to simply double, triple or quadruple existing pixels, without the loss in sharpness inherent to traditional upscaling algorithms like bilinear or bicubic. This approach also avoids the ringing artifacts that come with other, more advanced scaling methods.
In her Twitter video, Lisa explained that this feature will only be available on upcoming Gen 11 graphics and beyond - previous GPUs lack the hardware required for implementing integer scaling. In terms of timeline, she mentioned that this will be part of the driver "around end of August", which also puts some constraints on the launch date of Gen 11, which, based on that statement, seems to be coming sooner rather than later.

It is unclear at this time whether the scaling method is truly "integer" or simply "nearest neighbor". While "integer scaling" is nearest neighbor at its core, i.e. it picks the closest pixel color and does no interpolation, the difference is that "integer scaling" uses only whole-number scale factors. For example, Zelda: Breath of the Wild runs at 900p natively, which would require a 2.4x scaling factor to fill a 4K display. Integer scaling would instead use a factor of 2x, resulting in an 1800p image with black borders around it - this is what gamers want. A nearest-neighbor image would have no black bars, but to achieve the 2.4x factor, some pixels would be doubled while others are tripled, resulting in unevenly sized pixels and a sub-optimal presentation.
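To make that unevenness concrete, here is a tiny C sketch (an illustration of the math only, not Intel's implementation) that counts how often each source row gets repeated at the fractional 2.4x factor:

#include <stdio.h>

int main(void)
{
    /* 900p -> 2160p needs a 2.4x (= 12/5) factor; count how often each
       source row appears in the output under nearest-neighbor row picking.
       Integer math keeps the row boundaries exact. */
    for (int src = 0; src < 10; src++) {
        int repeats = ((src + 1) * 12) / 5 - (src * 12) / 5;
        printf("source row %d shown %d times\n", src, repeats);
    }
    /* Prints 2, 2, 3, 2, 3, 2, 2, 3, 2, 3 -- rows repeat unevenly.
       With an integer 2x factor, every row would repeat exactly twice. */
    return 0;
}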
Update Jul 13: Intel has posted an extensive FAQ on their website, which outlines the details of their Integer Scaling implementation, and we can confirm that it is done correctly - the screenshots clearly show black borders all around the upscaled image, which is exactly what you would expect for scaling with integer scale factors. Intel does provide two modes, called "NN" (Nearest Neighbor) and "IS" (Integer Scaling).
Sources:
Twitter, FAQ on Intel Website
Will Intel implement pure integer scaling with borders?
Yes, the driver being released in late August will provide users with the option to force integer scaling. The IS option will restrict scaling of game images to the greatest possible integer multiplier. The remaining screen area will be occupied by a black border, as mentioned earlier.
56 Comments on Intel adds Integer Scaling support to their Graphics lineup
I mean, if your graphics card cannot drive 4K smoothly (I don't know, even the top-of-the-line ones can struggle), it's gonna look much, much worse than one that can scale 1080p to 4K perfectly.
Even 1080p movie content will benefit from it. It's a "once you see it, you cannot go back" kind of difference.
The thing is, in order to be pixel perfect, the signal has to be formatted correctly right from the source (in this case, the video card).
It isn't as simple as you guessed, buddy... the difference is literally night and day; it will make 1080p games look as clear and vibrant as 4K games on a 4K TV. Otherwise there wouldn't be so many users requesting it for YEARS.
Anyhow, I think it is better done within the video card, so you can output your 1080p games/videos as a pixel-perfect 4K signal to any 4K TV, old or new.
Maybe streaming over HDMI isn't as lossless as we think, so you need more specific instructions for pixel-perfect alignment?
Or maybe it's just a consensus of the industry (display, media transport, video card makers, etc.) so that they can keep forcing people to upgrade even if not everyone needs it?
I really don't know, but I just know I really want Integer Scaling.
Or at least I really hope for that, because I'm a bit fed up with AMD and NVIDIA having a near-cartel deal.
It's a nice option to have around, but I'm not sure it'll see much use. And I really don't see Nvidia and AMD needing it, considering their cards are either overpowered for the applications that benefit from this, or it would be counter-productive (with AA) in the applications typically run on their hardware. Which is why I'm wondering why hardware support for it should be news to console emulators, since many already cater to the pixel-perfect crowd (assuming I'm not misunderstanding the original post :| ).
While it could be interesting to be able to enable this without application support, implementing this in an application has been possible "forever".
1) You just render your game to a framebuffer of the desired resolution, let's say 320x200.
2) Make another one to fill the screen, let's say 2560x1440.
3) Do a couple of lines of code to calculate your integer scaling, in this case 7.
4) Render the first framebuffer to a centered quad (2240x1400) with nearest-neighbour texture filtering (see the sketch below).
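The calculation in step 3 boils down to an integer division per axis, keeping the smaller result. Here is a minimal C sketch of the geometry in steps 3 and 4 (the helper name integer_scale_rect is my own invention, not from any API):

#include <stdio.h>

/* Pick the largest integer scale factor that fits the screen, plus the
   centered destination rectangle; everything outside it stays black.
   The quad would then be drawn with nearest-neighbour (GL_NEAREST)
   texture filtering. */
static void integer_scale_rect(int src_w, int src_h, int dst_w, int dst_h,
                               int *x, int *y, int *w, int *h)
{
    int sx = dst_w / src_w;          /* integer division floors for us */
    int sy = dst_h / src_h;
    int scale = sx < sy ? sx : sy;   /* must fit both axes */
    if (scale < 1) scale = 1;        /* display smaller than source */

    *w = src_w * scale;
    *h = src_h * scale;
    *x = (dst_w - *w) / 2;           /* center horizontally */
    *y = (dst_h - *h) / 2;           /* center vertically */
}

int main(void)
{
    int x, y, w, h;
    integer_scale_rect(320, 200, 2560, 1440, &x, &y, &w, &h);
    printf("%dx%d at (%d,%d)\n", w, h, x, y);   /* 2240x1400 at (160,20) */
    return 0;
}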
Super easy. I don't understand how this has been lacking hardware support; nearest-neighbour filtering has been part of the OpenGL specification for as long as I can remember. But perhaps Intel always implemented it in software?

Trilinear will only make things even more blurred. Integer scaling is better than a blurred stretched image, but as you are saying, it's not a correct representation of how graphics designed for CRTs looked. It's actually one of the things that has annoyed me with the retro indie gaming trend over the past years: those who made these games have either forgotten how old games looked, or have only seen them in emulators. I usually call it "fake nostalgia", since most people have a misconception about how old games looked.
CRTs displayed pixels very differently from modern LCD, plasma and OLED displays. There were two main types of CRT: shadow mask and aperture grille. Shadow masks were the worst in terms of picture quality and used many small dots to make up a single "pixel", but had the advantage of "dynamic" resolutions. Aperture grille (Trinitron/Diamondtron, as you probably know it) had grids of locked pixels and a much sharper picture. But even on these, pixels blended slightly together. One of the "wonderful" things about CRTs was how the pixels bled slightly at the edges, causing a slight blurring effect, but only on the pixel edges for Trinitrons. So the pixels appeared fairly sharp, not blurred like when you scale a picture in Photoshop.
Additionally, if we're talking about console games, CRT TVs didn't even have square pixels. I don't know the exact ratio, but it was somewhere close to 5:4 or 4.5:4. So if you want to emulate the NES authentically, the image should be slightly stretched.
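For illustration, here is a small C sketch of what that stretch could look like; the 8:7 pixel aspect ratio used below is my assumption (the commonly cited NTSC value, close to the 4.5:4 guess above), not a confirmed figure:

#include <stdio.h>

int main(void)
{
    const int src_w = 256, src_h = 240;   /* NES framebuffer */
    const double par = 8.0 / 7.0;         /* assumed pixel aspect ratio (w:h) */
    const int scale = 4;                  /* integer scale for the height */

    int out_h = src_h * scale;                      /* 960 */
    int out_w = (int)(src_w * scale * par + 0.5);   /* ~1170: slightly wider */
    /* Note: the stretched width is no longer an exact integer multiple,
       so authentic aspect ratio and pure integer scaling are at odds. */
    printf("%dx%d -> %dx%d\n", src_w, src_h, out_w, out_h);
    return 0;
}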
While CRTs had a very good color range, their precision was not so good. But many graphics artists exploited this to create gradients and backgrounds using dithering.
On most CRTs, such dithered patterns would look softer or even completely smooth.
And then there is the shading between the scanlines, but this varied from CRT to CRT. There are some emulators which reproduce this effect.