NVIDIA's GeForce 436.02 Gamescom-special drivers introduce, among several new features and performance updates, integer scaling: an upscaling method that enlarges extremely low-resolution visuals into a more eye-pleasing, sharply pixellated image by multiplying pixels in a "nearest-neighbor" pattern without changing their colors. This stands in contrast to bilinear upscaling, which blurs the image by attempting to add detail where none exists, altering the colors of the multiplied pixels.
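The difference between the two approaches boils down to a few lines of array math. Here is a minimal, illustrative sketch using NumPy and Pillow (not NVIDIA's implementation), comparing nearest-neighbor pixel duplication with bilinear resampling on a tiny made-up test pattern:

```python
import numpy as np
from PIL import Image

def integer_scale(img: np.ndarray, k: int) -> np.ndarray:
    """Nearest-neighbor integer scaling: each source pixel becomes a k x k
    block of identical pixels, so no new colors are invented."""
    return img.repeat(k, axis=0).repeat(k, axis=1)

# Tiny 2x2 black-and-white checkerboard standing in for a low-resolution frame.
src = np.array([[[255, 255, 255], [0, 0, 0]],
                [[0, 0, 0], [255, 255, 255]]], dtype=np.uint8)

sharp = integer_scale(src, 4)                       # hard-edged 8x8 blocks
blurry = np.array(Image.fromarray(src).resize(      # bilinear invents grays
    (8, 8), Image.BILINEAR))
```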
Intel originally announced an integer upscaler this June, one that will be exclusive to the company's new Gen11 graphics architecture, since older generations of its iGPUs "lack the hardware requirements" to pull it off. Intel's driver updates that add integer scaling are set to arrive toward the end of this month, and even when they do, only a tiny fraction of Intel hardware will actually benefit from the feature (notebooks and tablets that use "Ice Lake" processors).

NVIDIA's integer upscaling feature has been added only to its "Turing" architecture GPUs (both RTX 20-series and GTX 16-series), not to older generations. NVIDIA explains that this is thanks to a "hardware-accelerated programmable scaling filter" introduced with "Turing." Why is this a big deal? The gaming community has a newfound love for retro games from the '80s and '90s, with the growing popularity of emulators and old games being played through DOSBox. Many small indie game studios are responding to this craze with hundreds of new titles adopting a neo-retro pixellated aesthetic (e.g., "Fez").
When scaled up to today's high-resolution displays, many of these games look washed out thanks to bilinear upscaling by Windows. This calls for something like integer upscaling. We're just surprised that NVIDIA and Intel aren't implementing integer upscaling on unsupported GPUs by leveraging programmable shaders. Shader-based upscaling technologies aren't new: the home-theater community is heavily invested in the development of MadVR, a custom video renderer that packs several shader-based upscaling algorithms requiring nothing more than Direct3D 9.0c-compliant programmable shaders.
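For what it's worth, the per-pixel form of integer scaling (essentially what a trivial pixel shader would compute) is just an integer division per output coordinate, which is why shader-based implementations on older GPUs seem plausible. A rough sketch in Python rather than actual shader code:

```python
import numpy as np

def integer_scale_per_pixel(src: np.ndarray, k: int) -> np.ndarray:
    """The same nearest-neighbor scaling expressed per output pixel, i.e. the
    work a minimal fragment shader would do: map each destination coordinate
    back to a source texel with an integer division and copy its color."""
    h, w, c = src.shape
    dst = np.empty((h * k, w * k, c), dtype=src.dtype)
    for y in range(h * k):
        for x in range(w * k):
            dst[y, x] = src[y // k, x // k]   # no filtering, no new colors
    return dst
```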
Update 12:57 UTC: The NVIDIA 436.02 drivers have now been released and are available for download.
Comments on "NVIDIA Beats Intel to Integer Scaling, but not on all Cards"
Oh wait, it actually ISN'T that easy; that's why GPU makers implementing it is a big deal.
Again, if it isn't that hard, why don't you write a simple program to do this automatically and make big bucks? And what do boost clocks have to do with integer scaling? I mean, Nvidia does provide an API to mess with GPU clock rates; that's what the likes of Precision X and Afterburner hook into.
Intercepting display output can't just be done in an app, however.
TV's own scaler = interpolation blur
Nvidia's GPU scaler = interpolation blur.
There's no way that disabling interpolation requires Turing; Nvidia are just being d*ckholes because they're first.
www.techpowerup.com/forums/threads/intel-adds-integer-scaling-support-to-their-graphics-lineup.256801/
I'm criticising you right now for even bringing it up; it adds nothing of value to the discussion. Why does Intel's announcement make me wrong? You're wrong! Intel products can't scale yet, and the Turing card I have in my machine is already doing it.
That's first, by any definition - except yours, apparently.... :kookoo:
I was using software integer scaling in ZSNES and KGen emulators over 20 years ago. It's an incredibly simple post-processing filter, and emulators of the late '90s even allowed you to write your own custom scaling filters; making an integer scaler was a single short line of code, and it was often used as the first step in other, more complex filters. Intel may have announced it, but they sure as hell weren't first either. It's a concept that existed 25 years ago and fell out of mainstream use.
You should also look up the definition of the word fanboyism if you're going to start throwing it around, though - because trust me, you're using it wrong.
tanalin.com/en/projects/integer-scaler/ (freeware, 500 KB .exe, very small memory footprint of ~2 MiB)
store.steampowered.com/app/993090/Lossless_Scaling/ (didn't buy, it's 3 bucks, and for that it lets you choose the scale factor 1x, 2x, 3x, 4x, ... and also apply anti-aliasing to the losslessly scaled image; the factor choice is sketched after this comment)
These are just two I know of. There's probably other freeware or open-source stuff floating around on GitHub that achieves the exact same thing.
It's not some kind of miracle software.
Everyone who wants this functionality because they're an eccentric old-school gamer can already use it without much effort.
It's all factitious commotion.
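For reference, the scale-factor choice such tools expose boils down to picking the largest integer that fits the target resolution and centering the result with borders. A minimal sketch of that assumed behavior (not the actual code of either tool):

```python
def fit_integer_scale(src_w: int, src_h: int, dst_w: int, dst_h: int):
    """Largest integer factor that fits the display, plus the letterbox
    offsets needed to center the scaled image (assumed behavior, not the
    actual code of the tools mentioned above)."""
    k = min(dst_w // src_w, dst_h // src_h)
    if k < 1:
        raise ValueError("destination is smaller than the source")
    off_x = (dst_w - src_w * k) // 2
    off_y = (dst_h - src_h * k) // 2
    return k, off_x, off_y

# 1920x1080 content on a 3840x2160 panel scales exactly 2x with no borders;
# 1280x800 content only fits 2x and gets centered with borders.
print(fit_integer_scale(1920, 1080, 3840, 2160))   # (2, 0, 0)
print(fit_integer_scale(1280, 800, 3840, 2160))    # (2, 640, 280)
```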
AMD has improved tessellation performance over the last few years on Polaris and Vega, but their cards still struggle.
Back on topic, I've been doing some reading on the way that Polaris, Vega and Navi do scaling in general, and it seems like it would be a trivial effort to include integer scaling in the driver. In fact, the same can be said for NVidia all the way back to Tesla (GTX 2xx). Why it hasn't been implemented thus far on both sides could simply boil down to a lack of demand, coupled with the minimal resource cost of doing that function in software, because the coding is well known, efficient, and relatively simple.
I like Integer scaling so far!
As for the definition of fanboy:
- Fan = loyal to a team/brand, zealous, irrationally devoted.
- Boy = child, immature
I attacked your preferred team/brand for making a BS excuse and you start throwing childish insults to defend them for no rational reason? I think I've found the fanboy, by definition. As I mentioned earlier, I criticise who I please - Nvidia, Intel, and AMD alike - because BS needs to be called out rather than excused or endorsed, regardless of who it is. Anyway, I'm out. I don't really have anything more to add, and this isn't valuable to the scaler discussion.
I would question whether this type of feature belongs in a driver, though - not because I don't see its usefulness, but because I fear this will be yet another poorly maintained driver feature, or that other features will suffer instead. I think drivers already contain way too much that doesn't belong there, and all the manufacturers struggle with maintaining the feature set they already provide, especially AMD, who to this date don't offer reliable OpenGL support.
Secondly, I would question whether integer scaling is the right solution to begin with. While multiplying pixel sizes to display them sharply on modern LCD/OLED displays is better than blurring, it will make them blockier than they ever were on CRTs. Various emulators, console replicas and external scalers (like the Framemeister) already offer more options than just integer scaling, like the shape of the pixels, the spacing between scanlines, etc. (see the sketch after this comment). So if the goal of this integer scaling is to run old games and replicate the "CRT look", then perhaps we need an even better solution than this?
As I've mentioned, doing this on the driver side does give some benefits, but it will also require a robust interface to control it, even though the basic integer scaling is super simple to implement.
Applications can of course do whatever they want.
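Picking up the scanline point above: going from plain integer scaling to a crude "CRT look" is only a small step, e.g. dimming every Nth output row. A toy sketch, not what the Framemeister or any emulator actually ships:

```python
import numpy as np

def integer_scale_with_scanlines(src: np.ndarray, k: int,
                                 darken: float = 0.5) -> np.ndarray:
    """Integer-scale by k, then dim the last row of every k-row block to fake
    the gaps between CRT scanlines (a toy effect for illustration only)."""
    out = src.repeat(k, axis=0).repeat(k, axis=1).astype(np.float32)
    out[k - 1::k, :, :] *= darken          # every k-th output row gets dimmed
    return out.astype(src.dtype)
```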
A lot of TVs/monitors get it really, really wrong.
In fairness to R-T-B, I like the idea of a better-than-bilinear option too.
Back in the early emulation days, a popular filter for 2x scaling (so appropriate for 1080p on a 4K display) was called SaI (2xSaI) - shown in the attached screenshot on the right, compared with integer scaling on the left.
Anyway, the benefit of the blocky pixel look is that text is sharper and easier to read than it would be if it were interpolated, so it's about more than just games and emulation.
This is especially relevant on high-DPI displays such as 4K TVs and monitors, because the alternative is Windows DPI scaling, which is still very much a mixed bag in terms of quality and consistency, especially when only half of the Windows 10 interface and even fewer applications actually DPI-scale gracefully. Even where DPI scaling looks better, the jarring contrast between one application rendering text at native 4K with subpixel AA (ClearType) alongside another running at 1080p with bilinear filtering makes it look far worse than if the whole image were scaled using the same method.
Integer scaling faces one additional issue on the desktop, and that's subpixel AA.
With a typical RGB stripe, the horizontal resolution is tripled by using colour channels as effective edges - look up ClearType if that's not something you're already familiar with - and integer scaling is going to fail on that, because the physical RGB stripe doesn't integer-scale.
1080p on 4K monitors will only work well with integer scaling when subpixel AA is used if the physical stripe went RRGGBB for each pair of horizontal pixels instead of RGBRGB. The easy option is to just disable subpixel AA, but a smarter option would be for the graphics driver to be aware of subpixel AA and the physical stripe layout, so that it could adjust the rendered RGB subpixel-AA output to the native RGBRGB physical pixels (using 2x integer scaling as an example).
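The stripe mismatch described above is easy to check on paper: treat one row of ClearType-style output as per-pixel subpixel intensities aimed at an RGB stripe, then compare what 2x integer scaling actually puts on the physical subpixels with what the subpixel-AA output was trying to draw. A back-of-the-envelope sketch with made-up intensity values:

```python
import numpy as np

# One row of ClearType-style output: per-pixel (R, G, B) intensities that are
# really aimed at three adjacent physical subpixels (values are made up).
logical = np.array([[1.0, 0.6, 0.2],
                    [0.2, 0.6, 1.0]])               # two logical pixels

# What 2x integer scaling puts on a native RGBRGB stripe: whole pixels are
# duplicated, so each triplet of subpixel values simply repeats.
naive = np.repeat(logical, 2, axis=0).reshape(-1)

# What the subpixel-AA output actually describes at double width: each logical
# subpixel should cover two adjacent physical subpixels, i.e. an RRGGBB stripe.
ideal = np.repeat(logical.reshape(-1), 2)

print(naive)   # 1.0 0.6 0.2 1.0 0.6 0.2 0.2 0.6 1.0 0.2 0.6 1.0  (RGBRGB pattern)
print(ideal)   # 1.0 1.0 0.6 0.6 0.2 0.2 0.2 0.2 0.6 0.6 1.0 1.0  (needs RRGGBB)
```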
The ZSNES HQX option (above) would seem to be really well-suited to desktop UIs - certainly better than integer scaling (which some prefer to the default bilinear scaling).
If a GPU driver gave me an HQX option, I'd switch to it immediately - I'm just not sure what the GPU overhead would be, but you save so much by dropping from 4K rendering to 1080p rendering that I don't think it would matter on any GPU.