Tuesday, August 20th 2019
NVIDIA Beats Intel to Integer Scaling, but not on all Cards
NVIDIA, with its GeForce 436.02 Gamescom-special drivers, introduced integer scaling among several new features and performance updates. Integer scaling is a resolution upscaling algorithm that scales extremely low-resolution visuals up to eye-pleasing, blocky, pixellated lines by multiplying pixels in a "nearest-neighbor" pattern without changing their color. Bilinear upscaling, by contrast, blurs the image by attempting to add detail where none exists, altering the colors of the multiplied pixels.
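To make the difference concrete, here is a minimal sketch of nearest-neighbor integer upscaling in Python with NumPy. This is purely illustrative and not NVIDIA's driver implementation; the function name is ours.

    import numpy as np

    def integer_upscale(image, factor):
        # Nearest-neighbor integer scaling: each source pixel becomes a
        # factor-by-factor block of identical pixels, so no new colors
        # are invented and edges stay perfectly sharp.
        return np.repeat(np.repeat(image, factor, axis=0), factor, axis=1)

    # Example: a 320x200 RGB frame scaled 4x becomes 1280x800.
    frame = np.zeros((200, 320, 3), dtype=np.uint8)
    print(integer_upscale(frame, 4).shape)  # (800, 1280, 3)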
Intel originally announced an integer upscaler this June that will be exclusive to the company's new Gen11 graphics architecture, since older generations of its iGPUs "lack the hardware requirements" to pull it off. Intel's driver updates that add integer scaling are set to arrive toward the end of this month, and even when they do, only a tiny fraction of Intel hardware will actually benefit from the feature (notebooks and tablets that use "Ice Lake" processors). NVIDIA's integer upscaling feature has been added only to its "Turing" architecture GPUs (both RTX 20-series and GTX 16-series), not to older generations. NVIDIA explains that this is thanks to a "hardware-accelerated programmable scaling filter" introduced with "Turing." Why is this a big deal? The gaming community has a newfound love for retro games from the '80s through the '90s, with the growing popularity of emulators and old games being played through DOSBox. Many small indie game studios are responding to this craze with hundreds of new titles adopting a neo-retro pixellated aesthetic (e.g., "Fez").
When scaled up to today's high-resolution displays, many of these games look washed out thanks to bilinear upscaling by Windows. This calls for something like integer upscaling. We're just surprised that NVIDIA and Intel aren't implementing integer upscaling on unsupported GPUs by leveraging programmable shaders. Shader-based upscaling technologies aren't new: the home-theater community is heavily invested in the development of MadVR, a custom video renderer that packs several shader-based upscaling algorithms requiring nothing more than Direct3D 9.0c-compliant programmable shaders.
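One detail left implicit is how the scale factor would be chosen when the source resolution doesn't divide evenly into the display's. A plausible approach (our assumption; neither vendor has documented its exact behavior here) is to pick the largest whole-number multiple that still fits the display and letterbox the remainder:

    def best_integer_factor(src_w, src_h, dst_w, dst_h):
        # Largest whole-number multiple at which the source still fits
        # entirely inside the display; the leftover area is letterboxed.
        return max(1, min(dst_w // src_w, dst_h // src_h))

    # 640x480 content on a 3840x2160 panel: factor 4, i.e. 2560x1920.
    print(best_integer_factor(640, 480, 3840, 2160))  # 4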
Update 12:57 UTC: The NVIDIA 436.02 drivers have now been released and are available for download.
54 Comments on NVIDIA Beats Intel to Integer Scaling, but not on all Cards
This must be because Intel and AMD plan to support integer scaling.
On the other hand, maybe it was a good idea to skip the 20XX series.
Cuz with Turing you get RTX and Integer Scaling for the price of one.
So much entitlement.
Great news, don't get me wrong, but one more time I feel like NVIDIA has been sandbagging this and many other features unless the competition offers them. And once again locking it to Turing, still trying to raise those low RTX sales...
Is it illegal?! No, but it makes them kind of douches. :rolleyes:
store.steampowered.com/app/993090/Lossless_Scaling/
tanalin.com/en/articles/lossless-scaling/
So when is AMD bringing on the integer scaling support?
www.amd.com/en/technologies/radeon-software-image-sharpening
Yep, only for Navi, yet somehow I didn't hear any outcry from the hypocrites in the red camp. Does that make AMD kind of a douche? By the red fans' standard, I guess yes.
And no, there is no red or green team; tech improvement is always great for the consumer.
Niche application from niche hardware. Intel still dominates the video adapter space by population.
oh right, that's literally an NVIDIA "trade secret", making what you just stated nearly impossible. Sounds fun. Integer scaling is not hard. For god's sake man, the CPU could do it.