Saturday, July 13th 2019

Intel adds Integer Scaling support to their Graphics lineup

Intel's Lisa Pearce today announced on Twitter that the company has listened to user feedback from Reddit and will add nearest-neighbor integer scaling to its future graphics chips. Integer scaling is the holy grail for gamers using console emulators, because it gives them the ability to simply double, triple, or quadruple existing pixels without the loss in sharpness that is inherent to traditional upscaling algorithms like bilinear or bicubic. This approach also avoids the ringing artifacts that come with other, more advanced scaling methods.
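As a quick illustration of why this stays sharp, here is a minimal Python sketch of nearest-neighbor pixel replication (the integer_upscale helper is hypothetical, written for this article, not Intel's driver code): every source pixel is repeated a whole number of times in both directions, so no new colors are ever blended in.

```python
# Minimal sketch of nearest-neighbor pixel replication (illustrative only):
# each source pixel becomes a factor x factor block of identical pixels.
def integer_upscale(pixels, factor):
    """pixels: 2D list of pixel values; factor: whole-number multiplier."""
    out = []
    for row in pixels:
        scaled_row = []
        for p in row:
            scaled_row.extend([p] * factor)  # repeat each pixel horizontally
        for _ in range(factor):              # repeat each scaled row vertically
            out.append(list(scaled_row))
    return out

# A 2x2 image doubled to 4x4 - every pixel becomes a 2x2 block of itself:
print(integer_upscale([[1, 2], [3, 4]], 2))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```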

In her Twitter video, Lisa explained that this feature will only be available on upcoming Gen 11 graphics and beyond - previous GPUs lack the hardware required to implement integer scaling. In terms of timeline, she mentioned that the feature will be part of the driver "around end of August", which also puts some constraints on the launch date of Gen 11 - based on that statement, it seems to be coming sooner rather than later.

It is unclear at this time whether the scaling method is truly "integer" or simply "nearest neighbor". While integer scaling is nearest neighbor at its core, i.e. it picks the closest pixel's color and does no interpolation, the difference is that integer scaling uses only whole-number scale factors. For example, Zelda: Breath of the Wild runs at 900p natively, which would require a 2.4x scaling factor to fill a 4K display. Integer scaling would instead use a factor of 2x, resulting in a 1800p image with black borders around it - this is what gamers want. A plain nearest-neighbor image would not have the black bars, but to achieve the 2.4x factor some pixels would be tripled while others are doubled, resulting in unevenly sized pixels and a sub-optimal presentation.
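The math behind the bordered image is simple. This hypothetical integer_fit helper (a sketch of the approach, not Intel's code) picks the greatest whole-number multiplier that still fits the display and centers the result inside black borders:

```python
# Sketch of how an integer-scaling mode could pick its factor (illustrative):
# use the largest whole-number multiplier that fits, pad the rest with black.
def integer_fit(src_w, src_h, dst_w, dst_h):
    factor = min(dst_w // src_w, dst_h // src_h)   # greatest integer multiplier
    out_w, out_h = src_w * factor, src_h * factor  # scaled image size
    side = (dst_w - out_w) // 2                    # left/right border width
    top = (dst_h - out_h) // 2                     # top/bottom border height
    return factor, (out_w, out_h), (side, top)

# 900p (1600x900) on a 4K (3840x2160) display:
print(integer_fit(1600, 900, 3840, 2160))
# (2, (3200, 1800), (320, 180)) -> 2x scale, 320 px side / 180 px top borders
```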

Update Jul 13: Intel has posted an extensive FAQ on its website, which outlines the details of its Integer Scaling implementation, and we can confirm that it is done correctly - the screenshots clearly show black borders all around the upscaled image, which is exactly what you would expect from scaling with integer scale factors. Intel provides two modes, called "NN" (Nearest Neighbor) and "IS" (Integer Scaling).
Will Intel implement pure integer scaling with borders?

Yes, the driver being released in late August will provide users with the option to force integer scaling. The IS option will restrict scaling of game images to the greatest possible integer multiplier. The remaining screen area will be occupied by a black border, as mentioned earlier.
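For contrast, here is what the border-less "NN" mode implies at a non-integer factor, sketched in Python (an illustration of the behavior described above, not Intel's code): each output pixel reads the nearest source pixel, so at 2.4x some source pixels end up doubled while others are tripled, producing the uneven pixel sizes mentioned earlier.

```python
# Sketch of plain nearest-neighbor mapping at a non-integer factor
# (illustrative): output pixel d samples source pixel floor(d / factor).
from collections import Counter

factor = 2160 / 900  # the 2.4x 900p-to-4K case from the article
src_for_dst = [int(d / factor) for d in range(24)]  # first 24 output pixels
print(Counter(src_for_dst))
# Counter({0: 3, 2: 3, 5: 3, 8: 3, 1: 2, 3: 2, 4: 2, ...})
# -> some source pixels repeat 3 times, others 2, so pixel sizes are uneven
```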
Sources: Twitter, FAQ on Intel Website

56 Comments on Intel adds Integer Scaling support to their Graphics lineup

#1
Flaky
Hope nvidia and amd will follow. This should have been here ages ago.
#2
Vayra86
This is pretty cool, finally an end to blurry upscaling.

From INTEL no less. Damn
#3
Crackong
Oh it is Lisa......wait it is Intel.
#4
wurschti
Thank you!
AMD and NVIDIA, you're next!
#5
geon2k2
This would be fantastic for people with 4K monitors who want to game at 1080p.
They will finally have upscaling that does not affect sharpness.
#6
bug
I find it funny this has been blocked by lack of hardware support. Of all the scaling methods, this is the one that barely makes an impact if you implement it in software only.
#7
W1zzard
bug: I find it funny this has been blocked by lack of hardware support. Of all the scaling methods, this is the one that barely makes an impact if you implement it in software only.
Yeah I don't buy the "implementing it on older generations would be a hack" argument either
#8
lexluthermiester
W1zzard
I actually prefer Bilinear and Trilinear "scaling" filters as they give a softer blending effect. I've never been a fan of the sharp-edge pixel look. But I digress...
#9
W1zzard
lexluthermiester: Trilinear
That only works if you have multiple mipmapped images, which isn't the case here.
#10
bug
lexluthermiester: I actually prefer Bilinear and Trilinear "scaling" filters as they give a softer blending effect. I've never been a fan of the sharp-edge pixel look. But I digress...
I believe this is catering to those that prefer the original, blocky look. It can also be a bonus when eyesight starts failing us ;)
#11
lexluthermiester
W1zzard: That only works if you have multiple mipmapped images, which isn't the case here.
Good point.
bug: I believe this is catering to those that prefer the original, blocky look.
Thing is, we never had that BITD. Because of the way TV CRTs worked, there was always a smoothing/blurring/blending effect caused by the way the electron gun beams scanned through the color masks and produced an image. Anyone who says they remember the "blocky" look is kinda fooling themselves, because it just never happened that way. I'm not saying at all that it's wrong to prefer that look, just that we never had it back then because of the physical limitations of the display technology of the time.

So this Integer Scaling thing, while useful to some, isn't all that authentic for many applications.
#12
bug
lexluthermiester: Good point.

Thing is, we never had that BITD. Because of the way TV CRTs worked, there was always a smoothing/blurring/blending effect caused by the way the electron gun beams scanned through the color masks and produced an image. Anyone who says they remember the "blocky" look is kinda fooling themselves, because it just never happened that way. I'm not saying at all that it's wrong to prefer that look, just that we never had it back then because of the physical limitations of the display technology of the time.

So this Integer Scaling thing, while useful to some, isn't all that authentic for many applications.
It's not entirely authentic, but it's as close to the original as possible.
When instead of one pixel you display 9 or 16, any interpolation method will be way softer than the original.
Like you said, you prefer a softer look. It's just that others prefer it the other way around (imagine that :D )
#13
lexluthermiester
bug: When instead of one pixel you display 9 or 16, any interpolation method will be way softer than the original.
It's a bit more complicated than that, but yes.
bug: Like you said, you prefer a softer look. It's just that others prefer it the other way around (imagine that :D )
I understand this of course. Just saying that it's not authentic, due to the way the display technology of the time produced images, whether watching TV broadcasts, VHS/Laserdisc movies, video games, or even early computers.
#14
InVasMani
I just want Intel to release a dedicated post-processing card based around FPGAs, where you can easily toggle and reconfigure the hardware to maximize performance, a bit like UAD DSPs for audio. Just have it post-process an incoming HDMI/DisplayPort signal and output it to the display without adding more than 1-4 ms of latency. I'd be happy if it could do a lot of post-processing effects like ReShade without impacting performance negatively in the process. The one thing Intel really needs to push and emphasize is taking advantage of its FPGA tech for both its CPUs and GPUs; bundling in a little bit of both could go a long way, I'd think. Eventually some of the FPGA duties could be moved to more fixed ASIC functionality, but FPGAs are flexible and having a bit of that can be nice.
#15
XiGMAKiD
Definitely a nice feature to have, now leak some juicy info about your dGPU please
#16
dicktracy
It will finally be ideal to buy an 8K TV and run it at a lower resolution for games without that fugly blur, while reserving that glorious 8K resolution for actual PC work.
#17
yotano211
I don't want to see Lisa Pearce on an 8K TV, I want to see her in real life. She's looking nice.
#18
rtwjunkie
fynxer: Haha, time to make an update, she is probably upwards of 10-15 years younger in her profile picture.

Just undermines her credibility as a person, when working a high-profile job, not to represent her real self.

Both pictures are on her Twitter page. What do her looks matter? She is paid for her brain.
#21
Camm
Xuper: Amd/comments/b5r2xy
AMD didn't bother, perhaps they think it's not worth it.
I'm not quite getting your comment; it's an option in their future roadmap poll, so they must see some merit in implementing it?
#22
biboif
Yeah, now if only I could run Cemu on Intel graphics
#23
Blueberries
I'd like to scale her integer if you catch my drift
#25
Recus
I checked Zelda Breath of the Wild Cemu 4K videos and they don't look blurry at all. Also, you can always use ESRGAN.