I just find it puzzling how this is even news.
While it could be interesting to be able to enable this without application support, implementing this in an application has been possible "forever".
1) You just render your game to a framebuffer of the desired resolution, let's say 320x200.
2) Make another one to fill the screen, let's say 2560x1440.
3) Do a couple of lines of code to calculate your integer scale factor: the largest whole number that fits both dimensions, in this case 7.
4) Render the first framebuffer to a centered quad (2240x1400) with nearest neighbour texture filtering.
Super easy.
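For reference, here's roughly what those four steps can look like in OpenGL. A minimal sketch, using glBlitFramebuffer instead of a textured quad for brevity; it assumes GL 3.0+ with a loader like GLEW already initialized, and that the game has already been rendered into low_res_fbo (a placeholder name, not a real API):

    /* Present a GAME_W x GAME_H frame integer-scaled and centered in the
     * window. Assumes low_res_fbo already holds the rendered frame. */
    #include <GL/glew.h>

    #define GAME_W 320
    #define GAME_H 200

    void present_integer_scaled(GLuint low_res_fbo, int win_w, int win_h)
    {
        /* Step 3: largest integer factor that fits both dimensions.
         * For 2560x1440: min(2560/320, 1440/200) = min(8, 7) = 7. */
        int sx = win_w / GAME_W;
        int sy = win_h / GAME_H;
        int scale = sx < sy ? sx : sy;
        if (scale < 1) scale = 1;

        int dst_w = GAME_W * scale;    /* 2240 at 7x */
        int dst_h = GAME_H * scale;    /* 1400 at 7x */
        int x0 = (win_w - dst_w) / 2;  /* center horizontally */
        int y0 = (win_h - dst_h) / 2;  /* center vertically */

        glBindFramebuffer(GL_READ_FRAMEBUFFER, low_res_fbo);
        glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);  /* the window */
        glClearColor(0.0f, 0.0f, 0.0f, 1.0f);       /* black borders */
        glClear(GL_COLOR_BUFFER_BIT);
        /* Step 4: GL_NEAREST keeps the pixels sharp. */
        glBlitFramebuffer(0, 0, GAME_W, GAME_H,
                          x0, y0, x0 + dst_w, y0 + dst_h,
                          GL_COLOR_BUFFER_BIT, GL_NEAREST);
    }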
I find it funny this has been blocked by lack of hardware support. Of all the scaling methods, this is the one that barely makes a performance impact if you implement it in software only.
I don't understand how this has been lacking in hardware support; nearest-neighbour filtering has been part of the OpenGL specification for as long as I can remember. But perhaps Intel always implemented it in software?
I actually prefer Bilinear and Trilinear "scaling" filters as they give a softer blending effect. I've never been a fan of the sharp-edged pixel look. But I digress...
Trilinear will only make things even more blurred, since it also blends between mipmap levels.
Thing is, we never had that BITD. Because of the way TV CRTs worked, there was always a smoothing/blurring/blending effect caused by the way the electron gun's beams scanned through the color mask and produced the image. Anyone who says they remember the "blocky" look is kinda fooling themselves, because it just never happened that way. I'm not saying at all that it's wrong to prefer that look, just that we never had it back then because of the physical limitations of the display technology of the time.
So this Integer Scaling thing, while useful to some, isn't all that authentic for many applications.
Integer scaling is better than a blurred stretched image, but as you say, it's not a correct representation of how graphics designed for CRTs looked. It's actually one of the things that has annoyed me with the retro indie gaming trend over the past years: those who made them have either forgotten how old games looked, or have only seen them in emulators. I usually call it "fake nostalgia", since most people have a misconception about how old games looked.
CRTs displayed pixels very differently from modern LCD, plasma and OLED displays. There were two main types of CRTs: shadow mask and aperture grille. Shadow masks were the worst in terms of picture quality, and used many small phosphor dots to make up a single "pixel", but had the advantage of "dynamic" resolutions. Aperture grille (Trinitron/Diamondtron, as you probably know it) had grids of locked pixels and a much sharper picture. But even for these, pixels blended slightly together. One of the "wonderful" things about CRTs was how the pixels bled slightly at the edges, causing a slight blurring effect, but only at the pixel edges for Trinitrons. So the pixels appeared fairly sharp, not blurred the way a picture looks if you scale it up in Photoshop.
Additionally, if we're talking about console games, CRT TVs didn't even have square pixels. I don't know the exact ratio, but it was somewhere close to 5:4 or 4.5:4. So if you want to emulate the NES authentically, the image should be slightly stretched horizontally.
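As a rough sketch of what that correction could look like (the 8:7 pixel aspect ratio used in the example below is a commonly cited figure for NTSC NES, but treat it as an assumption; the struct and function names are made up for illustration):

    #include <math.h>

    typedef struct { int x, y, w, h; } Rect;

    /* Scale height by an integer factor, then stretch width by the pixel
     * aspect ratio (par = displayed pixel width / pixel height). */
    Rect aspect_corrected_rect(int game_w, int game_h,
                               int win_w, int win_h, double par)
    {
        int scale = win_h / game_h;
        if (scale < 1) scale = 1;
        /* Back off if the horizontal stretch would overflow the window. */
        while (scale > 1 && (int)lround(game_w * scale * par) > win_w)
            scale--;

        int dst_w = (int)lround(game_w * scale * par);
        int dst_h = game_h * scale;
        Rect r = { (win_w - dst_w) / 2, (win_h - dst_h) / 2, dst_w, dst_h };
        return r;
    }

    /* Example: aspect_corrected_rect(256, 240, 2560, 1440, 8.0 / 7.0)
     * gives a 1755x1440 rect, slightly wider than the 1536x1440 you'd
     * get with square pixels. */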
CRTs had a very good color range, but the precision was not so good. Many graphics artists exploited this to create gradients and backgrounds using dithering.
On most CRTs such dithered patterns would look softer or even completely smooth.
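To make the technique concrete, here's a minimal sketch of 2x2 ordered (Bayer) dithering, the simplest version of the checkerboard patterns those artists used; it fakes five shades out of pure on/off pixels, and on a CRT the pattern blended into a smooth tone. The names are mine, not from any particular tool:

    #include <stdint.h>

    /* 2x2 Bayer matrix; thresholds spread evenly over the tile. */
    static const int bayer2[2][2] = { { 0, 2 },
                                      { 3, 1 } };

    /* Returns 1 (pixel on) or 0 (pixel off) for an 8-bit shade at (x, y).
     * A mid-gray value like 128 comes out as a 50% checkerboard. */
    int dither2x2(uint8_t value, int x, int y)
    {
        int threshold = (bayer2[y & 1][x & 1] * 255 + 2) / 4;
        return value > threshold;
    }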
And then there is the darker shading between the scanlines, though this varied from CRT to CRT. There are some emulators which emulate this effect.
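For illustration, the crudest possible version of that effect: darkening every Nth row of an already upscaled frame in software. Real emulators do this in a shader with adjustable intensity; the buffer layout and names here are just assumptions for the sketch:

    #include <stdint.h>
    #include <stddef.h>

    /* frame: tightly packed RGBA8 pixels, w x h, already integer-scaled
     * by 'scale'. Dims the last row of every scaled source line by 50%. */
    void apply_scanlines(uint8_t *frame, int w, int h, int scale)
    {
        for (int y = scale - 1; y < h; y += scale) {
            uint8_t *row = frame + (size_t)y * w * 4;
            for (int x = 0; x < w; x++) {
                row[x * 4 + 0] /= 2;   /* R */
                row[x * 4 + 1] /= 2;   /* G */
                row[x * 4 + 2] /= 2;   /* B; alpha left untouched */
            }
        }
    }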