Users with the RTX 2060, for example, can't even use DLSS at 4K, and, more egregiously, owners of the RTX 2080 and 2080 Ti cannot enjoy RTX and DLSS simultaneously at the most popular in-game resolution of 1920x1080, which would be useful for reaching high frame rates on 144 Hz monitors. Battlefield V has a similar yet even more restrictive scheme, wherein the gaming flagship RTX 2080 Ti cannot be used with RTX and DLSS at even 1440p, as seen in the second image below.
From my perspective, "2060" and "4k" should not ever be used in the same sentence ... same for "2080 Ti and 1080p"; is there a game in the test suite where a manually OC'd 2080 Ti can't do 100+ fps. I really can't see someone springing for well over $1,000 for a 2080 Ti ($1,300 for an AIB A series) and using it with a $250 144 Hz monitor . Yes it's the most popular resosultion and it's typicall used with the most popular cards which are in the same budget range. The 3 most popular are ... NVIDIA GeForce GTX 1060, NVIDIA GeForce GTX 1050 Ti and NVIDIA GeForce GTX 1050. I'm using a 144 Hz monitor .... but turning on MBR drops that to 120. Are we really imagining an instance where someone lays out $1,300 for an AIB 2080 Ti and pairs it witha $250 monitor ? To my eyes, that's like complaining that your new $165,00, 750 HP sports car does not have an "Eco mode"
Exactly, NVIDIA is pushing tech further; they can do this now since AMD is 2 years behind with only ~30% market share. The next gen of GeForces on 7 nm will bring the performance we desire, and the tech will advance even further, so we will again want more. AMD and their consoles are stagnant.
I think that market share estimate is a bit generous. Market share for NVIDIA over recent years has been reported at 70-80%, so it's often assumed that AMD has the rest ... but Intel is closing in on 11%, leaving AMD with just about 15%. AMD's share has been inching up by about 0.1% in recent months, though, which is a good sign. If we take Intel out of the equation and just focus on discrete cards, it's about 83% to 17% at this time.
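For what it's worth, here's a rough back-of-the-envelope sketch of that renormalization; the input percentages are just the approximate figures quoted above, not exact survey numbers:

```python
# Approximate overall GPU vendor shares quoted above (all GPUs, including integrated)
overall = {"NVIDIA": 74.0, "AMD": 15.0, "Intel": 11.0}  # percent, rough figures

# Intel's share is integrated-only, so drop it and renormalize to get discrete-card share
discrete = {vendor: share for vendor, share in overall.items() if vendor != "Intel"}
total = sum(discrete.values())

for vendor, share in discrete.items():
    print(f"{vendor}: {100 * share / total:.0f}% of discrete cards")
# NVIDIA: 83% of discrete cards
# AMD: 17% of discrete cards
```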
The biggest gainers among the top 25 in the last month, according to the Steam HW Survey, were (in order of cards out there): 4th-place GTX 1070 (+0.18%), the entire R7 series (+0.19%), 21st-place RX 580 (+0.15%), and 24th-place GTX 650 (+0.15%). The biggest losers were the 1st-place GTX 1060 (-0.52%) and 14th-place GTX 950 (-0.19%). The 2070 doubled its market share to 0.33%, and the 2080 is up 50% to a 0.31% share, which kinda surprised me. The RX Vega (which combines Vega 3, Vega 6, Vega 8, RX Vega 10, RX Vega 11, RX Vega 56, RX Vega 64, RX Vega 64 Liquid, and apparently Radeon VII) made a nice first showing at 0.16%. Also interesting that the once-dominant 970 will likely drop below 3% next month.
Arguing in favor of a technology that doesn't apply to more than 5% of PCs (and I'm being generous) only confirms what @notb said. I've said it even before DX12 was released: just because there's a lower-level alternative doesn't mean everyone will take advantage of it, because going lower level is not a universal solution (otherwise everything would have been written in C or ASM). But this is going to be a problem when the higher level (i.e. DX11) goes the way of the dodo.
I thought about that for a bit. If we use 5% as the cutoff for discussion, then all we can talk about is technology that shows its benefits for:
1920 x 1080 = 60.48%
1366 x 768 = 14.02%
Even 2560 x 1440 is in use by only 3.97% ... and 2160p is completely off the table, as it is used by only 1.48%. But don't we all want to "move up" at some point in the near future?
The same arguments were used when the automobile arrived: unreliable, will never replace the horse! ... and for most other technologies. I'm old enough to remember when it was said, "Bah, who would ever actually buy a color TV?" Technology advances much like human development: "walking is stoopid, all I gotta do is whine and momma will carry me ...". I sucked at baseball my 1st year; I got better (a little). I sucked at football my 1st year (got better each year I played). I sucked at basketball my 1st year and was pretty good by college. Technology advances slowly; we find what works and then take it as far as it will go ... eventually, our needs outgrow the limits of the tech in use and we need new tech. Where's Edison's carbon filament today? When any tech arrives, in its early iterations, expect it to be less efficient and less cost-effective, but it has room to grow. Look at IPS ... when folks started thinking "Ooh, IPS has more accurate color, let's use it for gaming", it turned out it wasn't a good idea by any stretch of the imagination.
But over time the tech advanced, AU Optronics panels came along, and we had a brand new gaming experience. Should IPS development have been shut down because less than 5% of folks were using it (at least properly and satisfactorily)? My son wanted an IPS screen for his photo work (which he spent $1,250 on), thinking it would be OK for gaming ... 4 months later he had a 2nd (TN) monitor, because the response time and lag drove him nuts, and every time he went into a dark place he'd get killed, since everyone and everything could see him long before he could see them through the IPS glow. Now, when I'm not on one of those AU panels, it feels like I'm eating oatmeal without any cinnamon, maple syrup, milk, or anything else that provides any semblance of flavor.
But if we're going to say that what is being done by < 5% of gamers doesn't matter, then we are certainly saying that we should not worry about a limitation that prevents a 2080 Ti owner from using a feature at 1080p. That's like buying a $500 tie to wear with a $99 suit.